Sample records for earthquake engineering network

  1. Important Earthquake Engineering Resources

    Science.gov Websites

A navigation excerpt from the Pacific Earthquake Engineering Research (PEER) Center website listing important earthquake engineering resources, including the American Concrete Institute, the Consortium of Organizations for Strong-Motion Observation Systems (COSMOS), and the Consortium of Universities for Research in Earthquake Engineering.

  2. Introduction: seismology and earthquake engineering in Mexico and Central and South America.

    USGS Publications Warehouse

    Espinosa, A.F.

    1982-01-01

The results from seismological studies that are used by the engineering community are just one of the benefits obtained from research aimed at mitigating the earthquake hazard. In this issue of the Earthquake Information Bulletin, current programs in seismology and earthquake engineering, seismic networks, future plans and some of the cooperative programs with different international organizations are described by Latin American seismologists. The article describes the development of seismology in Latin America and the seismological interest of the OAS. -P.N.Chroston

  3. Centrality in earthquake multiplex networks

    NASA Astrophysics Data System (ADS)

    Lotfi, Nastaran; Darooneh, Amir Hossein; Rodrigues, Francisco A.

    2018-06-01

Seismic time series have been mapped as complex networks, where a geographical region is divided into square cells that represent the nodes and connections are defined according to the sequence of earthquakes. In this paper, we map a seismic time series to a temporal network, described by a multiplex network, and characterize the evolution of the network structure in terms of the eigenvector centrality measure. We generalize previous works that considered the single-layer representation of earthquake networks. Our results suggest that the multiplex representation captures earthquake activity better than methods based on single-layer networks. We also verify that the regions with the highest seismological activity in Iran and California can be identified from the network centrality analysis. The temporal modeling of seismic data provided here may open new possibilities for a better comprehension of the physics of earthquakes.
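The single-layer version of this mapping can be sketched in a few lines. The toy catalog, the 1-degree grid, and the identity-shifted power iteration below are our illustrative assumptions, not the authors' multiplex construction:

```python
# Illustrative single-layer sketch: bin epicenters into grid cells,
# link cells visited by consecutive events, and rank cells by
# eigenvector centrality.

def build_network(events, cell=1.0):
    """events: (lat, lon) pairs in time order -> undirected adjacency."""
    nodes = [(int(lat // cell), int(lon // cell)) for lat, lon in events]
    adj = {}
    for a, b in zip(nodes, nodes[1:]):
        if a != b:                          # drop self-connections
            adj.setdefault(a, set()).add(b)
            adj.setdefault(b, set()).add(a)
    return adj

def eigenvector_centrality(adj, iters=200):
    x = {n: 1.0 for n in adj}
    for _ in range(iters):
        # the x[n] term shifts the adjacency matrix by the identity so
        # the iteration also converges on bipartite graphs
        y = {n: x[n] + sum(x[m] for m in adj[n]) for n in adj}
        norm = max(y.values())
        x = {n: v / norm for n, v in y.items()}
    return x

events = [(35.1, 50.2), (36.2, 50.3), (35.1, 50.2),
          (37.4, 51.0), (35.1, 50.2), (36.2, 50.3)]
scores = eigenvector_centrality(build_network(events))
busiest = max(scores, key=scores.get)       # most central cell
```

The most central cell is the one revisited most often between other events, which is the intuition behind using centrality to flag seismically active regions.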

  4. Reconfiguring practice: the interdependence of experimental procedure and computing infrastructure in distributed earthquake engineering.

    PubMed

    De La Flor, Grace; Ojaghi, Mobin; Martínez, Ignacio Lamata; Jirotka, Marina; Williams, Martin S; Blakeborough, Anthony

    2010-09-13

    When transitioning local laboratory practices into distributed environments, the interdependent relationship between experimental procedure and the technologies used to execute experiments becomes highly visible and a focal point for system requirements. We present an analysis of ways in which this reciprocal relationship is reconfiguring laboratory practices in earthquake engineering as a new computing infrastructure is embedded within three laboratories in order to facilitate the execution of shared experiments across geographically distributed sites. The system has been developed as part of the UK Network for Earthquake Engineering Simulation e-Research project, which links together three earthquake engineering laboratories at the universities of Bristol, Cambridge and Oxford. We consider the ways in which researchers have successfully adapted their local laboratory practices through the modification of experimental procedure so that they may meet the challenges of coordinating distributed earthquake experiments.

  5. MCEER, from Earthquake Engineering to Extreme Events | Home Page

    Science.gov Websites

A navigation excerpt from the MCEER home page; linked sections include a technical report series, bridge engineering, the Connecte²d Teaching program, a CSEE graduate student poster competition, earthquake engineering education in Haiti, earthquake FAQs, seminar series, K-12 STEM education, National Engineers Week, and research experience programs.

  6. Real-time neural network earthquake profile predictor

    DOEpatents

    Leach, R.R.; Dowla, F.U.

    1996-02-06

    A neural network has been developed that uses first-arrival energy to predict the characteristics of impending earthquake seismograph signals. The propagation of ground motion energy through the earth is a highly nonlinear function. This is due to different forms of ground motion as well as to changes in the elastic properties of the media throughout the propagation path. The neural network is trained using seismogram data from earthquakes. Presented with a previously unseen earthquake, the neural network produces a profile of the complete earthquake signal using data from the first seconds of the signal. This offers a significant advance in the real-time monitoring, warning, and subsequent hazard minimization of catastrophic ground motion. 17 figs.

  7. Real-time neural network earthquake profile predictor

    DOEpatents

    Leach, Richard R.; Dowla, Farid U.

    1996-01-01

    A neural network has been developed that uses first-arrival energy to predict the characteristics of impending earthquake seismograph signals. The propagation of ground motion energy through the earth is a highly nonlinear function. This is due to different forms of ground motion as well as to changes in the elastic properties of the media throughout the propagation path. The neural network is trained using seismogram data from earthquakes. Presented with a previously unseen earthquake, the neural network produces a profile of the complete earthquake signal using data from the first seconds of the signal. This offers a significant advance in the real-time monitoring, warning, and subsequent hazard minimization of catastrophic ground motion.
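A drastically simplified stand-in can illustrate the "complete profile from the first seconds" idea. The sketch below uses a linear least-squares map on a synthetic envelope family, NOT the patented neural network; the envelope shape and training parameters are invented for illustration:

```python
# Learn a linear map from the first samples of an event's envelope to
# the remainder of the signal, mimicking profile prediction from
# first-arrival data. This is a toy substitute for the trained network.
import numpy as np

def envelope(t, amp, decay):
    return amp * t * np.exp(-decay * t)     # simple rise-then-decay shape

t = np.linspace(0.0, 10.0, 100)
train = np.array([envelope(t, a, d)
                  for a in (1.0, 2.0, 3.0)
                  for d in (0.5, 1.0, 1.5)])
X, Y = train[:, :10], train[:, 10:]         # first samples -> remainder
W, *_ = np.linalg.lstsq(X, Y, rcond=None)   # the "training" step

unseen = envelope(t, 2.5, 0.8)              # previously unseen event
profile = unseen[:10] @ W                   # predicted remainder
```

A real predictor must handle the strong nonlinearity of ground-motion propagation noted in the abstract, which is precisely why the patent uses a neural network rather than a linear map.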

  8. Real-time earthquake monitoring using a search engine method.

    PubMed

    Zhang, Jie; Zhang, Haijiang; Chen, Enhong; Zheng, Yi; Kuang, Wenhuan; Zhang, Xiong

    2014-12-04

    When an earthquake occurs, seismologists want to use recorded seismograms to infer its location, magnitude and source-focal mechanism as quickly as possible. If such information could be determined immediately, timely evacuations and emergency actions could be undertaken to mitigate earthquake damage. Current advanced methods can report the initial location and magnitude of an earthquake within a few seconds, but estimating the source-focal mechanism may require minutes to hours. Here we present an earthquake search engine, similar to a web search engine, that we developed by applying a computer fast search method to a large seismogram database to find waveforms that best fit the input data. Our method is several thousand times faster than an exact search. For an Mw 5.9 earthquake on 8 March 2012 in Xinjiang, China, the search engine can infer the earthquake's parameters in <1 s after receiving the long-period surface wave data.
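The core idea, stripped of the fast indexing that makes the real system thousands of times faster, can be sketched as a brute-force best-fit lookup. The synthetic waveform family and parameter names below are invented for illustration:

```python
# Given a database of waveforms with known source parameters, return
# the parameters of the stored waveform that best fits the input.
# The real system replaces this linear scan with a fast indexed search.
import math

def misfit(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def search(database, observed):
    """database: list of (params, waveform); return best-fit params."""
    return min(database, key=lambda rec: misfit(rec[1], observed))[0]

t = [i * 0.1 for i in range(50)]
db = [({"magnitude": m, "depth_km": d},
       [m * math.sin(x) * math.exp(-d * x / 50.0) for x in t])
      for m in (4.0, 5.0, 5.9) for d in (10, 30)]
obs = [5.9 * math.sin(x) * math.exp(-10 * x / 50.0) + 0.01 for x in t]
best = search(db, obs)
```

Because every stored waveform already carries its source parameters, the lookup returns location, magnitude and mechanism in a single step once the nearest waveform is found.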

  9. Earthquake Complex Network Analysis Before and After the Mw 8.2 Earthquake in Iquique, Chile

    NASA Astrophysics Data System (ADS)

    Pasten, D.

    2017-12-01

Earthquake complex networks have been shown to be able to detect specific features in seismic data sets. In space, these networks show scale-free behavior of the connectivity probability distribution for directed networks, and small-world behavior for undirected networks. In this work, we present an earthquake complex network analysis for the large Mw 8.2 earthquake in the north of Chile (near Iquique) in April 2014. An earthquake complex network is made by dividing three-dimensional space into cubic cells; if a cell contains a hypocenter, we designate that cell as a node. The connections between nodes are generated in time: we follow the time sequence of seismic events and make the connections between the corresponding nodes. This yields two different networks, one directed and one undirected. The directed network takes into consideration the time direction of the connections, which is very important for the connectivity of the network: we consider the connectivity k_i of the i-th node to be the number of connections going out of node i plus its self-connections (if two seismic events occur successively in time in the same cubic cell, we have a self-connection). The undirected network is made by removing the directions of the connections and the self-connections from the directed network; for undirected networks, we consider only whether two nodes are connected or not. We have built a directed complex network and an undirected complex network before and after the large earthquake in Iquique, using magnitudes greater than Mw = 1.0 and Mw = 3.0. We found that this method can recognize the influence of these small seismic events on the behavior of the network, and that the size of the cell used to build the network is another important factor in recognizing the influence of the large earthquake on this complex system. 
This method also shows a difference in the values of the critical exponent γ (for the probability
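The cell-based construction described in the abstract can be sketched directly. The 10 km cell size and the tiny catalog below are made-up inputs, not the paper's data:

```python
# Cube-bin hypocenters, link the cells of successive events, count
# out-degree including self-loops for the directed network, and drop
# directions and self-loops for the undirected one.

def build_networks(hypocenters, cell=10.0):
    """hypocenters: (x, y, z) positions in km, in time order."""
    nodes = [tuple(int(c // cell) for c in h) for h in hypocenters]
    k_out = {}                      # directed connectivity k_i
    undirected = set()              # unordered connected pairs
    for a, b in zip(nodes, nodes[1:]):
        k_out[a] = k_out.get(a, 0) + 1      # self-loops counted here
        if a != b:
            undirected.add(frozenset((a, b)))
    return k_out, undirected

hypos = [(1, 1, 5), (12, 3, 8), (1, 2, 6), (1, 1, 5), (25, 9, 30)]
k_out, und = build_networks(hypos)
```

Shrinking `cell` splits busy cells into many nodes while enlarging it merges them, which is why the abstract identifies cell size as a key factor in what the network can resolve.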

  10. Preferential attachment in evolutionary earthquake networks

    NASA Astrophysics Data System (ADS)

    Rezaei, Soghra; Moghaddasi, Hanieh; Darooneh, Amir Hossein

    2018-04-01

Earthquakes, as spatio-temporal complex systems, have recently been studied using complex network theory. Seismic networks are dynamical networks: the addition of new seismic events over time establishes new nodes and links in the network. Here we have constructed seismic networks for Iran and Italy based on the Hybrid Model and tested the preferential attachment hypothesis for the connection of new nodes, which states that newly added nodes are more likely to join highly connected nodes than less connected ones. We showed that preferential attachment is present in earthquake networks and that the attachment rate has a linear relationship with node degree. We have also found the seismically passive points, the points most likely to be influenced by other seismic places, using their preferential attachment values.

  11. Real-time earthquake monitoring using a search engine method

    PubMed Central

    Zhang, Jie; Zhang, Haijiang; Chen, Enhong; Zheng, Yi; Kuang, Wenhuan; Zhang, Xiong

    2014-01-01

    When an earthquake occurs, seismologists want to use recorded seismograms to infer its location, magnitude and source-focal mechanism as quickly as possible. If such information could be determined immediately, timely evacuations and emergency actions could be undertaken to mitigate earthquake damage. Current advanced methods can report the initial location and magnitude of an earthquake within a few seconds, but estimating the source-focal mechanism may require minutes to hours. Here we present an earthquake search engine, similar to a web search engine, that we developed by applying a computer fast search method to a large seismogram database to find waveforms that best fit the input data. Our method is several thousand times faster than an exact search. For an Mw 5.9 earthquake on 8 March 2012 in Xinjiang, China, the search engine can infer the earthquake’s parameters in <1 s after receiving the long-period surface wave data. PMID:25472861

  12. Microearthquake networks and earthquake prediction

    USGS Publications Warehouse

Lee, W.H.K.; Stewart, S. W.

    1979-01-01

A microearthquake network is a group of highly sensitive seismographic stations designed primarily to record local earthquakes of magnitudes less than 3. Depending on the application, a microearthquake network may consist of several stations or as many as a few hundred. Networks are usually classified as either permanent or temporary. In a permanent network, the seismic signal from each station is telemetered to a central recording site to cut down on operating costs and to allow more efficient and up-to-date processing of the data. However, telemetering can restrict the choice of station sites because of the line-of-sight requirement for radio transmission or the need for telephone lines. Temporary networks are designed to be extremely portable and completely self-contained so that they can be deployed very quickly. They are most valuable for recording aftershocks of a major earthquake or for studies in remote areas.

  13. EU H2020 SERA: Seismology and Earthquake Engineering Research Infrastructure Alliance for Europe

    NASA Astrophysics Data System (ADS)

    Giardini, Domenico; Saleh, Kauzar; SERA Consortium, the

    2017-04-01

SERA - Seismology and Earthquake Engineering Research Infrastructure Alliance for Europe - is a new infrastructure project awarded in the last Horizon 2020 call for Integrating Activities for Advanced Communities (INFRAIA-01-2016-2017). Building on precursor projects such as NERA, SHARE, NERIES and SERIES, SERA is expected to contribute significantly to the access of data, services and research infrastructures, and to develop innovative solutions in seismology and earthquake engineering, with the overall objective of reducing exposure to the risks associated with natural and anthropogenic earthquakes. For instance, SERA will revise the European seismic hazard reference model for input into the current revision of Eurocode 8 on the Seismic Design of Buildings; we also plan to develop the first comprehensive framework for seismic risk modeling at the European scale, and to develop new standards for future experimental observations and instruments for earthquake engineering and seismology. To that end, SERA engages 31 institutions across Europe with leading expertise in the operation of research facilities, monitoring infrastructures, data repositories and experimental facilities in the fields of seismology, anthropogenic hazards and earthquake engineering. SERA comprises 26 activities, including 5 Networking Activities (NA) to improve the availability and access of data through enhanced community coordination and pooling of resources, 6 Joint Research Activities (JRA) aimed at creating new European standards for the optimal use of the data collected by the European infrastructures, Virtual Access (VA) to the 5 main European services for seismology and engineering seismology, and Trans-national Access (TA) to 10 high-class experimental facilities for earthquake engineering and seismology in Europe. In fact, around 50% of the SERA resources will be dedicated to virtual and transnational access. SERA and EPOS (European Platform Observing System, a European Research

  14. Earthquake correlations and networks: A comparative study

    NASA Astrophysics Data System (ADS)

    Krishna Mohan, T. R.; Revathi, P. G.

    2011-04-01

We quantify the correlation between earthquakes and use it to extract causally connected earthquake pairs. Our correlation metric is a variation on the one introduced by Baiesi and Paczuski [M. Baiesi and M. Paczuski, Phys. Rev. E 69, 066106 (2004)]. A network of earthquakes is then constructed from the time-ordered catalog, with links between the more correlated ones. A list of recurrences for each earthquake is identified, employing correlation thresholds to demarcate the most meaningful ones in each cluster. Data pertaining to three different seismic regions (viz., California, Japan, and the Himalayas) are comparatively analyzed using such a network model. The distributions of recurrence lengths and recurrence times are two of the key features analyzed to draw conclusions about the universal aspects of such a network model. We find that the unimodal feature of the recurrence length distribution, which helps to associate typical rupture lengths with different magnitude earthquakes, is robust across the different seismic regions. The out-degree of the networks shows a hub structure rooted in the large magnitude earthquakes. The in-degree distribution is seen to depend on the density of events in the neighborhood. Power laws, with two regimes having different exponents, are obtained for the recurrence time distribution. The first regime confirms the Omori law for aftershocks, while the second regime, with a faster falloff for the larger recurrence times, establishes that pure spatial recurrences also follow a power-law distribution. The crossover to the second power-law regime can be taken as signaling the end of the aftershock regime in an objective fashion.
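A Baiesi-Paczuski-style correlation metric can be sketched as follows. The constants and the exact functional form here are illustrative; the paper uses its own variation:

```python
# For an earlier event i and a later event j, a small
# n_ij = t_ij * r_ij**df * 10**(-b * m_i) marks a strongly correlated
# (likely causally connected) pair; the correlation is ~ 1 / n_ij.
# b and df stand in for the b-value and a fractal dimension.

def n_metric(t_ij_s, r_ij_km, m_i, b=1.0, df=1.6):
    return t_ij_s * (r_ij_km ** df) * 10.0 ** (-b * m_i)

# A large, nearby, recent predecessor correlates far more strongly
# (smaller n) than a small, distant, long-ago one:
strong = n_metric(t_ij_s=3600.0, r_ij_km=5.0, m_i=6.0)
weak = n_metric(t_ij_s=3.15e7, r_ij_km=200.0, m_i=3.0)
```

Thresholding this metric over all earlier-later pairs is what yields the linked recurrence lists described in the abstract.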

  15. Revolutionising engineering education in the Middle East region to promote earthquake-disaster mitigation

    NASA Astrophysics Data System (ADS)

    Baytiyeh, Hoda; Naja, Mohamad K.

    2014-09-01

    Due to the high market demands for professional engineers in the Arab oil-producing countries, the appetite of Middle Eastern students for high-paying jobs and challenging careers in engineering has sharply increased. As a result, engineering programmes are providing opportunities for more students to enrol on engineering courses through lenient admission policies that do not compromise academic standards. This strategy has generated an influx of students who must be carefully educated to enhance their professional knowledge and social capital to assist in future earthquake-disaster risk-reduction efforts. However, the majority of Middle Eastern engineering students are unaware of the valuable acquired engineering skills and knowledge in building the resilience of their communities to earthquake disasters. As the majority of the countries in the Middle East are exposed to seismic hazards and are vulnerable to destructive earthquakes, engineers have become indispensable assets and the first line of defence against earthquake threats. This article highlights the contributions of some of the engineering innovations in advancing technologies and techniques for effective disaster mitigation and it calls for the incorporation of earthquake-disaster-mitigation education into academic engineering programmes in the Eastern Mediterranean region.

  16. Non-universal critical exponents in earthquake complex networks

    NASA Astrophysics Data System (ADS)

    Pastén, Denisse; Torres, Felipe; Toledo, Benjamín A.; Muñoz, Víctor; Rogan, José; Valdivia, Juan Alejandro

    2018-02-01

The problem of universality of critical exponents in complex networks is studied based on networks built from seismic data sets. Using two data sets corresponding to Chilean seismicity (the northern zone, including the 2014 Mw = 8.2 earthquake in Iquique, and the central zone, without major earthquakes), directed networks for each set are constructed. Connectivity and betweenness centrality distributions are calculated and found to be scale-free, with respective exponents γ and δ. The expected relation between both characteristic exponents, δ > (γ + 1)/2, is verified for both data sets. However, unlike the expectation for certain scale-free analytical complex networks, the value of δ is found to be non-universal.
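Exponents such as γ are typically estimated from sampled degrees with a maximum-likelihood (Hill-type) estimator; the synthetic Pareto sample and k_min choice below are our assumptions, not the paper's data:

```python
# Continuous MLE for a power-law tail with density ~ k**(-gamma):
# gamma_hat = 1 + n / sum(ln(k_i / k_min)).
import math
import random

random.seed(7)

def gamma_mle(values, k_min=1.0):
    tail = [k for k in values if k >= k_min]
    return 1.0 + len(tail) / sum(math.log(k / k_min) for k in tail)

true_gamma = 2.5
# Pareto sample with exponent true_gamma, drawn by CDF inversion:
sample = [(1.0 - random.random()) ** (-1.0 / (true_gamma - 1.0))
          for _ in range(20000)]
est = gamma_mle(sample)             # should recover ~2.5
```

Comparing such fitted exponents across catalogs (here, the two Chilean zones) is what makes a universality claim testable.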

  17. 10 CFR Appendix S to Part 50 - Earthquake Engineering Criteria for Nuclear Power Plants

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 1 2012-01-01 2012-01-01 false Earthquake Engineering Criteria for Nuclear Power Plants S... FACILITIES Pt. 50, App. S Appendix S to Part 50—Earthquake Engineering Criteria for Nuclear Power Plants... applicant or holder whose construction permit was issued before January 10, 1997, the earthquake engineering...

  18. 10 CFR Appendix S to Part 50 - Earthquake Engineering Criteria for Nuclear Power Plants

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 1 2013-01-01 2013-01-01 false Earthquake Engineering Criteria for Nuclear Power Plants S... FACILITIES Pt. 50, App. S Appendix S to Part 50—Earthquake Engineering Criteria for Nuclear Power Plants... applicant or holder whose construction permit was issued before January 10, 1997, the earthquake engineering...

  19. 10 CFR Appendix S to Part 50 - Earthquake Engineering Criteria for Nuclear Power Plants

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 1 2011-01-01 2011-01-01 false Earthquake Engineering Criteria for Nuclear Power Plants S... FACILITIES Pt. 50, App. S Appendix S to Part 50—Earthquake Engineering Criteria for Nuclear Power Plants... applicant or holder whose construction permit was issued before January 10, 1997, the earthquake engineering...

  20. 10 CFR Appendix S to Part 50 - Earthquake Engineering Criteria for Nuclear Power Plants

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 1 2014-01-01 2014-01-01 false Earthquake Engineering Criteria for Nuclear Power Plants S... FACILITIES Pt. 50, App. S Appendix S to Part 50—Earthquake Engineering Criteria for Nuclear Power Plants... applicant or holder whose construction permit was issued before January 10, 1997, the earthquake engineering...

  1. The Electronic Encyclopedia of Earthquakes

    NASA Astrophysics Data System (ADS)

    Benthien, M.; Marquis, J.; Jordan, T.

    2003-12-01

The Electronic Encyclopedia of Earthquakes is a collaborative project of the Southern California Earthquake Center (SCEC), the Consortium of Universities for Research in Earthquake Engineering (CUREE) and the Incorporated Research Institutions for Seismology (IRIS). This digital library organizes earthquake information online as a partner with the NSF-funded National Science, Technology, Engineering and Mathematics (STEM) Digital Library (NSDL) and the Digital Library for Earth System Education (DLESE). When complete, information and resources for over 500 Earth science and engineering topics will be included, with connections to curricular materials useful for teaching Earth science, engineering, physics and mathematics. Although conceived primarily as an educational resource, the Encyclopedia is also a valuable portal for anyone seeking up-to-date earthquake information and authoritative technical sources. "E3" is a unique collaboration among earthquake scientists and engineers to articulate and document a common knowledge base with a shared terminology and conceptual framework. It is a platform for cross-training scientists and engineers in these complementary fields and will provide a basis for sustained communication and resource-building between major education and outreach activities. For example, the E3 collaborating organizations have leadership roles in the two largest earthquake engineering and earth science projects ever sponsored by NSF: the George E. Brown Network for Earthquake Engineering Simulation (CUREE) and the EarthScope Project (IRIS and SCEC). The E3 vocabulary and definitions are also being connected to a formal ontology under development by the SCEC/ITR project for knowledge management within the SCEC Collaboratory. The E3 development system is now fully operational, 165 entries are in the pipeline, and the development teams are capable of producing 20 new, fully reviewed encyclopedia entries each month. Over the next two years teams will

  2. The Engineering Strong Ground Motion Network of the National Autonomous University of Mexico

    NASA Astrophysics Data System (ADS)

    Velasco Miranda, J. M.; Ramirez-Guzman, L.; Aguilar Calderon, L. A.; Almora Mata, D.; Ayala Hernandez, M.; Castro Parra, G.; Molina Avila, I.; Mora, A.; Torres Noguez, M.; Vazquez Larquet, R.

    2014-12-01

The coverage, design, operation and monitoring capabilities of the strong ground motion program at the Institute of Engineering (IE) of the National Autonomous University of Mexico (UNAM) are presented. Started in 1952, the seismic instrumentation, initially intended to bolster earthquake engineering projects in Mexico City, has evolved into the largest strong ground motion monitoring system in the region. Today, it provides information not only to engineering projects but also to the near-real-time risk mitigation systems of the country, and enhances the general understanding of the effects and causes of earthquakes in Mexico. The IE network includes more than 100 free-field stations and several buildings, covering the largest urban centers and zones of significant seismicity in Central Mexico. Approximately one-fourth of those stations send the observed acceleration continuously to a processing center in Mexico City; the rest require either periodic visits for manual recovery of the data or remote interrogation, for later processing and cataloging. In this research, we document the procedures and telecommunications systems used systematically to recover information. Additionally, we analyze the spatial distribution of the free-field accelerographs, the quality of the instrumentation, and the recorded ground motions. The evaluation criteria are based on: 1) the uncertainty in the generation of ground motion parameter maps due to the spatial distribution of the stations, 2) the potential of the array to provide location and magnitude estimates for earthquakes with magnitudes greater than Mw 5, and 3) the adequacy of the network for the development of Ground Motion Prediction Equations for intra-plate and intra-slab earthquakes. We conclude that the monitoring system requires a redistribution of stations, additional stations, and a substantial improvement in the instrumentation and telecommunications.
Finally, we present an integral plan to improve the current network

  3. Urban MEMS based seismic network for post-earthquakes rapid disaster assessment

    NASA Astrophysics Data System (ADS)

    D'Alessandro, A.; Luzio, D.; D'Anna, G.

    2014-09-01

In this paper, we introduce a project to realize the first European real-time urban seismic network based on Micro Electro-Mechanical Systems (MEMS) technology. MEMS accelerometers are a highly enabling technology; nowadays, the sensitivity and dynamic range of these sensors are such as to allow the recording of earthquakes of moderate magnitude even at distances of several tens of kilometers. Moreover, thanks to their low cost and small size, MEMS accelerometers can easily be installed in urban areas to achieve an urban seismic network with a high density of observation points. The network is being implemented in the Acireale Municipality (Sicily, Italy), an area with among the highest seismic hazard, vulnerability and exposure in Italian territory. The main objective of the urban network will be an effective system for post-earthquake rapid disaster assessment. The recorded earthquakes, including those of moderate magnitude, will be used for effective seismic microzonation of the area covered by the network. The implemented system will also be used to realize a site-specific earthquake early warning system.

  4. Research in seismology and earthquake engineering in Venezuela

    USGS Publications Warehouse

    Urbina, L.; Grases, J.

    1983-01-01

After the July 29, 1967, damaging earthquake (with a moderate magnitude of 6.3) caused widespread damage to the northern coastal area of Venezuela and to the Caracas Valley, the Venezuelan Government decided to establish a Presidential Earthquake Commission. This commission undertook the task of coordinating the efforts to study the after-effects of the earthquake. The July 1967 earthquake claimed numerous lives and caused extensive damage to the capital of Venezuela. In 1968, the U.S. Geological Survey conducted a seismological field study in the northern coastal area and in the Caracas Valley of Venezuela. The objective was to study the areas that sustained severe, moderate, and no damage to structures. A report entitled Ground Amplification Studies in Earthquake Damage Areas: The Caracas Earthquake of 1967 documented, for the first time, short-period seismic wave ground-motion amplifications in the Caracas Valley. Figure 1 shows the area of severe damage in the Los Palos Grandes suburb and its correlation with the depth of alluvium; the Arabic numbers denote the ground amplification factor at each site in the area. The Venezuelan Government initiated many programs to study the damage in detail and to investigate the ongoing construction practices. These actions motivated professionals in the academic, private, and government sectors to develop further capabilities and self-sufficiency in the fields of engineering and seismology. Funds were allocated to assist in training professionals and technicians and in developing new seismological stations and new national-level programs in earthquake engineering and seismology. A brief description of the ongoing programs in Venezuela is given below. These programs are being performed by FUNVISIS and by other national organizations listed at the end of this article.

  5. Study of earthquakes using a borehole seismic network at Koyna, India

    NASA Astrophysics Data System (ADS)

    Gupta, Harsh; Satyanarayana, Hari VS; Shashidhar, Dodla; Mallika, Kothamasu; Ranjan Mahato, Chitta; Shankar Maity, Bhavani

    2017-04-01

Koyna, located near the west coast of India, is a classical site of artificial-water-reservoir-triggered earthquakes. Triggered earthquakes started soon after the impoundment of the Koyna Dam in 1962, and the activity has continued to the present, including the largest triggered earthquake of M 6.3 in 1967, 22 earthquakes of M ≥ 5, and several thousand smaller earthquakes. The latest significant earthquake, of ML 3.7, occurred on 24 November 2016. In spite of a network of 23 broadband three-component seismic stations in the near vicinity of the Koyna earthquake zone, earthquake locations had errors of 1 km. The main reason was the presence of a 1 km thick, very heterogeneous Deccan Traps cover that introduced noise, so locations could not be improved. To improve the accuracy of earthquake locations, a unique network of eight borehole seismic stations surrounding the seismicity was designed. Six of these were installed at depths varying from 981 m to 1522 m during 2015 and 2016, well below the Deccan Traps cover. During 2016 a total of 2100 earthquakes were located. There has been a significant improvement in the locations, and the absolute location errors have come down to ± 300 m. All earthquakes of ML ≥ 0.5 are now located, compared with ML ≥ 1.0 earlier. Based on seismicity and logistics, a block of 2 km x 2 km has been chosen for the 3 km deep pilot borehole. The installation of the borehole seismic network has further elucidated the correspondence between the rate of water loading/unloading of the reservoir and triggered seismicity.

  6. MyShake: A smartphone seismic network for earthquake early warning and beyond.

    PubMed

    Kong, Qingkai; Allen, Richard M; Schreier, Louis; Kwon, Young-Woo

    2016-02-01

    Large magnitude earthquakes in urban environments continue to kill and injure tens to hundreds of thousands of people, inflicting lasting societal and economic disasters. Earthquake early warning (EEW) provides seconds to minutes of warning, allowing people to move to safe zones and automated slowdown and shutdown of transit and other machinery. The handful of EEW systems operating around the world use traditional seismic and geodetic networks that exist only in a few nations. Smartphones are much more prevalent than traditional networks and contain accelerometers that can also be used to detect earthquakes. We report on the development of a new type of seismic system, MyShake, that harnesses personal/private smartphone sensors to collect data and analyze earthquakes. We show that smartphones can record magnitude 5 earthquakes at distances of 10 km or less and develop an on-phone detection capability to separate earthquakes from other everyday shakes. Our proof-of-concept system then collects earthquake data at a central site where a network detection algorithm confirms that an earthquake is under way and estimates the location and magnitude in real time. This information can then be used to issue an alert of forthcoming ground shaking. MyShake could be used to enhance EEW in regions with traditional networks and could provide the only EEW capability in regions without. In addition, the seismic waveforms recorded could be used to deliver rapid microseism maps, study impacts on buildings, and possibly image shallow earth structure and earthquake rupture kinematics.
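One classic trigger idea, the short-term-average/long-term-average (STA/LTA) ratio, hints at how sudden sustained shaking is separated from steady background motion. This is a generic detector sketch, not MyShake's actual on-phone classifier:

```python
# The ratio of a short-term to a long-term average of signal amplitude
# spikes when abrupt, sustained shaking arrives, while staying near 1
# for steady background motion.

def sta_lta(signal, short=5, long_=50):
    ratios = []
    for i in range(long_, len(signal)):
        sta = sum(abs(x) for x in signal[i - short:i]) / short
        lta = sum(abs(x) for x in signal[i - long_:i]) / long_
        ratios.append(sta / lta if lta else 0.0)
    return ratios

quiet = [0.01] * 100                # steady background
quake = quiet + [1.0] * 30          # abrupt, sustained shaking
peak_ratio = max(sta_lta(quake))
```

On a phone, a per-device trigger like this only raises a candidate detection; as the abstract describes, a network-level algorithm at the central site confirms that many phones triggered together before issuing an alert.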

  7. Urban MEMS based seismic network for post-earthquakes rapid disaster assessment

    NASA Astrophysics Data System (ADS)

    D'Alessandro, Antonino; Luzio, Dario; D'Anna, Giuseppe

    2014-05-01

    Life losses following a disastrous earthquake depend mainly on building vulnerability, the intensity of shaking, and the timeliness of rescue operations. In recent decades, increases in population and industrial density have significantly raised the earthquake exposure of urban areas. The potential impact of a strong earthquake on a town center can be reduced by timely and correct actions of the emergency management centers. A real-time urban seismic network can drastically reduce casualties immediately following a strong earthquake by promptly providing information about the distribution of ground shaking levels. Emergency management centers, operating in the immediate post-earthquake period, could use this information to allocate and prioritize resources so as to minimize the loss of human life. However, due to the high cost of seismological instrumentation, urban seismic networks capable of reducing the rate of fatalities have not been realized. Recent developments in MEMS (Micro Electro-Mechanical Systems) technology could today allow the realization of a high-density urban seismic network for rapid post-earthquake disaster assessment, suitable for mitigating earthquake effects. In the 1990s, MEMS accelerometers revolutionized the automotive airbag industry, and today they are widely used in laptops, game controllers and mobile phones. Owing to this great commercial success, research into and development of MEMS accelerometers are actively pursued around the world. Nowadays, the sensitivity and dynamic range of these sensors allow accurate recording of earthquakes of moderate to strong magnitude. Due to their low cost and small size, MEMS accelerometers may be employed to build high-density seismic networks. The MEMS accelerometers could be installed inside sensitive places (high vulnerability and exposure), such as schools, hospitals, public buildings and places of

  9. Ergodicity in natural earthquake fault networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tiampo, K. F.; Rundle, J. B.; Holliday, J.

    2007-06-15

    Numerical simulations have shown that certain driven nonlinear systems can be characterized by mean-field statistical properties often associated with ergodic dynamics [C. D. Ferguson, W. Klein, and J. B. Rundle, Phys. Rev. E 60, 1359 (1999); D. Egolf, Science 287, 101 (2000)]. These driven mean-field threshold systems feature long-range interactions and can be treated as equilibriumlike systems with statistically stationary dynamics over long time intervals. Recently the equilibrium property of ergodicity was identified in an earthquake fault system, a natural driven threshold system, by means of the Thirumalai-Mountain (TM) fluctuation metric developed in the study of diffusive systems [K. F. Tiampo, J. B. Rundle, W. Klein, J. S. Sa Martins, and C. D. Ferguson, Phys. Rev. Lett. 91, 238501 (2003)]. We analyze the seismicity of three naturally occurring earthquake fault networks from a variety of tectonic settings in an attempt to investigate the range of applicability of effective ergodicity, using the TM metric and other related statistics. Results suggest that, once variations in the catalog data resulting from technical and network issues are accounted for, all of these natural earthquake systems display stationary periods of metastable equilibrium and effective ergodicity that are disrupted by large events. We conclude that a constant rate of events is an important prerequisite for these periods of punctuated ergodicity and that, while the level of temporal variability in the spatial statistics is the controlling factor in the ergodic behavior of seismic networks, no single statistic is sufficient to ensure quantification of ergodicity. Ergodicity in this application not only requires that the system be stationary for these networks at the applicable spatial and temporal scales, but also implies that they are in a state of metastable equilibrium, one in which the ensemble averages can be substituted for temporal averages in studying their
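    The Thirumalai-Mountain fluctuation metric used above can be sketched in a few lines. This is a toy illustration under stated assumptions, not the authors' implementation: per-cell event counts stand in for the seismic energy release actually used, and all inputs are hypothetical.

```python
def tm_metric(event_counts):
    """Thirumalai-Mountain (TM) fluctuation metric for a gridded seismicity history.

    event_counts: list of per-timestep lists, event_counts[t][i] = number of
    events in grid cell i during interval t (a toy stand-in for energy release).
    Returns omega[t], the variance over cells of the per-cell time averages:
    effective ergodicity shows up as omega decaying roughly as 1/t.
    """
    n_cells = len(event_counts[0])
    running = [0.0] * n_cells          # cumulative count per cell
    omega = []
    for t, counts in enumerate(event_counts, start=1):
        for i, c in enumerate(counts):
            running[i] += c
        time_avg = [s / t for s in running]      # per-cell time average
        ens_avg = sum(time_avg) / n_cells        # ensemble average
        omega.append(sum((x - ens_avg) ** 2 for x in time_avg) / n_cells)
    return omega
```

    For a spatially uniform, stationary rate the metric stays near zero, while a large event concentrated in one cell produces a jump that only slowly averages out, mirroring the "punctuated ergodicity" described above.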

  10. Welcome to Pacific Earthquake Engineering Research Center - PEER

    Science.gov Websites

    Triggering and Effects at Silty Soil Sites" - PEER Research Project Highlight: "Dissipative Base ; Upcoming Events More June 10-13, 2018 Geotechnical Earthquake Engineering and Soil Dynamics V 2018 - Call

  11. The Ordered Network Structure and Prediction Summary for M≥7 Earthquakes in Xinjiang Region of China

    NASA Astrophysics Data System (ADS)

    Men, Ke-Pei; Zhao, Kai

    2014-12-01

    M ≥ 7 earthquakes have shown an obvious commensurability and orderliness in Xinjiang, China, and its adjacent region since 1800. The main orderly values are 30 a × k (k = 1, 2, 3), 11-12 a, 41-43 a, 18-19 a, and 5-6 a. Guided by the information forecasting theory of Wen-Bo Weng and building on previous research results, we combine ordered network structure analysis with complex network technology, focusing on the prediction summary of M ≥ 7 earthquakes by means of the ordered network structure, and add new information to further optimize the network, constructing 2D and 3D ordered network structures of M ≥ 7 earthquakes. The network structure fully reveals the regularity of M ≥ 7 seismic activity in the study region over the past 210 years. On this basis, the Karakorum M7.1 earthquake of 1996, the M7.9 earthquake on the frontier of Russia, Mongolia, and China in 2003, and the two Yutian M7.3 earthquakes of 2008 and 2014 were predicted successfully. At the same time, a new prediction opinion is presented: the next two M ≥ 7 earthquakes will probably occur around 2019-2020 and 2025-2026 in this region. The results show that large earthquakes occurring in a defined region can be predicted; the method of ordered network structure analysis produces satisfactory results for the mid- and long-term prediction of M ≥ 7 earthquakes.

  12. The wireless networking system of Earthquake precursor mobile field observation

    NASA Astrophysics Data System (ADS)

    Wang, C.; Teng, Y.; Wang, X.; Fan, X.; Wang, X.

    2012-12-01

    A mobile field observation network can record and transmit large amounts of data reliably and in real time, strengthening physical-signal observations in specific regions and periods and thereby improving monitoring capacity and anomaly-tracking capability. Because current earthquake precursor observation points are numerous and widely scattered, the networking technology is based on the McWILL broadband wireless access system. The communication system for earthquake precursor mobile field observation transmits large amounts of data reliably and in real time from the measuring points to the monitoring center, through the connections between the instruments, the wireless access system, the broadband wireless access system, and the precursor mobile observation management center system, thereby implementing remote instrument monitoring and data transmission. At present, this technology has been applied to fluxgate magnetometer array geomagnetic observations at Tianzhu, Xichang, and Xinjiang, providing real-time monitoring of the working status of instruments deployed over large areas during the last two to three years of large-scale field operation. It can therefore deliver geomagnetic field data for locally refined regions and provide high-quality observational data for impending-earthquake tracking and forecasting. Although wireless networking technology is well suited to mobile field observation, being simple and flexible to deploy, it also suffers packet loss when transmitting large volumes of observational data, owing to relatively weak wireless signals and narrow bandwidth. For high-sampling-rate instruments, this project uses data compression, which effectively solves the problem of packet loss during data transmission; control commands, status data, and observational data are transmitted with different priorities and methods, which keep the packet loss rate within

  13. Earthquake Complex Network applied along the Chilean Subduction Zone.

    NASA Astrophysics Data System (ADS)

    Martin, F.; Pasten, D.; Comte, D.

    2017-12-01

    In recent years, earthquake complex networks have been used as a useful tool to describe and characterize the behavior of seismicity. The earthquake complex network is built in space by dividing the three-dimensional volume into cubic cells: a cell that contains a hypocenter becomes a node, and connections between nodes follow the time sequence of the seismic events. In this way we obtain a spatio-temporal configuration of a specific region based on its seismicity. In this work, we apply complex networks to characterize the subduction zone along the coast of Chile using two networks: a directed and an undirected network. The directed network takes into account the time direction of the connections, which is very important for the connectivity of the network: the connectivity k_i of the i-th node is the number of connections going out of node i, including self-connections (if two successive seismic events occur in the same cubic cell, we have a self-connection). The undirected network results from removing the directions and the self-connections from the directed network. These two networks were built using seismic events recorded by the CSN (Chilean Seismological Center) in Chile. The analysis includes the recent large earthquakes in Iquique (April 2014) and Illapel (September 2015). For the directed network, the results show a change in the value of the critical exponent along the Chilean coast; for the undirected network, they show small-world behavior without important changes in the topology of the network. The complex network analysis therefore offers a new and simple way to characterize the Chilean subduction zone, one that can be compared with other methods to learn more about the behavior of seismicity in this region.
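    The construction described above can be sketched directly: map each hypocenter to a cubic cell, link successive events, count out-degrees with self-connections, and derive the undirected edge set. The cell size and coordinates below are hypothetical; this is a sketch, not the authors' code.

```python
from collections import defaultdict

def build_earthquake_network(hypocenters, cell_size):
    """Build a directed earthquake network and its undirected counterpart.

    hypocenters: time-ordered (x, y, z) coordinates, e.g. in km.
    cell_size: edge length of the cubic cells, same units.
    Returns (out_degree, undirected_edges): out-degree per cell including
    self-connections, and the undirected edge set with self-loops removed.
    """
    # assign each event to the cubic cell containing its hypocenter
    cells = [tuple(int(c // cell_size) for c in h) for h in hypocenters]
    out_degree = defaultdict(int)
    undirected = set()
    for a, b in zip(cells, cells[1:]):   # links follow the time sequence
        out_degree[a] += 1               # self-connections counted here too
        if a != b:
            undirected.add(frozenset((a, b)))
    return dict(out_degree), undirected
```

    The degree distribution of `out_degree` is what the critical-exponent analysis above would be fit to; the `undirected` edge set is the input for small-world statistics.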

  14. A new algorithm to detect earthquakes outside the seismic network: preliminary results

    NASA Astrophysics Data System (ADS)

    Giudicepietro, Flora; Esposito, Antonietta Maria; Ricciolino, Patrizia

    2017-04-01

    We present a new technique for detecting earthquakes outside a seismic network, which are a frequent cause of failure for automatic analysis systems. Our goal is to develop a robust method that provides the discrimination result as quickly as possible. We discriminate local earthquakes from regional earthquakes, both recorded at the SGG station, equipped with short-period sensors and operated by the Osservatorio Vesuviano (INGV) in the Southern Apennines (Italy). The technique uses a Multi Layer Perceptron (MLP) neural network with an architecture composed of an input layer, a hidden layer, and a single-node output layer. We pre-processed the data using the Linear Predictive Coding (LPC) technique to extract the spectral features of the signals in a compact form. We performed several experiments, progressively shortening the signal window: windows of 4, 2, and 1 seconds containing the onsets of the local and regional earthquakes. We used a dataset of 103 local earthquakes and 79 regional earthquakes, most of which occurred in Greece, Albania, and Crete. We split the dataset into a training set, for network training, and a testing set to evaluate the network's capacity for discrimination. To assess the network's stability, we repeated this procedure six times, randomly changing the composition of the training and testing sets and the initial weights of the net. We estimated the performance of the method as the average percentage of correct detections over the six permutations: 99.02%, 98.04%, and 98.53% for the experiments on 4, 2, and 1 second signal windows, respectively. The results show that our method is able to recognize earthquakes outside the seismic network using only the first second of the seismic records, with a suitable percentage of correct detections. Therefore, this algorithm can be profitably used to make earthquake automatic
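    The LPC pre-processing step above can be sketched with a pure-Python Levinson-Durbin recursion. This is a generic textbook formulation, not the authors' code, and the model order used below is illustrative.

```python
def lpc_coefficients(signal, order):
    """Linear Predictive Coding coefficients via the Levinson-Durbin recursion.

    Compresses a short seismogram window into `order` spectral features,
    in the spirit of the LPC pre-processing described above.
    """
    n = len(signal)
    # autocorrelation at lags 0..order
    r = [sum(signal[i] * signal[i + k] for i in range(n - k))
         for k in range(order + 1)]
    a = [0.0] * (order + 1)   # prediction coefficients; a[0] unused
    err = r[0]
    for k in range(1, order + 1):
        acc = r[k] - sum(a[j] * r[k - j] for j in range(1, k))
        ref = acc / err if err else 0.0   # reflection coefficient
        prev = a[:]
        a[k] = ref
        for j in range(1, k):
            a[j] = prev[j] - ref * prev[k - j]
        err *= (1.0 - ref * ref)
    return a[1:]
```

    The resulting coefficient vector (one per model order) is the compact feature vector that would feed the MLP input layer.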

  15. GeoMO 2008--geotechnical earthquake engineering : site response.

    DOT National Transportation Integrated Search

    2008-10-01

    The theme of GeoMO2008 has recently become of more interest to the Midwest civil engineering community due to the perceived earthquake risks and new code requirements. The constant seismic reminder for the New Madrid Seismic Zone and new USGS hazard ...

  16. 78 FR 775 - Goodman Networks, Inc. Core Network Engineering (Deployment Engineering) Division Alpharetta, GA...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-04

    ...,846B; TA-W-81,846C; TA-W-81,846D] Goodman Networks, Inc. Core Network Engineering (Deployment Engineering) Division Alpharetta, GA; Goodman Networks, Inc. Core Network Engineering (Deployment Engineering) Division Hunt Valley, MD; Goodman Networks, Inc. Core Network Engineering (Deployment Engineering) Division...

  17. 78 FR 12359 - Goodman Networks, Inc., Core Network Engineering (Deployment Engineering) Division Including...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-22

    ...., Core Network Engineering (Deployment Engineering) Division Including Workers in the Core Network Engineering (Deployment Engineering) Division in Alpharetta, GA, Hunt Valley, MD, Naperville, IL, and St... Reconsideration applicable to workers and former workers of Goodman Networks, Inc., Core Network Engineering...

  18. Aftershocks of the India Republic Day Earthquake: the MAEC/ISTAR Temporary Seismograph Network

    NASA Astrophysics Data System (ADS)

    Bodin, P.; Horton, S.; Johnston, A.; Patterson, G.; Bollwerk, J.; Rydelek, P.; Steiner, G.; McGoldrick, C.; Budhbhatti, K. P.; Shah, R.; Macwan, N.

    2001-05-01

    The MW=7.7 Republic Day (26 January 2001) earthquake in the Kachchh region of western India initiated a strong sequence of small aftershocks. Seventeen days following the mainshock, we deployed a network of portable digital event recorders as a cooperative project of the Mid America Earthquake Center in the US and the Institute for Scientific and Technological Advanced Research. Our network consisted of 8 event-triggered Kinemetrics K2 seismographs with 6 data channels (3 accelerometer, 3 Mark L-28/3d seismometer) sampled at 200 Hz, and one continuously-recording Guralp CMG40TD broad-band seismometer sampled at 220 Hz. This network was in place for 18 days. Underlying our network deployment was the notion that, because of its tectonic and geologic setting, the Republic Day earthquake and its aftershocks might have source and/or propagation characteristics common to earthquakes in stable continental plate-interiors rather than those on plate boundaries or within continental mobile belts. Thus, our goals were to provide data that could be used to compare the Republic Day earthquake with other earthquakes. In particular, the objectives of our network deployment were: (1) to characterize the spatial distribution and occurrence rates of aftershocks, (2) to examine source characteristics of the aftershocks (stress-drops, focal mechanisms), (3) to study the effect of deep unconsolidated sediment on wave propagation, and (4) to determine if other faults (notably the Allah Bundh) were simultaneously active. Most of our sites were on Jurassic bedrock, and all were either free-field, or on the floor of light structures built on rock or with a thin soil cover. However, one of our stations was on a section of unconsolidated sediments hundreds of meters thick, adjacent to a site that was subjected to shaking-induced sediment liquefaction during the mainshock. The largest aftershock reported by global networks was an MW=5.9 event on January 28, prior to our deployment. The largest

  19. Rapid Modeling of and Response to Large Earthquakes Using Real-Time GPS Networks (Invited)

    NASA Astrophysics Data System (ADS)

    Crowell, B. W.; Bock, Y.; Squibb, M. B.

    2010-12-01

    Real-time GPS networks have the advantage of capturing motions throughout the entire earthquake cycle (interseismic, seismic, coseismic, postseismic), and because of this, are ideal for real-time monitoring of fault slip in the region. Real-time GPS networks provide the perfect supplement to seismic networks, which operate with lower noise and higher sampling rates than GPS networks, but only measure accelerations or velocities, putting them at a supreme disadvantage for ascertaining the full extent of slip during a large earthquake in real-time. Here we report on two examples of rapid modeling of recent large earthquakes near large regional real-time GPS networks. The first utilizes Japan’s GEONET consisting of about 1200 stations during the 2003 Mw 8.3 Tokachi-Oki earthquake about 100 km offshore Hokkaido Island and the second investigates the 2010 Mw 7.2 El Mayor-Cucapah earthquake recorded by more than 100 stations in the California Real Time Network. The principal components of strain were computed throughout the networks and utilized as a trigger to initiate earthquake modeling. Total displacement waveforms were then computed in a simulated real-time fashion using a real-time network adjustment algorithm that fixes a station far away from the rupture to obtain a stable reference frame. Initial peak ground displacement measurements can then be used to obtain an initial size through scaling relationships. Finally, a full coseismic model of the event can be run minutes after the event, given predefined fault geometries, allowing emergency first responders and researchers to pinpoint the regions of highest damage. Furthermore, we are also investigating using total displacement waveforms for real-time moment tensor inversions to look at spatiotemporal variations in slip.
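    The step from peak ground displacement (PGD) to an initial magnitude estimate can be illustrated with a hedged sketch. The functional form below is one commonly used for PGD scaling; the coefficients are placeholders chosen for illustration, not the published regression values.

```python
import math

# Illustrative regression coefficients (placeholders, not published values)
# for a scaling law of the form log10(PGD) = A + B*M + C*M*log10(R),
# with PGD in cm and hypocentral distance R in km.
A, B, C = -5.0, 1.2, -0.17

def magnitude_from_pgd(pgd_cm, hypo_dist_km):
    """Invert the PGD scaling law for magnitude, given the peak total
    displacement (cm) observed at hypocentral distance R (km)."""
    return (math.log10(pgd_cm) - A) / (B + C * math.log10(hypo_dist_km))
```

    In a real-time setting one would evaluate this at many stations and take a network median, updating the estimate as the displacement waveforms grow.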

  20. Real-Time Earthquake Analysis for Disaster Mitigation (READI) Network

    NASA Astrophysics Data System (ADS)

    Bock, Y.

    2014-12-01

    Real-time GNSS networks are making a significant impact on our ability to forecast, assess, and mitigate the effects of geological hazards. I describe the activities of the Real-time Earthquake Analysis for Disaster Mitigation (READI) working group. The group leverages 600+ real-time GPS stations in western North America operated by UNAVCO (PBO network), Central Washington University (PANGA), US Geological Survey & Scripps Institution of Oceanography (SCIGN project), UC Berkeley & US Geological Survey (BARD network), and the Pacific Geosciences Centre (WCDA project). Our goal is to demonstrate an earthquake and tsunami early warning system for western North America. Rapid response is particularly important for those coastal communities that are in the near-source region of large earthquakes and may have only minutes of warning time, and who today are not adequately covered by existing seismic and basin-wide ocean-buoy monitoring systems. The READI working group is performing comparisons of independent real time analyses of 1 Hz GPS data for station displacements and is participating in government-sponsored earthquake and tsunami exercises in the Western U.S. I describe a prototype seismogeodetic system using a cluster of southern California stations that includes GNSS tracking and collocation with MEMS accelerometers for real-time estimation of seismic velocity and displacement waveforms, which has advantages for improved earthquake early warning and tsunami forecasts compared to seismic-only or GPS-only methods. The READI working group's ultimate goal is to participate in an Indo-Pacific Tsunami early warning system that utilizes GNSS real-time displacements and ionospheric measurements along with seismic, near-shore buoys and ocean-bottom pressure sensors, where available, to rapidly estimate magnitude and finite fault slip models for large earthquakes, and then forecast tsunami source, energy scale, geographic extent, inundation and runup. This will require

  1. The Lice, Turkey, earthquake of September 6, 1975; a preliminary engineering investigation

    USGS Publications Warehouse

    Yanev, P. I.

    1976-01-01

    The Fifth European Conference on Earthquake Engineering was held on September 22 through 25 in Istanbul, Turkey. The opening speech by the Honorable H. E. Nurettin Ok, Minister of Reconstruction and Resettlement of Turkey, introduced the several hundred delegates to the realities of earthquake hazards in Turkey:

  2. Earthquakes Magnitude Predication Using Artificial Neural Network in Northern Red Sea Area

    NASA Astrophysics Data System (ADS)

    Alarifi, A. S.; Alarifi, N. S.

    2009-12-01

    Earthquakes are natural hazards that do not happen very often, but they may cause huge losses of life and property. Early preparation for these hazards is a key factor in reducing their damage and consequences. Since early times, people have tried to predict earthquakes from simple observations such as strange or atypical animal behavior. In this paper, we study data collected from an existing earthquake catalogue to better forecast future earthquakes. The 16,000 events cover the time span 1970 to 2009, with magnitudes from greater than 0 to less than 7.2 and depths from greater than 0 to less than 100 km. We propose a new artificial-intelligence prediction system based on an artificial neural network, which can be used to predict the magnitude of future earthquakes in the northern Red Sea area, including the Sinai Peninsula, the Gulf of Aqaba, and the Gulf of Suez. We propose a feed-forward neural network model with multiple hidden layers to predict earthquake occurrences and magnitudes in the northern Red Sea area. Although similar models have been published for other areas, to the best of our knowledge this is the first neural network model for predicting earthquakes in the northern Red Sea area. Furthermore, we present other forecasting methods, such as moving averages over different intervals, a normally distributed random predictor, and a uniformly distributed random predictor. In addition, we present different statistical methods and data fits, such as linear, quadratic, and cubic regression. We present a detailed performance analysis of the proposed methods under different evaluation metrics. The results show that the neural network model provides higher forecast accuracy than the other proposed methods, achieving an average absolute error of 2.6%, compared with 3.8%, 7.3%, and 6.17% for the moving average, linear regression, and cubic regression, respectively. In this work, we show an analysis
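    The moving-average baseline and the average-absolute-error metric used in the comparison above can be sketched as follows (the window length and the toy series are illustrative; this is not the authors' code).

```python
def moving_average_forecast(series, window):
    """Predict each value as the mean of the preceding `window` values.
    Returns one forecast per position from index `window` onward."""
    return [sum(series[i - window:i]) / window
            for i in range(window, len(series))]

def mean_abs_error(actual, predicted):
    """Average absolute error between observed and forecast values."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)
```

    Baselines like this matter because a neural network's reported accuracy is only meaningful relative to what a trivial predictor achieves on the same catalogue.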

  3. Earthquake Monitoring: SeisComp3 at the Swiss National Seismic Network

    NASA Astrophysics Data System (ADS)

    Clinton, J. F.; Diehl, T.; Cauzzi, C.; Kaestli, P.

    2011-12-01

    The Swiss Seismological Service (SED) has an ongoing responsibility to improve the seismicity monitoring capability for Switzerland. This is a crucial issue for a country with low background seismicity but where a large M6+ earthquake is expected in the coming decades. With over 30 stations at a spacing of ~25 km, the SED operates one of the densest broadband networks in the world, complemented by ~50 real-time strong-motion stations. The strong-motion network is expected to grow by an additional ~80 stations over the next few years. Furthermore, the backbone of the network is complemented by broadband data from surrounding countries and by temporary sub-networks for local monitoring of microseismicity (e.g. at geothermal sites). The variety of seismic monitoring responsibilities, as well as the anticipated densification of our network, demands highly flexible processing software. We are transitioning all software to the SeisComP3 (SC3) framework. SC3 is a fully featured automated real-time earthquake monitoring package developed by GeoForschungsZentrum Potsdam in collaboration with its commercial partner, gempa GmbH. Its core is open source, and it is becoming a community-standard software for earthquake detection and waveform processing in regional and global networks across the globe. SC3 was originally developed for regional and global rapid monitoring of potentially tsunamigenic earthquakes. To fulfill the requirements of a local network recording moderate seismicity, SED has tuned its configuration and added several modules. In this contribution, we present our SC3 implementation strategy, focusing on the detection and identification of seismicity on different scales. We operate several parallel processing "pipelines" to detect and locate local, regional and global seismicity. Additional pipelines with lower detection thresholds can be defined to monitor seismicity within dense subnets of the network. To be consistent with existing processing

  4. NGA West 2 | Pacific Earthquake Engineering Research Center

    Science.gov Websites

    , multi-year research program to improve Next Generation Attenuation models for active tectonic regions earthquake engineering, including modeling of directivity and directionality; verification of NGA-West models epistemic uncertainty; and evaluation of soil amplification factors in NGA models versus NEHRP site factors

  5. SCIGN; new Southern California GPS network advances the study of earthquakes

    USGS Publications Warehouse

    Hudnut, Ken; King, Nancy

    2001-01-01

    Southern California is a giant jigsaw puzzle, and scientists are now using GPS satellites to track the pieces. These puzzle pieces are continuously moving, slowly straining the faults in between. That strain is eventually released in earthquakes. The innovative Southern California Integrated GPS Network (SCIGN) tracks the motions of these pieces over most of southern California with unprecedented precision. This new network greatly improves the ability to assess seismic hazards and to quickly measure the larger displacements that occur during and immediately after earthquakes.

  6. Convolutional neural network for earthquake detection and location

    PubMed Central

    Perol, Thibaut; Gharbi, Michaël; Denolle, Marine

    2018-01-01

    The recent evolution of induced seismicity in Central United States calls for exhaustive catalogs to improve seismic hazard assessment. Over the last decades, the volume of seismic data has increased exponentially, creating a need for efficient algorithms to reliably detect and locate earthquakes. Today’s most elaborate methods scan through the plethora of continuous seismic records, searching for repeating seismic signals. We leverage the recent advances in artificial intelligence and present ConvNetQuake, a highly scalable convolutional neural network for earthquake detection and location from a single waveform. We apply our technique to study the induced seismicity in Oklahoma, USA. We detect more than 17 times more earthquakes than previously cataloged by the Oklahoma Geological Survey. Our algorithm is orders of magnitude faster than established methods. PMID:29487899
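    A ConvNetQuake-scale detector cannot be reproduced here, but its core idea, a strided 1-D convolution over a single waveform feeding a classifier whose classes are "noise" plus discrete location cells, can be caricatured in pure Python. Every weight below is an illustrative toy value, not a trained parameter, and the architecture is deliberately reduced to one layer.

```python
def conv1d(x, kernel, stride=2):
    """Strided 1-D convolution with valid padding, the building block of a
    ConvNetQuake-style detector."""
    k = len(kernel)
    return [sum(kernel[j] * x[i + j] for j in range(k))
            for i in range(0, len(x) - k + 1, stride)]

def relu(v):
    return [max(0.0, u) for u in v]

def detect(waveform, kernel, weights, bias):
    """Toy forward pass: conv -> ReLU -> global average pooling -> linear
    scores over classes (class 0 = noise, classes 1..n = location cells).
    Returns the index of the winning class."""
    feat = relu(conv1d(waveform, kernel))
    pooled = sum(feat) / len(feat)          # global average pooling
    scores = [w * pooled + b for w, b in zip(weights, bias)]
    return scores.index(max(scores))
```

    Treating location as classification over discretized cells is what lets a single-station network output both a detection and a coarse epicentral region in one pass.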

  7. Simulating and analyzing engineering parameters of Kyushu Earthquake, Japan, 1997, by empirical Green function method

    NASA Astrophysics Data System (ADS)

    Li, Zongchao; Chen, Xueliang; Gao, Mengtan; Jiang, Han; Li, Tiefei

    2017-03-01

    Earthquake engineering parameters are very important in the engineering field, especially for anti-seismic design and earthquake disaster prevention. In this study, we simulate earthquake engineering parameters with the empirical Green's function method for the MJMA 6.5 earthquake that occurred in Kyushu, Japan, in 1997. Horizontal ground motion is separated into fault-parallel and fault-normal components in order to assess the characteristics of these two directions. The broadband frequency range of the ground motion simulation is 0.1 to 20 Hz. By comparing observed and synthetic parameters, we analyzed the distribution characteristics of the earthquake engineering parameters; the simulated waveforms show high similarity to the observed waveforms. We found the following. (1) Near-field PGA attenuates rapidly in all directions, with strip-like radiation patterns in the fault-parallel component and circular patterns in the fault-normal component; PGV shows good agreement between observed and synthetic records but different distribution characteristics in the two components. (2) Rupture direction and terrain have a large influence on the 90% significant duration. (3) Arias intensity attenuates with increasing epicentral distance, with observed values closely matching synthetic ones. (4) The predominant period differs markedly across parts of Kyushu in the fault-normal component and is strongly affected by site conditions. (5) Most parameters provide good reference values where the hypocentral distance is less than 35 km. (6) The goodness-of-fit (GOF) values of all these parameters are generally higher than 45, a good result according to Olsen's classification criterion, although not all parameters fit well. Given these synthetic ground motion parameters, seismic hazard analysis and earthquake disaster analysis can be conducted in future urban planning.
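    Two of the parameters above, Arias intensity and the significant duration derived from it, have standard definitions and can be computed directly from an accelerogram. A minimal sketch follows (rectangle-rule integration; the sampling interval and record are hypothetical, and the paper's 90% duration uses different percentile bounds than the common 5-95% shown here).

```python
import math

def arias_intensity(accel, dt, g=9.81):
    """Running Arias intensity Ia(t) = (pi / 2g) * integral of a(t)^2 dt,
    via a rectangle-rule cumulative sum; accel in m/s^2, dt in s."""
    running, total = [], 0.0
    for a in accel:
        total += a * a * dt
        running.append(math.pi / (2.0 * g) * total)
    return running

def significant_duration(accel, dt, lo=0.05, hi=0.95):
    """Significant duration: time between reaching the `lo` and `hi`
    fractions of the final Arias intensity (5-95% by default)."""
    ia = arias_intensity(accel, dt)
    total = ia[-1]
    t_lo = next(i for i, v in enumerate(ia) if v >= lo * total) * dt
    t_hi = next(i for i, v in enumerate(ia) if v >= hi * total) * dt
    return t_hi - t_lo
```

    Comparing these quantities between observed and synthetic accelerograms is exactly the kind of parameter-level check item (2) and (3) above describe.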

  8. The MeSO-net (Metropolitan Seismic Observation network) confronts the Pacific Coast of Tohoku Earthquake, Japan (Mw 9.0)

    NASA Astrophysics Data System (ADS)

    Kasahara, K.; Nakagawa, S.; Sakai, S.; Nanjo, K.; Panayotopoulos, Y.; Morita, Y.; Tsuruoka, H.; Kurashimo, E.; Obara, K.; Hirata, N.; Aketagawa, T.; Kimura, H.

    2011-12-01

    In April 2007, we launched a special project for earthquake disaster mitigation in the Tokyo metropolitan area (fiscal 2007-2011). As part of this project, construction of the MeSO-net (Metropolitan Seismic Observation network) has been completed, with about 300 stations deployed mainly at elementary and junior high schools at a spacing of about 5 km, resulting in a highly dense network that covers the metropolitan area. To achieve stable seismic observation with lower ground noise than surface measurements, the sensors of all stations were installed in boreholes at a depth of about 20 m. The sensors have a wide dynamic range (135 dB) and a wide frequency band (DC to 80 Hz). Data are digitized at 200 Hz sampling and telemetered to the Earthquake Research Institute, University of Tokyo. The MeSO-net, which can detect and locate most earthquakes with magnitudes above 2.5, provides a unique baseline for scientific and engineering research on the Tokyo metropolitan area. One of its main contributions is a greatly improved image of the Philippine Sea plate (PSP) (Nakagawa et al., 2010) and an accurate estimation of the plate boundary between the PSP and the Pacific plate, allowing a clearer understanding of the relation between PSP deformation and M7+ intra-slab earthquake generation. The latest version of the plate model in the metropolitan area, proposed by our project, has also attracted various researchers, who compare it with highly accurate solutions of fault mechanisms, repeating earthquakes, etc. Moreover, long-period ground motions generated by the 2011 off the Pacific coast of Tohoku earthquake (Mw 9.0) were observed by the MeSO-net and analyzed to obtain array back-projection imaging of this event (Honda et al., 2011). As a result, the overall pattern of the imaged asperities coincides well with the slip distribution determined based on other waveform inversion

  9. Building Capacity for Earthquake Monitoring: Linking Regional Networks with the Global Community

    NASA Astrophysics Data System (ADS)

    Willemann, R. J.; Lerner-Lam, A.

    2006-12-01

    Installing or upgrading a seismic monitoring network is often among the mitigation efforts after earthquake disasters, as is happening in response to the events in Sumatra in December 2004 and in Pakistan in October 2005. These networks can yield improved hazard assessment, more resilient buildings where they are most needed, and emergency relief directed more quickly to the worst-hit areas after the next large earthquake. Several commercial organizations are well prepared for the fleeting opportunity to provide the instruments that comprise a seismic network, including sensors, data loggers, telemetry stations, and the computers and software required for the network center. But seismic monitoring requires more than hardware and software, no matter how advanced. A well-trained staff is required to select appropriate and mutually compatible components, install and maintain telemetered stations, manage and archive data, and perform the analyses that actually yield the intended benefits. Monitoring is more effective when network operators cooperate with a larger community through free and open exchange of data, sharing of information about working practices, and international collaboration in research. As an academic consortium, a facility operator, and a founding member of the International Federation of Digital Seismographic Networks, IRIS has access to a broad range of expertise with the skills required to help design, install, and operate a seismic network and earthquake analysis center, and to provide the core training for the professional teams required to establish and maintain these facilities.
But delivering expertise quickly when and where it is unexpectedly in demand requires advance planning and coordination in order to respond to the needs of organizations that are building a seismic network, either with tight time constraints imposed by the budget cycles of aid agencies following a disastrous earthquake, or as part of more informed

  10. PEER - National Information Service for Earthquake Engineering - NISEE

    Science.gov Websites

    Information Service for Earthquake Engineering - NISEE. The NISEE/PEER Library is an affiliated library of the Field Station, five miles from the main Berkeley campus. Address: NISEE/PEER Library, University of Regents. Hours: Monday - Friday, 9:00 am - 1:00 pm. Open to the public. NISEE/PEER Library home page.

  11. Real-Time Earthquake Monitoring with Spatio-Temporal Fields

    NASA Astrophysics Data System (ADS)

    Whittier, J. C.; Nittel, S.; Subasinghe, I.

    2017-10-01

    With live streaming sensors and sensor networks, increasingly large numbers of individual sensors are deployed in physical space. Sensor data streams are a fundamentally novel mechanism to deliver observations to information systems. They enable us to represent spatio-temporal continuous phenomena such as radiation accidents, toxic plumes, or earthquakes almost as instantaneously as they happen in the real world. Sensor data streams discretely sample an earthquake, while the earthquake is continuous over space and time. Programmers attempting to analyze earthquake activity and scope must write tedious application code to integrate potentially very large sets of asynchronously sampled, concurrent streams. In previous work, we proposed the field stream data model (Liang et al., 2016) for data stream engines. Abstracting the stream of an individual sensor as a temporal field, the field represents the Earth's movement at the sensor position as continuous. This simplifies analysis across many sensors significantly. In this paper, we undertake a feasibility study of using the field stream model and the open source Data Stream Engine (DSE) Apache Spark (Apache Spark, 2017) to implement real-time earthquake event detection with a subset of the 250 GPS sensor data streams of the Southern California Integrated GPS Network (SCIGN). The field-based real-time stream queries compute maximum displacement values over the latest query window of each stream and relate spatially neighboring streams to identify earthquake events and their extent. Further, we correlated the detected events with a USGS earthquake event feed. The query results are visualized in real time.
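The windowed query described above — maximum displacement over the latest window of each stream, related across spatially neighboring stations — can be sketched without a stream engine. The station positions, window length, thresholds, and neighbor radius below are hypothetical; this is not the SCIGN/Spark implementation itself:

```python
import numpy as np

def window_max_displacement(streams, window):
    """For each sensor stream (dict id -> 1-D displacement array),
    return the max |displacement| over the latest `window` samples."""
    return {sid: float(np.max(np.abs(x[-window:]))) for sid, x in streams.items()}

def detect_event(streams, positions, window, threshold,
                 min_neighbors=2, radius=50.0):
    """Flag a station as part of an event when it and at least
    `min_neighbors` stations within `radius` km all exceed `threshold`
    in the current window (illustrative spatial-correlation rule)."""
    peaks = window_max_displacement(streams, window)
    hot = [sid for sid, p in peaks.items() if p > threshold]
    events = []
    for sid in hot:
        x0, y0 = positions[sid]
        near = [s for s in hot if s != sid
                and np.hypot(positions[s][0] - x0, positions[s][1] - y0) <= radius]
        if len(near) >= min_neighbors:
            events.append(sid)
    return events
```

In a stream engine, the per-stream maximum would be a windowed aggregate and the neighbor check a spatial join, but the logic is the same.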

  12. The Los Alamos Seismic Network (LASN): Improved Network Instrumentation, Local Earthquake Catalog Updates, and Peculiar Types of Data

    NASA Astrophysics Data System (ADS)

    Roberts, P. M.; Ten Cate, J. A.; House, L. S.; Greene, M. K.; Morton, E.; Kelley, R. E.

    2013-12-01

    The Los Alamos Seismic Network (LASN) has operated for 41 years and provided the data to locate more than 2,500 earthquakes in north-central New Mexico. The network was installed for seismic verification research, as well as to monitor and locate earthquakes near Los Alamos National Laboratory (LANL). LASN stations are the only monitoring stations in New Mexico north of Albuquerque. The original network once included 22 stations in northern New Mexico. With limited funding in the early 1980s, the network was downsized to 7 stations within an area of about 15 km (N-S) by 15 km (E-W), centered on Los Alamos. Over the last four years, eight additional stations have been installed, which have considerably expanded the spatial coverage of the network. Currently, 7 stations have broadband, three-component seismometers with digital telemetry, and the remaining 8 have traditional 1 Hz short-period seismometers with either analog telemetry or on-site digital recording. A vertical array of accelerometers was also installed in a wellbore on LANL property; this borehole array has 3-component digital strong-motion sensors. Recently we also began upgrading the local strong-motion accelerometer (SMA) network, with the addition of high-resolution digitizers and high-sensitivity force-balance accelerometers (FBA). We will present an updated description of the current LASN station, instrumentation, and telemetry configurations, as well as the data acquisition and event-detection software used to record events in Earthworm. Although more than 2,000 earthquakes were detected and located in north-central New Mexico during the first 11 years of LASN's operation (1973 to 1984), currently only 1-2 earthquakes per month are detected and located within about 150 km of Los Alamos. Over 850 of these nearby earthquakes have been located from 1973 to the present. We recently updated the LASN earthquake catalog for north-central New Mexico through 2012 and most of 2013. Locations

  13. Earthquake Risk Reduction to Istanbul Natural Gas Distribution Network

    NASA Astrophysics Data System (ADS)

    Zulfikar, Can; Kariptas, Cagatay; Biyikoglu, Hikmet; Ozarpa, Cevat

    2017-04-01

    Istanbul Natural Gas Distribution Corporation (IGDAS) is one of the end users of the Istanbul Earthquake Early Warning (EEW) signal. IGDAS, the primary natural gas provider in Istanbul, operates an extensive system of 9,867 km of gas lines with 750 district regulators and 474,000 service boxes. Natural gas arrives at the Istanbul city borders at 70 bar in a 30-inch-diameter steel pipeline. The gas pressure is reduced to 20 bar at RMS stations and distributed to district regulators inside the city. 110 of the 750 district regulators are instrumented with strong-motion accelerometers in order to cut gas flow during an earthquake if ground motion parameters exceed certain threshold levels. In addition, state-of-the-art protection systems automatically cut natural gas flow when breaks in the gas pipelines are detected. IGDAS uses a sophisticated SCADA (supervisory control and data acquisition) system to monitor the state of health of its pipeline network. This system provides real-time information on quantities related to pipeline monitoring, including input-output pressure, drawing information, positions of station and RTU (remote terminal unit) gates, and slam-shut mechanism status at the 750 district regulator sites. The IGDAS real-time earthquake risk reduction algorithm follows four stages: 1) Real-time ground motion data are transmitted from 110 IGDAS and 110 KOERI (Kandilli Observatory and Earthquake Research Institute) acceleration stations to the IGDAS SCADA center and the KOERI data center. 2) During an earthquake, EEW information is sent from the IGDAS SCADA center to the IGDAS stations. 3) Automatic shut-off is applied at IGDAS district regulators, and calculated parameters are sent from the stations to the IGDAS SCADA center and KOERI. 4) Integrated building and gas pipeline damage maps are prepared immediately after the earthquake. Today's technology allows rapid estimation of the
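Stages 2-3 of such an algorithm — a threshold check at each district regulator, automatic shut-off, and a report sent back to the SCADA center — can be sketched as follows. The 0.2 g threshold and the data layout are illustrative assumptions, not IGDAS's actual settings:

```python
def process_event(station_pga, threshold_g=0.2):
    """Given PGA readings (in g) per district regulator, decide which
    regulators shut off automatically and build the report returned to
    the SCADA center (sketch; threshold is hypothetical)."""
    shut_off = {sid for sid, pga in station_pga.items() if pga >= threshold_g}
    report = {sid: {"pga_g": pga, "gas_cut": sid in shut_off}
              for sid, pga in station_pga.items()}
    return shut_off, report
```

Stage 4 would then join this report with building inventory data to produce the integrated damage maps.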

  14. The Los Alamos Seismic Network (LASN): Recent Network Upgrades and Northern New Mexico Earthquake Catalog Updates

    NASA Astrophysics Data System (ADS)

    Roberts, P. M.; House, L. S.; Greene, M.; Ten Cate, J. A.; Schultz-Fellenz, E. S.; Kelley, R.

    2012-12-01

    From the first data recorded in the fall of 1973 to now, the Los Alamos Seismograph Network (LASN) has operated for nearly 40 years. LASN data have been used to locate more than 2,500 earthquakes in north-central New Mexico. The network was installed for seismic verification research, as well as to monitor and locate earthquakes near Los Alamos National Laboratory (LANL). LASN stations are the only earthquake monitoring stations in New Mexico north of Albuquerque. In the late 1970s, LASN included 22 stations spread over a geographic area of 150 km (N-S) by 350 km (E-W) in northern New Mexico. In the early 1980s, the available funding limited operation to a set of 7 stations within an area of about 15 km (N-S) by 15 km (E-W), centered on Los Alamos. Over the last 3 years, 6 additional stations have been installed, which have considerably expanded the spatial coverage of the network. These new stations take advantage of state-of-the-art broadband sensors as well as digital recording and telemetry technology. Currently, 7 stations have broadband, three-component seismometers with digital telemetry, and the remaining 6 have traditional 1 Hz short-period seismometers with analog telemetry. In addition, a vertical array of accelerometers was installed in a wellbore on LANL property; this borehole station has 3-component digital strong-motion sensors. Four forensic strong-motion accelerometers (SMA) are also operated at LANL facilities. With 3 of the new broadband stations in and around the nearby Valles Caldera, LASN is now able to monitor any very small volcano-seismic events that may be associated with the caldera. We will present a complete description of the current LASN station, instrumentation, and telemetry configurations, as well as the data acquisition and event-detection software used to record events in Earthworm. More than 2,000 earthquakes were detected and located in north-central New Mexico during the first 11

  15. Advancing Integrated STEM Learning through Engineering Design: Sixth-Grade Students' Design and Construction of Earthquake Resistant Buildings

    ERIC Educational Resources Information Center

    English, Lyn D.; King, Donna; Smeed, Joanna

    2017-01-01

    As part of a 3-year longitudinal study, 136 sixth-grade students completed an engineering-based problem on earthquakes involving integrated STEM learning. Students employed engineering design processes and STEM disciplinary knowledge to plan, sketch, then construct a building designed to withstand earthquake damage, taking into account a number of…

  16. Testing the structure of earthquake networks from multivariate time series of successive main shocks in Greece

    NASA Astrophysics Data System (ADS)

    Chorozoglou, D.; Kugiumtzis, D.; Papadimitriou, E.

    2018-06-01

    Seismic hazard assessment in the area of Greece is attempted by studying the earthquake network structure, e.g., whether it is small-world or random. In this network, a node represents a seismic zone in the study area, and a connection between two nodes is given by the correlation of the seismic activity of the two zones. To investigate the network structure, and particularly the small-world property, the earthquake correlation network is compared with randomized ones. Simulations on multivariate time series of different lengths and numbers of variables show that, for the construction of randomized networks, randomizing the time series performs better than methods that directly randomize the original network connections. Based on the appropriate randomization method, the network approach is applied to time series of earthquakes that occurred between main shocks in the territory of Greece spanning the period 1999-2015. The characterization of networks on sliding time windows revealed that small-world structure emerges in the last time interval, shortly before the main shock.
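The comparison underlying such a test — build a correlation network from zone activity time series, then rebuild it from surrogates obtained by randomizing the series — can be sketched as below. The 0.5 correlation threshold and the use of the clustering coefficient as the test statistic are illustrative choices, not necessarily those of the paper:

```python
import numpy as np

def correlation_network(series, threshold=0.5):
    """Adjacency matrix: connect two zones when the absolute Pearson
    correlation of their activity time series exceeds `threshold`."""
    c = np.corrcoef(series)
    a = (np.abs(c) > threshold).astype(int)
    np.fill_diagonal(a, 0)
    return a

def clustering_coefficient(a):
    """Mean local clustering coefficient of an undirected network."""
    coeffs = []
    for i in range(a.shape[0]):
        nbrs = np.flatnonzero(a[i])
        k = len(nbrs)
        if k < 2:
            coeffs.append(0.0)
            continue
        links = a[np.ix_(nbrs, nbrs)].sum() / 2  # edges among neighbors
        coeffs.append(2.0 * links / (k * (k - 1)))
    return float(np.mean(coeffs))

def randomized_clustering(series, threshold=0.5, n_surrogates=20, seed=0):
    """Surrogate baseline: independently shuffle each zone's time series,
    rebuild the network, and average the clustering coefficient."""
    rng = np.random.default_rng(seed)
    vals = []
    for _ in range(n_surrogates):
        shuffled = np.array([rng.permutation(row) for row in series])
        vals.append(clustering_coefficient(correlation_network(shuffled, threshold)))
    return float(np.mean(vals))
```

A clustering coefficient well above the surrogate baseline (together with short path lengths) is the usual signature of small-world structure.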

  17. Dynamic fracture network around faults: implications for earthquake ruptures, ground motion and energy budget

    NASA Astrophysics Data System (ADS)

    Okubo, K.; Bhat, H. S.; Rougier, E.; Lei, Z.; Knight, E. E.; Klinger, Y.

    2017-12-01

    Numerous studies have suggested that spontaneous earthquake ruptures can dynamically induce failure in secondary fracture networks, regarded as damage zones around faults. The feedback of such fracture networks plays a crucial role in earthquake rupture, its radiated wave field, and the total energy budget. A novel numerical modeling tool based on the combined finite-discrete element method (FDEM), which accounts for the main rupture propagation and the nucleation/propagation of secondary cracks, was used to quantify the evolution of the fracture network and evaluate its effects on the main rupture and its associated radiation. The simulations were performed with the FDEM-based software tool Hybrid Optimization Software Suite (HOSSedu), developed by Los Alamos National Laboratory. We first modeled an earthquake rupture on a planar strike-slip fault surrounded by a brittle medium where secondary cracks can be nucleated/activated by the earthquake rupture. We show that secondary cracks are generated dynamically, dominantly on the extensional side of the fault and mainly behind the rupture front, forming an intricate network of fractures in the damage zone. The rupture velocity thereby decreases significantly, by 10 to 20 percent, while the supershear transition length increases in comparison with a purely elastic medium. It is also observed that the high-frequency component (10 to 100 Hz) of the near-field ground acceleration is enhanced by the dynamically activated fracture network, consistent with field observations. We then conducted an in-depth case study with various sets of initial stress states and friction properties to investigate the evolution of the damage zone. We show that the width of the damage zone decreases with depth, forming a "flower-like" structure, when the characteristic slip distance in the linear slip-weakening law, or the fracture energy on the fault, is kept constant with depth. Finally, we compared the fracture energy on the fault to the energy

  18. Relationship between isoseismal area and magnitude of historical earthquakes in Greece by a hybrid fuzzy neural network method

    NASA Astrophysics Data System (ADS)

    Tselentis, G.-A.; Sokos, E.

    2012-01-01

    In this paper we suggest the use of diffusion neural networks (neural networks with intrinsic fuzzy logic abilities) to assess the relationship between isoseismal area and earthquake magnitude for the region of Greece. It is of particular importance to study historical earthquakes, for which we often have macroseismic information in the form of isoseisms, but the data are statistically incomplete for assessing magnitudes from an isoseismal area or for training conventional artificial neural networks for magnitude estimation. Fuzzy relationships are developed and used to train a feed-forward neural network with a back-propagation algorithm to obtain the final relationships. Seismic intensity data from 24 earthquakes in Greece have been used. Special attention is paid to the incompleteness and contradictory patterns in scanty historical earthquake records. The results show that the proposed processing model is very effective, better than classical artificial neural networks, since the magnitude-macroseismic intensity target function has a strong nonlinearity and in most cases the macroseismic datasets are very small.

  19. Introduction: seismology and earthquake engineering in Central and South America.

    USGS Publications Warehouse

    Espinosa, A.F.

    1983-01-01

    Reports the state-of-the-art in seismology and earthquake engineering that is being advanced in Central and South America. Provides basic information on seismological station locations in Latin America and some of the programmes in strong-motion seismology, as well as some of the organizations involved in these activities.-from Author

  20. Data quality of seismic records from the Tohoku, Japan earthquake as recorded across the Albuquerque Seismological Laboratory networks

    USGS Publications Warehouse

    Ringler, A.T.; Gee, L.S.; Marshall, B.; Hutt, C.R.; Storm, T.

    2012-01-01

    Great earthquakes recorded across modern digital seismographic networks, such as the recent Tohoku, Japan, earthquake on 11 March 2011 (Mw = 9.0), provide unique datasets that ultimately lead to a better understanding of the Earth's structure (e.g., Pesicek et al. 2008) and earthquake sources (e.g., Ammon et al. 2011). For network operators, such events provide the opportunity to look at the performance across their entire network using a single event, as the ground motion records from the event will be well above every station's noise floor.

  1. ConvNetQuake: Convolutional Neural Network for Earthquake Detection and Location

    NASA Astrophysics Data System (ADS)

    Denolle, M.; Perol, T.; Gharbi, M.

    2017-12-01

    Over the last decades, the volume of seismic data has increased exponentially, creating a need for efficient algorithms to reliably detect and locate earthquakes. Today's most elaborate methods scan through the plethora of continuous seismic records, searching for repeating seismic signals. In this work, we leverage recent advances in artificial intelligence and present ConvNetQuake, a highly scalable convolutional neural network for probabilistic earthquake detection and location from single stations. We apply our technique to study two years of induced seismicity in Oklahoma (USA). We detect 20 times more earthquakes than previously cataloged by the Oklahoma Geological Survey, and our algorithm is at least one order of magnitude faster than other established methods.
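A single-station detector of this kind is, at its core, a stack of strided 1-D convolutions ending in a softmax over "noise" versus event-location classes. A minimal NumPy forward-pass sketch with random weights; the layer sizes are illustrative and this is not the actual ConvNetQuake architecture:

```python
import numpy as np

def conv1d(x, kernels, stride=2):
    """Valid strided 1-D convolution with ReLU.
    x: (in_channels, time); kernels: (out_channels, in_channels, width)."""
    out_ch, in_ch, w = kernels.shape
    steps = (x.shape[1] - w) // stride + 1
    y = np.zeros((out_ch, steps))
    for o in range(out_ch):
        for s in range(steps):
            y[o, s] = np.sum(kernels[o] * x[:, s * stride:s * stride + w])
    return np.maximum(y, 0.0)

def classify(waveform, conv_layers, weights, bias):
    """Tiny forward pass: stacked strided convolutions, global average
    pooling, then a linear layer with softmax over the classes."""
    x = waveform
    for kernels in conv_layers:
        x = conv1d(x, kernels)
    feats = x.mean(axis=1)                 # global average pooling
    logits = weights @ feats + bias
    e = np.exp(logits - logits.max())      # numerically stable softmax
    return e / e.sum()
```

In a trained network, the class with the highest probability would indicate either noise or the geographic cell in which the event is located.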

  2. Performance-based seismic design of nonstructural building components: The next frontier of earthquake engineering

    NASA Astrophysics Data System (ADS)

    Filiatrault, Andre; Sullivan, Timothy

    2014-08-01

    With the development and implementation of performance-based earthquake engineering, harmonization of performance levels between structural and nonstructural components becomes vital. Even if the structural components of a building achieve a continuous or immediate-occupancy performance level after a seismic event, failure of architectural, mechanical, or electrical components can lower the performance level of the entire building system. This reduction in performance caused by the vulnerability of nonstructural components has been observed during recent earthquakes worldwide. Moreover, nonstructural damage has limited the functionality of critical facilities, such as hospitals, following major seismic events. The investment in nonstructural components and building contents is far greater than that in structural components and framing. Therefore, it is not surprising that in many past earthquakes, losses from damage to nonstructural components have exceeded losses from structural damage. Furthermore, the failure of nonstructural components can become a safety hazard or can hamper the safe movement of occupants evacuating buildings, or of rescue workers entering buildings. In comparison with structural components and systems, there is relatively limited information on the seismic design of nonstructural components. Basic research in this area has been sparse, and the available codes and guidelines are, for the most part, based on past experience, engineering judgment, and intuition rather than on objective experimental and analytical results. Often, design engineers are forced to start almost from square one after each earthquake: to observe what went wrong and to try to prevent repetitions. This is a consequence of the empirical nature of current seismic regulations and guidelines for nonstructural components. This review paper summarizes current knowledge on the seismic design and analysis of nonstructural building components, identifying major

  3. Emergency seismic and CGPS networks: a first employment for the L'Aquila Mw 6.3 earthquake

    NASA Astrophysics Data System (ADS)

    Abruzzese, L.; Avallone, A.; Cecere, G.; Cattaneo, M.; Cardinale, V.; Castagnozzi, A.; Cogliano, R.; Criscuoli, F.; D'Agostino, N.; D'Ambrosio, C.; de Luca, G.; D'Anastasio, E.; Falco, L.; Flammia, V.; Migliari, F.; Minichiello, F.; Memmolo, A.; Monachesi, G.; Moschillo, R.; Pignone, M.; Pucillo, S.; Selvaggi, G.; Zarrilli, L.; Delladio, A.; Govoni, A.; Franceschi, D.; de Martin, M.; Moretti, M.

    2009-12-01

    During the last 2 years, the Istituto Nazionale di Geofisica e Vulcanologia (INGV) developed an important real-time temporary seismic network infrastructure in order to densify the Italian National Seismic Network in epicentral areas, thus enhancing the localization of micro-seismicity after main earthquake events. This real-time temporary seismic network consists of mobile and autonomous seismic stations that, in groups of three, are telemetered to a Very Small Aperture Terminal (VSAT). The system uses dedicated bandwidth on UHF, Wi-Fi, and satellite frequencies that allows data to flow in real time to the INGV center in Rome (with Grottaminarda as a backup center). The deployment of the seismic network is managed in a geographical information system (GIS) through scenarios that visualize, for the epicentral area, information about instrumental seismicity, seismic risk, macroseismic felt reports, and territorial data. Starting from a digital terrain model, surface spatial analysis (Viewshed, Observer Point) allows the geographic arrangement of the stations and the related scenarios. The April 6, 2009, Mw 6.3 L'Aquila destructive earthquake was the first real case for testing the entire emergency seismic network infrastructure. Less than 6 hours after the earthquake occurred, a first accelerometer station was already sending data to the INGV seismic monitoring headquarters. A total of 9 seismic stations were installed within 3 days after the earthquake. Furthermore, 5 permanent GPS stations were installed in the epicentral area within 1 to 9 days after the main shock to detect the post-seismic deformation induced by the earthquake. We will show and describe the details of the emergency seismic network infrastructure and the first results from the collected data.

  4. Hiding earthquakes from scrupulous monitoring eyes of dense local seismic networks

    NASA Astrophysics Data System (ADS)

    Bogiatzis, P.; Ishii, M.; Kiser, E.

    2012-12-01

    Accurate and complete cataloguing of aftershocks is essential for a variety of purposes, including the estimation of the mainshock rupture area, the identification of seismic gaps, and seismic hazard assessment. However, immediately following large earthquakes, the seismograms recorded by local networks are noisy, with energy arriving from hundreds of aftershocks and with different seismic phases interfering with one another. This degrades the performance of earthquake detection and location by conventional methods such as the S-P approach, as demonstrated by back-projection analysis of teleseismic data showing that a significant number of events went undetected by the Japan Meteorological Agency within the first twenty-four hours after the Mw 9.0 Tohoku-oki, Japan, earthquake. The spatial distribution of the hidden events is not arbitrary. Most of these earthquakes are located close to the trench, while some are located at the outer rise. Furthermore, there is a relatively sharp trench-parallel boundary separating the detected and undetected events. We investigate the cause of these hidden earthquakes using forward modeling. The calculation of ray paths for various source locations and takeoff angles with the "shooting" method suggests that this phenomenon is a consequence of the complexities associated with the subducting slab. The laterally varying velocity structure defocuses the seismic energy from shallow earthquakes located near the trench and makes the observation of P and S arrivals difficult at stations situated on mainland Japan. Full waveform simulations confirm these results. Our forward calculations also show that the probability of detection is sensitive to the depth of the event. Shallower events near the trench are more difficult to detect than deeper earthquakes located inside the subducting plate, for which the shadow-zone effect diminishes. The modeling effort is expanded to include three

  5. An Investigation on the Crustal Deformations in Istanbul after Eastern Marmara Earthquakes in 1999

    NASA Astrophysics Data System (ADS)

    Ozludemir, M.; Ozyasar, M.

    2008-12-01

    Since the introduction of the GPS technique in the mid-1970s, there have been great advances in positioning activities. Today such Global Navigation Satellite System (GNSS) based positioning techniques are widely used in daily geodetic applications. High-order geodetic network measurements are one such application. Such networks are established to provide reliable infrastructure for all kinds of geodetic work, from the production of cadastral plans to surveying during the construction of engineering structures. The positional information obtained in such engineering surveys can be useful for other studies as well. One such field is geodynamics, where positional information is valuable for understanding the characteristics of tectonic movements. In Turkey, which is located in tectonically active zones and experiences major earthquakes quite frequently, the positional information obtained in engineering surveys can be very useful for earthquake-related studies. In this paper an example of such engineering surveys is discussed: the Istanbul GPS (Global Positioning System) Network, first established in 1997 and remeasured in 2005. Between these two measurement campaigns, two major earthquakes took place, on August 17 and November 12, 1999, with magnitudes of 7.4 and 7.2, respectively. In the first campaign in 1997, a network of about 700 points was measured, while in the second campaign in 2005 more than 1800 points were positioned; the two campaigns share common points. The network covers the whole Istanbul area of about 6000 km2. All network points are located on the Eurasian plate, to the north of the North Anatolian Fault Zone. In this study, the horizontal and vertical movements are presented and compared with the results obtained in geodynamic studies.

  6. Potential utilization of the NASA/George C. Marshall Space Flight Center in earthquake engineering research

    NASA Technical Reports Server (NTRS)

    Scholl, R. E. (Editor)

    1979-01-01

    Earthquake engineering research capabilities of the National Aeronautics and Space Administration (NASA) facilities at George C. Marshall Space Flight Center (MSFC), Alabama, were evaluated. The results indicate that the NASA/MSFC facilities and supporting capabilities offer unique opportunities for conducting earthquake engineering research. Specific features that are particularly attractive for large scale static and dynamic testing of natural and man-made structures include the following: large physical dimensions of buildings and test bays; high loading capacity; wide range and large number of test equipment and instrumentation devices; multichannel data acquisition and processing systems; technical expertise for conducting large-scale static and dynamic testing; sophisticated techniques for systems dynamics analysis, simulation, and control; and capability for managing large-size and technologically complex programs. Potential uses of the facilities for near and long term test programs to supplement current earthquake research activities are suggested.

  7. Earthquake Risk Mitigation in the Tokyo Metropolitan area

    NASA Astrophysics Data System (ADS)

    Hirata, N.; Sakai, S.; Kasahara, K.; Nakagawa, S.; Nanjo, K.; Panayotopoulos, Y.; Tsuruoka, H.

    2010-12-01

    Seismic disaster risk mitigation in urban areas is a challenge that requires collaboration among scientific, engineering, and social-science fields. Examples of collaborative efforts include research on detailed plate structure with identification of all significant faults, developing dense seismic networks; strong ground motion prediction, which uses information on near-surface seismic site effects and fault models; earthquake-resistant and earthquake-proof structures; and cross-discipline infrastructure for effective risk mitigation immediately after catastrophic events. The risk mitigation strategy for the next great earthquake caused by the Philippine Sea plate (PSP) subducting beneath the Tokyo metropolitan area is of major concern, because this subduction caused past mega-thrust earthquakes, such as the 1703 Genroku earthquake (magnitude M8.0) and the 1923 Kanto earthquake (M7.9), which caused 105,000 fatalities. A M7 or greater (M7+) earthquake in this area at present has high potential to produce devastating loss of life and property, with even greater global economic repercussions. The Central Disaster Management Council of Japan estimates that such an M7+ earthquake would cause 11,000 fatalities and 112 trillion yen (about 1 trillion US$) in economic loss. This earthquake is estimated to occur with a probability of 70% in 30 years by the Earthquake Research Committee of Japan. In order to mitigate the disaster for greater Tokyo, the Special Project for Earthquake Disaster Mitigation in the Tokyo Metropolitan Area (2007-2011) was launched in collaboration with scientists, engineers, and social scientists at nationwide institutions. The results obtained in the respective fields will be integrated before project termination to improve information on strategy assessment for seismic risk mitigation in the Tokyo metropolitan area. In this talk, we give an outline of our project as an example of collaborative research on earthquake risk mitigation.
Discussion is extended to our effort in progress and

  8. Earthquake location determination using data from DOMERAPI and BMKG seismic networks: A preliminary result of DOMERAPI project

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramdhan, Mohamad; Agency for Meteorology, Climatology and Geophysics of Indonesia; Nugraha, Andri Dian

The DOMERAPI project has been conducted to comprehensively study the internal structure of Merapi volcano, especially the deep structural features beneath the volcano. The DOMERAPI earthquake monitoring network consists of 46 broad-band seismometers installed around the Merapi volcano. Earthquake hypocenter determination is a very important step for further studies, such as hypocenter relocation and seismic tomographic imaging. Ray paths from earthquake events occurring outside the Merapi region can be utilized to delineate the deep magma structure. Earthquakes occurring outside the DOMERAPI seismic network will produce an azimuthal gap greater than 180°. Owing to this situation, the stations of the BMKG seismic network can be used jointly to minimize the azimuthal gap. We identified earthquake events manually and carefully, and then picked arrival times of P and S waves. The data from the DOMERAPI seismic network were combined with the BMKG data catalogue to determine earthquake events outside the Merapi region. For future work, we will also use the BPPTKG (Center for Research and Development of Geological Disaster Technology) data catalogue in order to study shallow structures beneath the Merapi volcano. The application of all data catalogues will provide good information as input for further advanced studies and volcano hazards mitigation.

  9. A Real-Time Earthquake Precursor Detection Technique Using TEC from a GPS Network

    NASA Astrophysics Data System (ADS)

    Alp Akyol, Ali; Arikan, Feza; Arikan, Orhan

    2016-07-01

Anomalies have been observed in the ionospheric electron density distribution prior to strong earthquakes. However, most of the reported results are obtained by retrospective analysis after the earthquakes, so their implementation in practice is highly problematic. Recently, a novel earthquake precursor detection technique based on spatio-temporal analysis of Total Electron Content (TEC) data obtained from the Turkish National Permanent GPS Network (TNPGN) was developed by the IONOLAB group (www.ionolab.org). In the present study, the detection technique is implemented in a causal setup over the available data set in a test phase, which enables real-time implementation. The performance of the developed earthquake prediction technique is evaluated using 10-fold cross validation over the data obtained in 2011. Among the 23 earthquakes with magnitudes higher than 5, the developed technique detects precursors of 14 earthquakes while producing 8 false alarms. This study is supported by TUBITAK 115E915 and joint TUBITAK 114E092 and AS CR 14/001 projects.
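The detection-rate and false-alarm bookkeeping used to score such a precursor detector can be sketched as follows. This is a minimal illustration, not the IONOLAB algorithm; the anomaly dates, earthquake dates, and the 7-day precursor window are hypothetical assumptions.

```python
from datetime import date, timedelta

def evaluate_precursors(anomaly_days, earthquake_days, window_days=7):
    """Count detections (an anomaly followed by a quake within `window_days`)
    and false alarms (anomalies with no quake in the window)."""
    detected = set()
    false_alarms = 0
    for a in anomaly_days:
        hits = [q for q in earthquake_days
                if timedelta(0) < (q - a) <= timedelta(days=window_days)]
        if hits:
            detected.update(hits)  # credit each quake at most once
        else:
            false_alarms += 1
    return len(detected), false_alarms

# Hypothetical 2011 dates: days flagged by a TEC detector vs. M>5 events.
anomalies = [date(2011, 5, 10), date(2011, 7, 1), date(2011, 9, 2)]
quakes = [date(2011, 5, 15), date(2011, 10, 23)]
hits, fas = evaluate_precursors(anomalies, quakes)
print(hits, fas)  # → 1 2
```

Scoring in this causal, forward-looking way (rather than searching backward from known earthquakes) is what makes the reported 14-of-23 detection rate meaningful for real-time use.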

  10. The HayWired earthquake scenario—Engineering implications

    USGS Publications Warehouse

    Detweiler, Shane T.; Wein, Anne M.

    2018-04-18

The HayWired Earthquake Scenario—Engineering Implications is the second volume of U.S. Geological Survey (USGS) Scientific Investigations Report 2017–5013, which describes the HayWired scenario, developed by USGS and its partners. The scenario is a hypothetical yet scientifically realistic earthquake sequence that is being used to better understand hazards for the San Francisco Bay region during and after a magnitude-7 earthquake (mainshock) on the Hayward Fault and its aftershocks. Analyses in this volume suggest that (1) 800 deaths and 16,000 nonfatal injuries result from shaking alone, plus property and direct business interruption losses of more than $82 billion from shaking, liquefaction, and landslides; (2) the building code is designed to protect lives, but even if all buildings in the region complied with current building codes, 0.4 percent could collapse, 5 percent could be unsafe to occupy, and 19 percent could have restricted use; (3) people expect, prefer, and would be willing to pay for greater resilience of buildings; (4) more than 22,000 people could require extrication from stalled elevators, and more than 2,400 people could require rescue from collapsed buildings; (5) the average East Bay resident could lose water service for 6 weeks, some for as long as 6 months; (6) older steel-frame high-rise office buildings and new reinforced-concrete residential buildings in downtown San Francisco and Oakland could be unusable for as long as 10 months; (7) about 450 large fires could result in a loss of residential and commercial building floor area equivalent to more than 52,000 single-family homes and cause property (building and content) losses approaching $30 billion; and (8) combining earthquake early warning (ShakeAlert) with “drop, cover, and hold on” actions could prevent as many as 1,500 nonfatal injuries out of 18,000 total estimated nonfatal injuries from shaking and liquefaction hazards.

  11. An Offshore Geophysical Network in the Pacific Northwest for Earthquake and Tsunami Early Warning and Hazard Research

    NASA Astrophysics Data System (ADS)

    Wilcock, W. S. D.; Schmidt, D. A.; Vidale, J. E.; Harrington, M.; Bodin, P.; Cram, G.; Delaney, J. R.; Gonzalez, F. I.; Kelley, D. S.; LeVeque, R. J.; Manalang, D.; McGuire, C.; Roland, E. C.; Tilley, J.; Vogl, C. J.; Stoermer, M.

    2016-12-01

The Cascadia subduction zone hosts catastrophic earthquakes every few hundred years. On land, there are extensive geophysical networks available to monitor the subduction zone, but since the locked portion of the plate boundary lies mostly offshore, these networks are ideally complemented by seafloor observations. Such considerations helped motivate the development of scientific cabled observatories that cross the subduction zone at two sites off Vancouver Island and one off central Oregon, but these have a limited spatial footprint along the strike of the subduction zone. The Pacific Northwest Seismic Network is leading a collaborative effort to implement an earthquake early warning system in Washington and Oregon using data streams from land networks as well as the few existing offshore instruments. For subduction zone earthquakes that initiate offshore, this system will provide a warning. However, the availability of real-time offshore instrumentation along the entire subduction zone would improve its reliability and accuracy, add up to 15 s to the warning time, and ensure an early warning for coastal communities near the epicenter. Furthermore, real-time networks of seafloor pressure sensors above the subduction zone would enable monitoring and contribute to accurate predictions of the incoming tsunami. There is also strong scientific motivation for offshore monitoring. We lack complete knowledge of the plate convergence rate and direction. Measurements of steady deformation and observations of transient processes such as fluid pulsing, microseismic cycles, tremor, and slow slip are necessary for assessing the dimensions of the locked zone and its along-strike segmentation. Long-term monitoring will also provide baseline observations that can be used to detect and evaluate changes in the subduction environment. There are significant engineering challenges to be solved to ensure the system is sufficiently reliable and maintainable. It must provide

  12. Converting Advances in Seismology into Earthquake Science

    NASA Astrophysics Data System (ADS)

    Hauksson, Egill; Shearer, Peter; Vidale, John

    2004-01-01

Federal and state agencies and university groups all operate seismic networks in California. The U.S. Geological Survey (USGS) operates seismic networks in California in cooperation with the California Institute of Technology (Caltech) in southern California, and the University of California (UC) at Berkeley in northern California. The California Geological Survey (CGS) and the USGS National Strong Motion Program (NSMP) operate dial-out strong motion instruments in the state, primarily to capture data from large earthquakes for earthquake engineering and, more recently, emergency response. The California Governor's Office of Emergency Services (OES) provides leadership for the most recent project, the California Integrated Seismic Network (CISN), to integrate all of the California efforts and to take advantage of the emergency response capabilities of the seismic networks. The core members of the CISN are Caltech, UC Berkeley, CGS, USGS Menlo Park, and USGS Pasadena (http://www.cisn.org). New seismic instrumentation is in place across southern California, and significant progress has been made in improving instrumentation in northern California. Since 2001, these new field instrumentation efforts, data sharing, and software development for real-time reporting and archiving have been coordinated through the CISN. The CISN is also the California region of the Advanced National Seismic System (ANSS). In addition, EarthScope deployments of USArray that will begin in early 2004 in California are coordinated with the CISN. The southern and northern California earthquake data centers (SCEDC and NCEDC) have new capabilities that enable seismologists to obtain large volumes of data with only modest effort.

  13. The Quake-Catcher Network: Improving Earthquake Strong Motion Observations Through Community Engagement

    NASA Astrophysics Data System (ADS)

    Cochran, E. S.; Lawrence, J. F.; Christensen, C. M.; Chung, A. I.; Neighbors, C.; Saltzman, J.

    2010-12-01

The Quake-Catcher Network (QCN) involves the community in strong motion data collection by utilizing volunteer computing techniques and low-cost MEMS accelerometers. Volunteer computing provides a mechanism to expand strong-motion seismology with minimal infrastructure costs, while promoting community participation in science. Micro-Electro-Mechanical Systems (MEMS) triaxial accelerometers can be attached to a desktop computer via USB and are internal to many laptops. Preliminary shake table tests show the MEMS accelerometers can record high-quality seismic data with instrument response similar to research-grade strong-motion sensors. QCN began distributing sensors and software to K-12 schools and the general public in April 2008 and has grown to roughly 1500 stations worldwide. We also recently tested whether sensors could be quickly deployed as part of a Rapid Aftershock Mobilization Program (RAMP) following the 2010 M8.8 Maule, Chile earthquake. Volunteers are recruited through media reports, web-based sensor request forms, as well as social networking sites. Using data collected to date, we examine whether a distributed sensing network can provide valuable seismic data for earthquake detection and characterization while promoting community participation in earthquake science. We utilize client-side triggering algorithms to determine when significant ground shaking occurs, and these trigger metadata are sent to the main QCN server. On average, trigger metadata are received within 1-10 seconds from the observation of a trigger; the larger data latencies are correlated with greater server-station distances. When triggers are detected, we determine if the triggers correlate to others in the network using spatial and temporal clustering of incoming trigger information. If a minimum number of triggers are detected, then a QCN event is declared and an initial earthquake location and magnitude are estimated.
Initial analysis suggests that the estimated locations and magnitudes are
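The spatio-temporal clustering step described above can be sketched as a greedy grouping of incoming triggers. This is a minimal illustration under assumed thresholds (10 s, 100 km, 4 triggers), not the actual QCN server logic.

```python
import math

def declare_event(triggers, max_dt=10.0, max_dist_km=100.0, min_triggers=4):
    """Greedy spatio-temporal clustering of (time_s, lat, lon) triggers.
    Returns a crude event estimate (mean location, earliest time) or None."""
    def dist_km(a, b):
        # Flat-earth approximation, adequate for a local-clustering sketch.
        dlat = (a[1] - b[1]) * 111.0
        dlon = (a[2] - b[2]) * 111.0 * math.cos(math.radians((a[1] + b[1]) / 2))
        return math.hypot(dlat, dlon)

    triggers = sorted(triggers)
    for seed in triggers:
        cluster = [t for t in triggers
                   if abs(t[0] - seed[0]) <= max_dt and dist_km(t, seed) <= max_dist_km]
        if len(cluster) >= min_triggers:
            lat = sum(t[1] for t in cluster) / len(cluster)
            lon = sum(t[2] for t in cluster) / len(cluster)
            return {"t0": min(t[0] for t in cluster), "lat": lat, "lon": lon}
    return None

# Four nearby triggers within 10 s -> an event is declared; the distant
# late trigger is ignored. All coordinates are hypothetical.
trigs = [(0.0, 34.00, -118.00), (1.5, 34.05, -118.10),
         (3.0, 33.95, -117.95), (4.2, 34.10, -118.05), (500.0, 40.0, -120.0)]
event = declare_event(trigs)
print(event)
```

Requiring a minimum trigger count before declaring an event is what suppresses the single-station false triggers (dropped laptops, slammed doors) that a volunteer network inevitably produces.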

  14. The Great Maule earthquake: seismicity prior to and after the main shock from amphibious seismic networks

    NASA Astrophysics Data System (ADS)

    Lieser, K.; Arroyo, I. G.; Grevemeyer, I.; Flueh, E. R.; Lange, D.; Tilmann, F. J.

    2013-12-01

The Chilean subduction zone is among the seismically most active plate boundaries in the world, and its coastal ranges suffer a magnitude 8 or larger megathrust earthquake every 10-20 years. The Constitución-Concepción or Maule segment in central Chile between ~35.5°S and 37°S was considered to be a mature seismic gap, having last ruptured in 1835 and being seismically quiet, without any magnitude 4.5 or larger earthquakes reported in global catalogues. It is located to the north of the nucleation area of the 1960 magnitude 9.5 Valdivia earthquake and to the south of the 1928 magnitude 8 Talca earthquake. On 27 February 2010 this segment ruptured in a Mw=8.8 earthquake, nucleating near 36°S and affecting a 500-600 km long segment of the margin between 34°S and 38.5°S. Aftershocks occurred along a roughly 600 km long portion of the central Chilean margin, most of them offshore. Therefore, a network of 30 ocean-bottom seismometers was deployed in the northern portion of the rupture area for a three month period, recording local offshore aftershocks between 20 September 2010 and 25 December 2010. In addition, data from a network of 33 land stations of the GeoForschungsZentrum Potsdam were incorporated, providing ideal coverage of both the rupture plane and areas affected by post-seismic slip as deduced from geodetic data. Aftershock locations are based on automatically detected P wave onsets and a 2.5D velocity model of the combined on- and offshore network. Aftershock seismicity analysis in the northern part of the survey area reveals a well resolved seismically active splay fault in the accretionary prism of the Chilean forearc. Our findings imply that in the northernmost part of the rupture zone, co-seismic slip most likely propagated along the splay fault and not the subduction thrust fault. In addition, the updip limit of aftershocks along the plate interface can be verified to about 40 km landwards from the deformation front. Prior to

  15. Special Issue "Impact of Natural Hazards on Urban Areas and Infrastructure" in the Bulletin of Earthquake Engineering

    NASA Astrophysics Data System (ADS)

    Bostenaru Dan, M.

    2009-04-01

This special issue includes selected papers on the topic of earthquake impact from the sessions held in 2004 in Nice, France, and in 2005 in Vienna, Austria, at the first and second European Geosciences Union General Assemblies, respectively. Since the series started in 1999 in The Hague, Netherlands, the earthquake hazard session has been the most popular. The call in 2004 read: Nature's forces, including earthquakes, floods, landslides, high winds, and volcanic eruptions, can inflict losses on urban settlements and man-made structures such as infrastructure. In Europe, recent years have seen significant losses from earthquakes in south and south-eastern Europe, floods in central Europe, and wind storms in western Europe. Meanwhile, significant progress has been made in understanding disasters. Several scientific fields contribute to a holistic approach to the evaluation of capacities, vulnerabilities, and hazards, the main factors in mitigating urban disasters due to natural hazards. An important part of the session is devoted to the assessment of earthquake shaking and loss scenarios, including both physical damage and human casualties. Early warning and rapid damage evaluation are of utmost importance for addressing the safety of many essential facilities, for emergency management of events, and for disaster response. When an earthquake occurs, strong-motion networks, data processing, and interpretation lead to a preliminary estimate (scenario) of the geographical distribution of damage. Factual information on inflicted damage, such as that obtained from shaking maps or aerial imagery, permits comparison with simulated damage maps in order to define a more accurate picture of the overall losses. Most recent developments towards quantitative and qualitative simulation of natural hazard impacts on urban areas, which provide decision-making support for urban disaster management, and success stories of and lessons learned from disaster

  16. On the reliability of Quake-Catcher Network earthquake detections

    USGS Publications Warehouse

    Yildirim, Battalgazi; Cochran, Elizabeth S.; Chung, Angela I.; Christensen, Carl M.; Lawrence, Jesse F.

    2015-01-01

Over the past two decades, there have been several initiatives to create volunteer‐based seismic networks. The Personal Seismic Network, proposed around 1990, used a short‐period seismograph to record earthquake waveforms using existing phone lines (Cranswick and Banfill, 1990; Cranswick et al., 1993). NetQuakes (Luetgert et al., 2010) deploys triaxial Micro‐Electromechanical Systems (MEMS) sensors in private homes, businesses, and public buildings where there is an Internet connection. Other seismic networks using a dense array of low‐cost MEMS sensors are the Community Seismic Network (Clayton et al., 2012; Kohler et al., 2013) and the Home Seismometer Network (Horiuchi et al., 2009). One main advantage of combining low‐cost MEMS sensors and existing Internet connections in public and private buildings over the traditional networks is the reduction in installation and maintenance costs (Koide et al., 2006). In doing so, it is possible to create a dense seismic network for a fraction of the cost of traditional seismic networks (D’Alessandro and D’Anna, 2013; D’Alessandro, 2014; D’Alessandro et al., 2014).

  17. Seismic design and engineering research at the U.S. Geological Survey

    USGS Publications Warehouse

    1988-01-01

The Engineering Seismology Element of the USGS Earthquake Hazards Reduction Program is responsible for the coordination and operation of the National Strong Motion Network to collect, process, and disseminate earthquake strong-motion data, and for the development of improved methodologies to estimate and predict earthquake ground motion. Instrumental observations of strong ground shaking induced by damaging earthquakes and the corresponding response of man-made structures provide the basis for estimating the severity of shaking from future earthquakes, for earthquake-resistant design, and for understanding the physics of seismogenic failure in the Earth's crust.

  18. Contribution of the Surface and Down-Hole Seismic Networks to the Location of Earthquakes at the Soultz-sous-Forêts Geothermal Site (France)

    NASA Astrophysics Data System (ADS)

    Kinnaert, X.; Gaucher, E.; Kohl, T.; Achauer, U.

    2018-03-01

    Seismicity induced in geo-reservoirs can be a valuable observation to image fractured reservoirs, to characterize hydrological properties, or to mitigate seismic hazard. However, this requires accurate location of the seismicity, which is nowadays an important seismological task in reservoir engineering. The earthquake location (determination of the hypocentres) depends on the model used to represent the medium in which the seismic waves propagate and on the seismic monitoring network. In this work, location uncertainties and location inaccuracies are modeled to investigate the impact of several parameters on the determination of the hypocentres: the picking uncertainty, the numerical precision of picked arrival times, a velocity perturbation and the seismic network configuration. The method is applied to the geothermal site of Soultz-sous-Forêts, which is located in the Upper Rhine Graben (France) and which was subject to detailed scientific investigations. We focus on a massive water injection performed in the year 2000 to enhance the productivity of the well GPK2 in the granitic basement, at approximately 5 km depth, and which induced more than 7000 earthquakes recorded by down-hole and surface seismic networks. We compare the location errors obtained from the joint or the separate use of the down-hole and surface networks. Besides the quantification of location uncertainties caused by picking uncertainties, the impact of the numerical precision of the picked arrival times as provided in a reference catalogue is investigated. The velocity model is also modified to mimic possible effects of a massive water injection and to evaluate its impact on earthquake hypocentres. It is shown that the use of the down-hole network in addition to the surface network provides smaller location uncertainties but can also lead to larger inaccuracies. Hence, location uncertainties would not be well representative of the location errors and interpretation of the seismicity
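One way to see how picking uncertainty propagates into hypocentre uncertainty, as modeled in the study above, is a Monte Carlo experiment: perturb the arrival times and relocate each realization by grid search. The sketch below assumes a homogeneous half-space, a hypothetical four-station surface network, and a 20 ms picking error; the Soultz analyses use realistic velocity models and the actual down-hole/surface geometry.

```python
import math, random

V_P = 5.0  # assumed homogeneous P-wave velocity, km/s

def travel_time(src, sta):
    return math.dist(src, sta) / V_P

def locate(stations, picks, grid):
    """Grid search minimizing the L2 misfit of (pick - travel time),
    with the unknown origin time reduced out as the mean residual."""
    best, best_cost = None, float("inf")
    for src in grid:
        res = [p - travel_time(src, s) for p, s in zip(picks, stations)]
        t0 = sum(res) / len(res)
        cost = sum((r - t0) ** 2 for r in res)
        if cost < best_cost:
            best, best_cost = src, cost
    return best

random.seed(0)
stations = [(0, 0, 0), (4, 0, 0), (0, 4, 0), (4, 4, -0.5)]  # x, y, z in km
true_src = (2.0, 2.0, 5.0)
grid = [(x * 0.5, y * 0.5, z * 0.5)
        for x in range(9) for y in range(9) for z in range(4, 13)]
clean = [travel_time(true_src, s) for s in stations]

# Monte Carlo: add Gaussian picking noise and relocate each realization.
locs = []
for _ in range(50):
    noisy = [t + random.gauss(0, 0.02) for t in clean]  # 20 ms picking error
    locs.append(locate(stations, noisy, grid))
spread = max(math.dist(l, true_src) for l in locs)
print(round(spread, 2))  # scatter of relocated hypocentres, km
```

With noise-free picks the grid search recovers the true node exactly; the scatter of the noisy relocations, dominated by the depth/origin-time trade-off of a surface-only network, illustrates why adding down-hole stations shrinks the uncertainty ellipsoids.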

  19. Earthquake Education in Prime Time

    NASA Astrophysics Data System (ADS)

    de Groot, R.; Abbott, P.; Benthien, M.

    2004-12-01

    Since 2001, the Southern California Earthquake Center (SCEC) has collaborated on several video production projects that feature important topics related to earthquake science, engineering, and preparedness. These projects have also fostered many fruitful and sustained partnerships with a variety of organizations that have a stake in hazard education and preparedness. The Seismic Sleuths educational video first appeared in the spring season 2001 on Discovery Channel's Assignment Discovery. Seismic Sleuths is based on a highly successful curriculum package developed jointly by the American Geophysical Union and The Department of Homeland Security Federal Emergency Management Agency. The California Earthquake Authority (CEA) and the Institute for Business and Home Safety supported the video project. Summer Productions, a company with a reputation for quality science programming, produced the Seismic Sleuths program in close partnership with scientists, engineers, and preparedness experts. The program has aired on the National Geographic Channel as recently as Fall 2004. Currently, SCEC is collaborating with Pat Abbott, a geology professor at San Diego State University (SDSU) on the video project Written In Stone: Earthquake Country - Los Angeles. Partners on this project include the California Seismic Safety Commission, SDSU, SCEC, CEA, and the Insurance Information Network of California. This video incorporates live-action demonstrations, vivid animations, and a compelling host (Abbott) to tell the story about earthquakes in the Los Angeles region. The Written in Stone team has also developed a comprehensive educator package that includes the video, maps, lesson plans, and other supporting materials. We will present the process that facilitates the creation of visually effective, factually accurate, and entertaining video programs. 
We acknowledge the need to have a broad understanding of the literature related to communication, media studies, science education, and

  20. ShakeNet: a portable wireless sensor network for instrumenting large civil structures

    USGS Publications Warehouse

    Kohler, Monica D.; Hao, Shuai; Mishra, Nilesh; Govindan, Ramesh; Nigbor, Robert

    2015-08-03

We report our findings from a U.S. Geological Survey (USGS) National Earthquake Hazards Reduction Program-funded project to develop and test a wireless, portable, strong-motion network of up to 40 triaxial accelerometers for structural health monitoring. The overall goal of the project was to record ambient vibrations for several days from USGS-instrumented structures. Structural health monitoring has important applications in fields like civil engineering and the study of earthquakes. The emergence of wireless sensor networks provides a promising means to such applications. However, while most wireless sensor networks are still in the experimentation stage, very few take into consideration realistic earthquake engineering application requirements. Comprehensive structural health monitoring data require high-resolution vibration sensors and sufficiently high sampling rates, which challenge current wireless sensor network technology in processing capability, storage capacity, and communication bandwidth. The wireless sensor network has to meet expectations set by the wired sensor devices prevalent in the structural health monitoring community. For this project, we built and tested an application-realistic, commercially based, portable, wireless sensor network called ShakeNet for instrumentation of large civil structures, especially for buildings, bridges, or dams after earthquakes. Two to three people can deploy ShakeNet sensors within hours after an earthquake to measure the structural response of the building or bridge during aftershocks. ShakeNet involved the development of a new sensing platform (ShakeBox) running a software suite for networking, data collection, and monitoring. Deployments reported here on a tall building and a large dam were real-world tests of ShakeNet operation, and helped to refine both hardware and software.
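The storage and bandwidth pressures mentioned above are easy to put in numbers. The figures below (40 nodes, triaxial sensors, 200 samples/s at 24-bit resolution) are illustrative assumptions for a back-of-the-envelope budget, not ShakeBox specifications.

```python
def network_budget(nodes=40, channels=3, rate_hz=200, bytes_per_sample=3):
    """Raw data-rate and daily storage budget for a sensor network,
    before any packet or timestamp overhead."""
    per_node_bps = channels * rate_hz * bytes_per_sample * 8   # bits/s per node
    total_bps = nodes * per_node_bps                           # aggregate bits/s
    per_node_day_mb = channels * rate_hz * bytes_per_sample * 86400 / 1e6
    return per_node_bps, total_bps, per_node_day_mb

node_bps, net_bps, day_mb = network_budget()
print(node_bps, net_bps, round(day_mb, 1))  # → 14400 576000 155.5
```

Even these modest assumptions yield ~576 kbit/s aggregate and ~155 MB per node per day of continuous recording, which shows why on-node storage and radio bandwidth, rather than the sensors themselves, constrain the design.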

  1. The network construction of CSELF for earthquake monitoring and its preliminary observation

    NASA Astrophysics Data System (ADS)

    Tang, J.; Zhao, G.; Chen, X.; Bing, H.; Wang, L.; Zhan, Y.; Xiao, Q.; Dong, Z.

    2017-12-01

Electromagnetic (EM) anomalies are among the most sensitive physical phenomena in short-term earthquake precursor studies, and many scientists consider EM monitoring one of the most promising means of earthquake forecasting. However, existing ground-based EM observations face increasing cultural noise and lack coverage at frequencies above 1 Hz. Controlled-source extremely low frequency (CSELF) EM observation is a promising new approach. It offers a high signal-to-noise ratio, large coverage area, and substantial probing depth, which facilitate the identification and capture of anomalous signals, and it can also be used to study variations of the electromagnetic field and changes in the electrical structure of the crustal medium. The first CSELF EM network for earthquake precursor monitoring, comprising 30 observatories, has been constructed in China. The observatories are distributed in the area surrounding Beijing and in the southern part of the North-South Seismic Zone. Each station is equipped with a GMS-07 system made by Metronix. The observations mix controlled-source and natural-source signals: when the controlled source is not transmitting, the natural-source EM signal is recorded. In general, signals at 3-5 frequencies in the 0.1-300 Hz band are transmitted every morning and evening at a fixed time (2 hours in length). Outside transmission times, the natural field is observed over an extended band (0.001-1000 Hz) using three sampling rates: 4096 Hz for the high-frequency band, 256 Hz for the mid-frequency band, and 16 Hz for the low-frequency band. The low-frequency band is recorded continuously all day, while the high- and mid-frequency bands are recorded in slices, acquired cyclically every 10 minutes with lengths of about 4-8 seconds and 64-128 seconds, respectively. All data are automatically processed by a server installed at each observatory. EDI files containing EM field spectra and MT responses, together with time-series files, are sent to the data center via the Internet.

  2. Earthquake source imaging by high-resolution array analysis at regional distances: the 2010 M7 Haiti earthquake as seen by the Venezuela National Seismic Network

    NASA Astrophysics Data System (ADS)

    Meng, L.; Ampuero, J. P.; Rendon, H.

    2010-12-01

Back projection of teleseismic waves based on array processing has become a popular technique for earthquake source imaging, in particular to track the areas of the source that generate the strongest high-frequency radiation. The technique has previously been applied to study the rupture process of the Sumatra earthquake and the supershear rupture of the Kunlun earthquake. Here we attempt to image the Haiti earthquake using data recorded by the Venezuela National Seismic Network (VNSN). The network is composed of 22 broad-band stations with an east-west oriented geometry, located approximately 10 degrees away from Haiti in the direction perpendicular to the Enriquillo fault strike. This is the first opportunity to exploit the privileged position of the VNSN to study large earthquake ruptures in the Caribbean region. It is also a great opportunity to explore back projection of the crustal Pn phase at regional distances, which provides unique complementary insights to teleseismic source inversions. The challenge in the analysis of the 2010 M7.0 Haiti earthquake is its very compact source region, possibly shorter than 30 km, which is below the resolution limit of standard back projection techniques based on beamforming. Results of back projection analysis using the teleseismic USArray data reveal few details of the rupture process. To overcome the classical resolution limit we explored the Multiple Signal Classification (MUSIC) method, a high-resolution array processing technique based on the orthogonality of the signal and noise subspaces of the data covariance matrix, which achieves both enhanced resolution and a better ability to resolve closely spaced sources. We experiment with various synthetic earthquake scenarios to test the resolution. We find that MUSIC provides at least 3 times higher resolution than beamforming. We also study the inherent bias due to the interference of coherent Green’s functions, which leads to a potential quantification

  3. CONEDEP: COnvolutional Neural network based Earthquake DEtection and Phase Picking

    NASA Astrophysics Data System (ADS)

    Zhou, Y.; Huang, Y.; Yue, H.; Zhou, S.; An, S.; Yun, N.

    2017-12-01

We developed an automatic local earthquake detection and phase picking algorithm based on a Fully Convolutional Neural Network (FCN). The FCN algorithm detects and segments certain features (phases) in 3-component seismograms to realize efficient picking. We use the STA/LTA algorithm and a template matching algorithm to construct the training set from seismograms recorded 1 month before and after the Wenchuan earthquake. Precise P and S phases are identified and labeled to construct the training set. Noise data are produced by combining background noise and artificial synthetic noise to form a noise set of the same size as the signal set. Training is performed on GPUs to achieve efficient convergence. Our algorithm shows significantly improved performance in terms of detection rate and precision in comparison with the STA/LTA and template matching algorithms.
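The template-matching step used to build the training set rests on sliding normalized cross-correlation. Below is a minimal pure-Python sketch of that kernel; the template, trace, and the idea of picking the best-correlating lag are illustrative, not the authors' implementation.

```python
import math

def normalized_xcorr(template, trace):
    """Sliding normalized cross-correlation of a template against a trace.
    Returns one correlation coefficient in [-1, 1] per lag."""
    n = len(template)
    tm = sum(template) / n
    t0 = [x - tm for x in template]
    tnorm = math.sqrt(sum(x * x for x in t0))
    out = []
    for lag in range(len(trace) - n + 1):
        win = trace[lag:lag + n]
        wm = sum(win) / n
        w0 = [x - wm for x in win]
        wnorm = math.sqrt(sum(x * x for x in w0))
        num = sum(a * b for a, b in zip(t0, w0))
        out.append(num / (tnorm * wnorm) if tnorm * wnorm > 0 else 0.0)
    return out

# Hypothetical trace containing a scaled copy of the template at sample 5;
# normalization makes the detector amplitude-independent.
template = [0.0, 1.0, -1.0, 0.5]
trace = [0.1, 0.0, -0.1, 0.05, 0.0, 0.0, 2.0, -2.0, 1.0, 0.0]
cc = normalized_xcorr(template, trace)
best = max(range(len(cc)), key=lambda i: cc[i])
print(best, round(cc[best], 3))  # → 5 1.0
```

Because the correlation is normalized, a weak aftershock with the same waveform shape scores as highly as a strong one, which is what makes template matching effective for harvesting small labeled events near a mainshock.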

  4. Remotely Triggered Earthquakes Recorded by EarthScope's Transportable Array and Regional Seismic Networks: A Case Study Of Four Large Earthquakes

    NASA Astrophysics Data System (ADS)

    Velasco, A. A.; Cerda, I.; Linville, L.; Kilb, D. L.; Pankow, K. L.

    2013-05-01

Changes in field stress required to trigger earthquakes have been classified in two basic ways: static and dynamic triggering. Static triggering occurs when an earthquake that releases accumulated strain along a fault stress-loads a nearby fault. Dynamic triggering occurs when an earthquake is induced by the passing of seismic waves from a large mainshock located at least two or more fault lengths from the epicenter of the main shock. We investigate details of dynamic triggering using data collected from EarthScope's USArray and regional seismic networks located in the United States. Triggered events are identified using an optimized automated detector based on the ratio of short term to long term average (Antelope software). Following the automated processing, the flagged waveforms are individually analyzed, in both the time and frequency domains, to determine if the increased detection rates correspond to local earthquakes (i.e., potentially remotely triggered aftershocks). Here, we show results using this automated scheme applied to data from four large, but characteristically different, earthquakes: Chile (Mw 8.8, 2010), Tohoku-Oki (Mw 9.0, 2011), Baja California (Mw 7.2, 2010), and Wells, Nevada (Mw 6.0, 2008). For each of our four mainshocks, the number of detections within the 10 hour time windows spans a large range (1 to over 200), and statistically >20% of the waveforms show evidence of anomalous signals following the mainshock. The results will help provide a better understanding of the physical mechanisms involved in dynamic earthquake triggering and will help identify zones in the continental U.S. that may be more susceptible to dynamic earthquake triggering.
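The short-term/long-term average detector at the core of the automated processing can be sketched in a few lines. Window lengths, the threshold, and the synthetic trace below are illustrative assumptions, not the Antelope parameters.

```python
def sta_lta(trace, nsta=5, nlta=50, threshold=4.0):
    """Return sample indices where the STA/LTA ratio of |amplitude|
    first exceeds the threshold (a classic energy-based trigger)."""
    env = [abs(x) for x in trace]
    triggers = []
    armed = True
    for i in range(nlta, len(env)):
        sta = sum(env[i - nsta:i]) / nsta      # short-term average
        lta = sum(env[i - nlta:i]) / nlta      # long-term average
        ratio = sta / lta if lta > 0 else 0.0
        if ratio > threshold and armed:
            triggers.append(i)
            armed = False                      # de-trigger until ratio relaxes
        elif ratio < threshold / 2:
            armed = True
    return triggers

# Low-level noise with an impulsive arrival starting at sample 80.
trace = [0.1] * 80 + [2.0] * 10 + [0.1] * 30
print(sta_lta(trace))  # → [82]
```

The short window reacts quickly to an impulsive arrival while the long window tracks the background level, so the ratio spikes at onsets; the de-arm/re-arm logic prevents one arrival from producing a burst of duplicate detections.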

  5. The 2006 Java Earthquake revealed by the broadband seismograph network in Indonesia

    NASA Astrophysics Data System (ADS)

    Nakano, M.; Kumagai, H.; Miyakawa, K.; Yamashina, T.; Inoue, H.; Ishida, M.; Aoi, S.; Morikawa, N.; Harjadi, P.

    2006-12-01

    On May 27, 2006, local time, a moderate-size earthquake (Mw = 6.4) occurred in central Java. This earthquake caused severe damage near Yogyakarta City and killed more than 5700 people. To estimate the source mechanism and location of this earthquake, we performed a waveform inversion of the broadband seismograms recorded by a nationwide seismic network in Indonesia (Realtime-JISNET). Realtime-JISNET is part of the broadband seismograph network developed through international cooperation among Indonesia, Germany, China, and Japan, aiming at improving the capability to monitor seismic activity and tsunami generation in Indonesia. Twelve stations in Realtime-JISNET were in operation when the earthquake occurred. We used the three-component seismograms from the two closest stations, which were located about 100 and 300 km from the source. In our analysis, we assumed a pure double-couple source mechanism, thus reducing the number of free parameters in the waveform inversion; this allowed us to estimate the source mechanism stably from signals observed at a small number of seismic stations. We carried out a grid search with respect to strike, dip, and rake angles to investigate fault orientation and slip direction, determining source-time functions of the moment-tensor components in the frequency domain for each set of angles. We also conducted a spatial grid search to find the best-fit source location. The best-fit source was approximately 12 km SSE of Yogyakarta at a depth of 10 km below sea level, immediately below the area of extensive damage. The focal mechanism indicates that this earthquake was caused by compressive stress in the NS direction, with dominant strike-slip motion. The moment magnitude (Mw) was 6.4. We estimated the seismic intensity in the areas of severe damage using the source parameters and an empirical attenuation relation for the averaged peak ground velocity (PGV) of horizontal seismic motion.
We then calculated the
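    The grid search over strike, dip, and rake described in this record can be sketched generically. The toy forward model below is a hypothetical stand-in for the real Green's-function synthetics used in the inversion; only the search structure is illustrated:

```python
import numpy as np
from itertools import product

def grid_search_mechanism(observed, forward, strikes, dips, rakes):
    """Exhaustive grid search over (strike, dip, rake) minimizing an
    L2 waveform misfit. `forward(strike, dip, rake)` must return a
    synthetic waveform with the same shape as `observed`."""
    best, best_misfit = None, np.inf
    for s, d, r in product(strikes, dips, rakes):
        misfit = float(np.sum((observed - forward(s, d, r)) ** 2))
        if misfit < best_misfit:
            best, best_misfit = (s, d, r), misfit
    return best, best_misfit

def toy_forward(strike, dip, rake, t=np.linspace(0.0, 1.0, 200)):
    # Hypothetical forward model: phase, amplitude, and offset each
    # depend on one angle, so the truth is uniquely recoverable.
    return (np.sin(2 * np.pi * t + np.radians(strike))
            * np.cos(np.radians(dip)) + 0.1 * np.radians(rake))

observed = toy_forward(40, 30, 90)
best, misfit = grid_search_mechanism(
    observed, toy_forward,
    strikes=range(0, 360, 10), dips=range(0, 91, 10), rakes=range(-90, 91, 10))
# best recovers (40, 30, 90) with zero misfit
```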

  6. Network Structure and Community Evolution on Twitter: Human Behavior Change in Response to the 2011 Japanese Earthquake and Tsunami

    PubMed Central

    Lu, Xin; Brelsford, Christa

    2014-01-01

    To investigate the dynamics of social networks and the formation and evolution of online communities in response to extreme events, we collected three datasets from Twitter shortly before and after the 2011 earthquake and tsunami in Japan. We find that while almost all users increased their online activity after the earthquake, Japanese speakers, who are assumed to be more directly affected by the event, expanded the network of people they interact with to a much higher degree than English speakers or the global average. By investigating the evolution of communities, we find that the behavior of joining or quitting a community is far from random: users tend to stay in their current status and are less likely to join new communities from a solitary state or to shift from their current community to another. While non-Japanese speakers did not change their conversation topics significantly after the earthquake, nearly all Japanese users changed their conversations to earthquake-related content. This study builds a systematic framework for investigating human behaviors under extreme events with online social network data, and our findings on the dynamics of networks and communities may provide useful insight for understanding how patterns of social interaction are influenced by extreme events. PMID:25346468

  7. Network Structure and Community Evolution on Twitter: Human Behavior Change in Response to the 2011 Japanese Earthquake and Tsunami

    NASA Astrophysics Data System (ADS)

    Lu, Xin; Brelsford, Christa

    2014-10-01

    To investigate the dynamics of social networks and the formation and evolution of online communities in response to extreme events, we collected three datasets from Twitter shortly before and after the 2011 earthquake and tsunami in Japan. We find that while almost all users increased their online activity after the earthquake, Japanese speakers, who are assumed to be more directly affected by the event, expanded the network of people they interact with to a much higher degree than English speakers or the global average. By investigating the evolution of communities, we find that the behavior of joining or quitting a community is far from random: users tend to stay in their current status and are less likely to join new communities from a solitary state or to shift from their current community to another. While non-Japanese speakers did not change their conversation topics significantly after the earthquake, nearly all Japanese users changed their conversations to earthquake-related content. This study builds a systematic framework for investigating human behaviors under extreme events with online social network data, and our findings on the dynamics of networks and communities may provide useful insight for understanding how patterns of social interaction are influenced by extreme events.

  8. Principles for selecting earthquake motions in engineering design of large dams

    USGS Publications Warehouse

    Krinitzsky, E.L.; Marcuson, William F.

    1983-01-01

    This report gives a synopsis of the various tools and techniques used in selecting earthquake ground motion parameters for large dams. It presents 18 charts giving newly developed relations for acceleration, velocity, and duration versus site earthquake intensity for near- and far-field hard and soft sites and earthquakes having magnitudes above and below 7. The material for this report is based on procedures developed at the Waterways Experiment Station. Although these procedures are suggested primarily for large dams, they may also be applicable to other facilities. Because no standard procedure exists for selecting earthquake motions in engineering design of large dams, a number of precautions are presented to guide users. The selection of earthquake motions depends on which of two types of engineering analysis is performed. A pseudostatic analysis uses a coefficient usually obtained from an appropriate contour map, whereas a dynamic analysis uses either accelerograms assigned to a site or specified response spectra. Each type of analysis requires significantly different input motions. All selections of design motions must allow for the lack of representative strong-motion records, especially near-field motions from earthquakes of magnitude 7 and greater, as well as an enormous spread in the available data. Limited data must be projected and their spread bracketed in order to fill in the gaps and to assure that there will be no surprises. Because each site may have differing special characteristics in its geology, seismic history, attenuation, recurrence, interpreted maximum events, etc., an integrated approach gives the best results. Each part of the site investigation requires a number of decisions. In some cases, a 'least work' approach may be suitable, simply assuming the worst of several possibilities and testing for it. Because there are no standard procedures to follow, multiple approaches are useful.
For example, peak motions at

  9. A stochastic automata network for earthquake simulation and hazard estimation

    NASA Astrophysics Data System (ADS)

    Belubekian, Maya Ernest

    1998-11-01

    This research develops a model for simulation of earthquakes on seismic faults with available earthquake catalog data. The model allows estimation of the seismic hazard at a site of interest and assessment of the potential damage and loss in a region. There are two approaches to studying earthquakes: mechanistic and stochastic. In the mechanistic approach, seismic processes, such as changes in stress or slip on faults, are studied in detail. In the stochastic approach, earthquake occurrences are simulated as realizations of a certain stochastic process. In this dissertation, a stochastic earthquake occurrence model is developed that uses the results from dislocation theory for the estimation of slip released in earthquakes. The slip accumulation and release laws and the event scheduling mechanism adopted in the model result in a memoryless Poisson process for the small and moderate events and in a time- and space-dependent process for large events. The minimum and maximum of the hazard are estimated by the model when the initial conditions along the faults correspond to a situation right after the largest event and after a long seismic gap, respectively. These estimates are compared with the ones obtained from a Poisson model. The Poisson model overestimates the hazard after the maximum event and underestimates it in the period of a long seismic quiescence. The earthquake occurrence model is formulated as a stochastic automata network. Each fault is divided into cells, or automata, that interact by means of information exchange. The model uses a statistical method called the bootstrap for the evaluation of the confidence bounds on its results. The parameters of the model are adjusted to the target magnitude patterns obtained from the catalog. A case study is presented for the city of Palo Alto, where the hazard is controlled by the San Andreas, Hayward and Calaveras faults. 
The results of the model are used to evaluate the damage and loss distribution in Palo Alto
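    The memoryless Poisson occurrence model used here for small and moderate events has a closed-form exceedance probability, P = 1 − exp(−λt). A minimal sketch (the rate and exposure values are illustrative, not from the dissertation):

```python
import math

def poisson_exceedance(rate_per_year, exposure_years):
    """P(at least one event within the exposure window) for a
    memoryless Poisson occurrence model: 1 - exp(-rate * t)."""
    return 1.0 - math.exp(-rate_per_year * exposure_years)

# A fault with a 100-year mean recurrence, over a 50-year design life:
p = poisson_exceedance(1 / 100, 50)  # ~0.39
```

    This memorylessness is exactly what the dissertation argues breaks down for large events, whose hazard depends on the time elapsed since the last large rupture.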

  10. Methodology for earthquake rupture rate estimates of fault networks: example for the western Corinth rift, Greece

    NASA Astrophysics Data System (ADS)

    Chartier, Thomas; Scotti, Oona; Lyon-Caen, Hélène; Boiselet, Aurélien

    2017-10-01

    Modeling the seismic potential of active faults is a fundamental step of probabilistic seismic hazard assessment (PSHA). An accurate estimation of the rate of earthquakes on the faults is necessary in order to obtain the probability of exceedance of a given ground motion. Most PSHA studies consider faults as independent structures and neglect the possibility of multiple faults or fault segments rupturing simultaneously (fault-to-fault, FtF, ruptures). The Uniform California Earthquake Rupture Forecast version 3 (UCERF-3) model takes this possibility into account by considering a system-level approach rather than an individual-fault-level approach, using geological, seismological and geodetic information to invert the earthquake rates. In many places of the world, seismological and geodetic information along fault networks is not well constrained. There is therefore a need for a methodology relying on geological information alone to compute earthquake rates of the faults in the network. In the proposed methodology, a simple distance criterion is used to define FtF ruptures, and the choice between single-fault and FtF ruptures is treated as an aleatory uncertainty, similarly to UCERF-3. Rates of earthquakes on faults are then computed following two constraints: the magnitude-frequency distribution (MFD) of earthquakes in the fault system as a whole must follow an a priori chosen shape, and the rate of earthquakes on each fault is determined by the specific slip rate of each segment depending on the possible FtF ruptures. The modeled earthquake rates are then compared to the available independent data (geodetic, seismological and paleoseismological data) in order to weight the different hypotheses explored in a logic tree. The methodology is tested on the western Corinth rift (WCR), Greece, where recent advancements have been made in the understanding of the geological slip rates of the complex network of normal faults which are accommodating the ~15 mm yr-1 north
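    The distance criterion for defining FtF ruptures can be sketched as a graph problem: link faults whose traces come within a jump distance, then take connected groups as candidate multi-fault ruptures. The 5 km default below is an assumed placeholder, not the paper's threshold:

```python
import numpy as np
from itertools import combinations

def ftf_ruptures(fault_traces, max_jump_km=5.0):
    """Group faults into candidate fault-to-fault (FtF) rupture sets
    under a simple distance criterion.

    fault_traces : dict mapping fault name -> (N, 2) array of trace
    coordinates in km. Two faults may rupture together if their
    closest points lie within max_jump_km (assumed value here);
    connected groups define the candidate FtF ruptures.
    """
    names = list(fault_traces)
    adj = {n: set() for n in names}
    for a, b in combinations(names, 2):
        # minimum point-to-point distance between the two traces
        d = np.min(np.linalg.norm(
            fault_traces[a][:, None, :] - fault_traces[b][None, :, :], axis=-1))
        if d <= max_jump_km:
            adj[a].add(b)
            adj[b].add(a)
    seen, groups = set(), []
    for n in names:            # connected components via DFS
        if n in seen:
            continue
        stack, comp = [n], set()
        while stack:
            x = stack.pop()
            if x not in comp:
                comp.add(x)
                stack.extend(adj[x] - comp)
        seen |= comp
        groups.append(sorted(comp))
    return groups

# Two nearby faults (2 km apart) and one distant fault (coordinates in km):
traces = {
    "A": np.array([[0.0, 0.0], [10.0, 0.0]]),
    "B": np.array([[12.0, 0.0], [20.0, 0.0]]),
    "C": np.array([[100.0, 100.0], [110.0, 100.0]]),
}
groups = ftf_ruptures(traces)  # -> [['A', 'B'], ['C']]
```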

  11. Designing an Earthquake-Proof Art Museum: An Arts- and Engineering-Integrated Science Lesson

    ERIC Educational Resources Information Center

    Carignan, Anastasia; Hussain, Mahjabeen

    2016-01-01

    In this practical arts-integrated science and engineering lesson, an inquiry-based approach was adopted to teach a class of fourth graders in a Midwest elementary school about the scientific concepts of plate tectonics and earthquakes. Lessons were prepared following the 5 E instructional model. Next Generation Science Standards (4-ESS3-2) and the…

  12. Analog earthquakes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hofmann, R.B.

    1995-09-01

    Analogs are used to understand complex or poorly understood phenomena for which little data may be available at the actual repository site. Earthquakes are complex phenomena, and they can have a large number of effects on the natural system, as well as on engineered structures. Instrumental data close to the source of large earthquakes are rarely obtained. The rare events for which measurements are available may be used, with modifications, as analogs for potential large earthquakes at sites where no earthquake data are available. In the following, several examples of nuclear reactor and liquefied natural gas facility siting are discussed. A potential use of analog earthquakes is proposed for a high-level nuclear waste (HLW) repository.

  13. Transportations Systems Modeling and Applications in Earthquake Engineering

    DTIC Science & Technology

    2010-07-01

    ... Memphis, Tennessee. The NMSZ was responsible for the devastating 1811-1812 New Madrid earthquakes, the largest earthquakes ever recorded in the ... [Snippet fragments: Figure 6, PGA map of a M7.7 earthquake on all three New Madrid fault segments (g); Table 1, fragility parameters for MSC steel bridge (Padgett 2007).]

  14. Rapid earthquake hazard and loss assessment for Euro-Mediterranean region

    NASA Astrophysics Data System (ADS)

    Erdik, Mustafa; Sesetyan, Karin; Demircioglu, Mine; Hancilar, Ufuk; Zulfikar, Can; Cakti, Eser; Kamer, Yaver; Yenidogan, Cem; Tuzun, Cuneyt; Cagnan, Zehra; Harmandar, Ebru

    2010-10-01

    The near-real-time estimation of ground shaking and losses after a major earthquake in the Euro-Mediterranean region was performed in the framework of the Joint Research Activity 3 (JRA-3) component of the EU FP6 project "Network of Research Infrastructures for European Seismology (NERIES)". The system finds the most likely location of the earthquake source by estimating the fault rupture parameters through rapid inversion of data from on-line regional broadband stations. It also estimates the spatial distribution of selected site-specific ground motion parameters at engineering bedrock through region-specific ground motion prediction equations (GMPEs) or physical simulation of ground motion. Using the Earthquake Loss Estimation Routine (ELER) software, the multi-level methodology developed for real-time loss estimation is capable of incorporating regional variability and the sources of uncertainty stemming from GMPEs, fault finiteness, site modifications, the inventory of physical and social elements exposed to earthquake hazard, and the associated vulnerability relationships.
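    The GMPE step can be illustrated with the common functional form ln(PGA) = a + b·M − c·ln(R). The coefficients below are placeholders for illustration only, not those of any published regional GMPE used in NERIES:

```python
import math

def toy_gmpe_pga(magnitude, distance_km, a=-1.0, b=0.5, c=1.0):
    """Toy ground-motion prediction equation of the common form
    ln(PGA) = a + b*M - c*ln(R). Coefficients are illustrative
    placeholders; real GMPEs are regression fits to regional data
    and include site terms and aleatory variability."""
    return math.exp(a + b * magnitude - c * math.log(distance_km))

# Predicted shaking grows with magnitude and decays with distance:
near = toy_gmpe_pga(7.0, 10.0)
far = toy_gmpe_pga(7.0, 100.0)
```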

  15. The contribution of engineering surveys by means of GPS to the determination of crustal movements in Istanbul

    NASA Astrophysics Data System (ADS)

    Özyaşar, M.; Özlüdemir, M. T.

    2011-06-01

    Global Navigation Satellite Systems (GNSS) are space-based positioning techniques widely used in geodetic applications. Geodetic networking accomplished by engineering surveys constitutes one of these tasks. Geodetic networks are used as the base of all kinds of geodetic implementations, from cadastral plans to the surveying processes carried out during the realization of engineering applications. Geodetic networks consist of control points positioned in a defined reference frame. Such positional information can also be useful for other studies. One such field is geodynamics, where the changes in the positions of control stations within a network over a certain time period are used to understand the characteristics of tectonic movements. In Turkey, which is located in tectonically active zones and struck by major earthquakes quite frequently, the positional information obtained in engineering surveys can be very useful for earthquake-related studies. For this purpose, a GPS (Global Positioning System) network of 650 stations distributed over Istanbul (Istanbul GPS Triangulation Network, abbreviated IGNA) covering the northern part of the North Anatolian Fault Zone (NAFZ) was established in 1997 and measured in 1999. From 1998 to 2004, the IGNA network was extended to 1888 stations covering an area of about 6000 km2, the whole administrative area of Istanbul. All 1888 stations within the IGNA network were remeasured in 2005. The two campaigns shared 452 common points, and between the two campaigns two major earthquakes took place, on 17 August and 12 November 1999, with magnitudes of 7.4 and 7.2, respectively. Several studies conducted to estimate the horizontal and vertical displacements caused by these earthquakes on the NAFZ are discussed in this paper. In geodynamic projects carried out before the 1999 earthquakes, an average annual velocity of 2-2.5 cm for the stations along the NAFZ was estimated
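    The velocity estimation underlying such campaign comparisons reduces, in its simplest form, to displacement of the common stations divided by the time between campaigns. A minimal sketch with a hypothetical station displacement (not data from the IGNA network):

```python
import numpy as np

def campaign_velocities(pos_first, pos_second, t_first, t_second):
    """Average station velocities between two GPS campaigns.

    pos_* : (N, 2) arrays of easting/northing (metres) for the
    stations common to both campaigns; returns metres per year.
    Real processing also involves reference-frame alignment and
    uncertainty propagation, which are omitted here.
    """
    return (pos_second - pos_first) / (t_second - t_first)

# Hypothetical station displaced 0.135 m east between 1999 and 2005:
v = campaign_velocities(np.array([[0.0, 0.0]]),
                        np.array([[0.135, 0.0]]), 1999.0, 2005.0)
# v[0, 0] = 0.0225 m/yr, i.e. 2.25 cm/yr -- the order of the
# 2-2.5 cm/yr NAFZ rates quoted above.
```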

  16. Natural gas network resiliency to a "shakeout scenario" earthquake.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ellison, James F.; Corbet, Thomas Frank,; Brooks, Robert E.

    2013-06-01

    A natural gas network model was used to assess the likely impact of a scenario San Andreas Fault earthquake on the natural gas network. Two disruption scenarios were examined. The more extensive damage scenario assumes the disruption of all three major corridors bringing gas into southern California. If withdrawals from the Aliso Canyon storage facility are limited to keep the amount of stored gas within historical levels, the disruption reduces Los Angeles Basin gas supplies by 50%. If Aliso Canyon withdrawals are constrained only by the physical capacity of the storage system to withdraw gas, the shortfall is reduced to 25%. This result suggests that it is important for stakeholders to put agreements in place facilitating the withdrawal of Aliso Canyon gas in the event of an emergency.

  17. Earthquakes, Cities, and Lifelines: lessons integrating tectonics, society, and engineering in middle school Earth Science

    NASA Astrophysics Data System (ADS)

    Toke, N.; Johnson, A.; Nelson, K.

    2010-12-01

    Earthquakes are one of the geologic processes most widely covered by the media. As a result, students, even at the middle school level, arrive in the classroom with preconceptions about the importance and hazards posed by earthquakes. Earthquakes therefore represent not only an attractive topic for engaging students when introducing tectonics, but also a means to help students understand the relationships between geologic processes, society, and engineering solutions. Facilitating understanding of the fundamental connections between science and society is important for the preparation of future scientists and engineers as well as informed citizens. Here, we present a week-long lesson designed to be implemented in five one-hour sessions with classes of ~30 students. It consists of two inquiry-based mapping investigations, motivational presentations, and short readings that describe fundamental models of plate tectonics, faults, and earthquakes. The readings also provide examples of engineering solutions, such as the Alaskan oil pipeline, which withstood multi-meter surface offset in the 2002 Denali Earthquake. The first inquiry-based investigation is a lesson on tectonic plates. Working in small groups, each group receives a different world map plotting both topography and one of the following data sets: GPS plate motion vectors, the locations and types of volcanoes, or the locations and types of earthquakes. Using these maps and an accompanying explanation of the data, each group’s task is to map plate boundary locations. Each group then presents a ~10 minute summary of the type of data they used and their interpretation of the tectonic plates with a poster and their mapping results. Finally, the instructor facilitates a class discussion about how the data types could be combined to understand more about plate boundaries. Using student interpretations of real data allows student misconceptions to become apparent. Throughout the exercise we record student preconceptions

  18. Engineering geological aspect of Gorkha Earthquake 2015, Nepal

    NASA Astrophysics Data System (ADS)

    Adhikari, Basanta Raj; Andermann, Christoff; Cook, Kristen

    2016-04-01

    Strong shaking from earthquakes causes massive landsliding with severe effects on infrastructure and human lives. The distribution of landslides and other hazards depends on the combination of earthquake and local characteristics that influence the dynamic response of hillslopes. The Himalayas are one of the most active mountain belts, with several kilometers of relief, and are very prone to catastrophic mass failure. Strong and shallow earthquakes are very common and cause widespread collapse of hillslopes, increasing the background landslide rate by several orders of magnitude. The Himalaya has experienced many small and large earthquakes in the past, e.g. the 1934 Bihar-Nepal earthquake (Ms 8.2), the large 1905 Kangra earthquake (Ms 7.8), and the 2015 Gorkha earthquake (Mw 7.8). The Gorkha earthquake occurred on and around the Main Himalayan Thrust at a hypocentral depth of 15 km (GEER 2015) and was followed by a Mw 7.3 aftershock near Kodari, causing 8700+ deaths and leaving hundreds of thousands homeless. Most of the 3000 aftershocks located by the National Seismological Center (NSC) within the first 45 days following the Gorkha earthquake are concentrated in a narrow 40 km-wide band at midcrustal to shallow depth along the strike of the southern slope of the high Himalaya (Adhikari et al. 2015), and the ground shaking was substantially lower in the short-period range than would be expected for an earthquake of this magnitude (Moss et al. 2015). The effects of this earthquake are distinctive in the affected areas, including topographic effects, liquefaction and land subsidence. More than 5000 landslides were triggered by this earthquake (Earthquake without Frontiers, 2015). Most of the landslides are shallow, occurred in weathered bedrock, and appear to have mobilized primarily as raveling failures, rock slides and rock falls. The majority of landslides are limited to a zone which runs east-west, approximately parallel to the Lesser and Higher Himalaya. There are numerous cracks in

  19. The California Integrated Seismic Network

    NASA Astrophysics Data System (ADS)

    Hellweg, M.; Given, D.; Hauksson, E.; Neuhauser, D.; Oppenheimer, D.; Shakal, A.

    2007-05-01

    The mission of the California Integrated Seismic Network (CISN) is to operate a reliable, modern system to monitor earthquakes throughout the state; to generate and distribute information in real-time for emergency response, for the benefit of public safety, and for loss mitigation; and to collect and archive data for seismological and earthquake engineering research. To meet these needs, the CISN operates data processing and archiving centers, as well as more than 3000 seismic stations. Furthermore, the CISN is actively developing and enhancing its infrastructure, including its automated processing and archival systems. The CISN integrates seismic and strong motion networks operated by the University of California Berkeley (UCB), the California Institute of Technology (Caltech), and the United States Geological Survey (USGS) offices in Menlo Park and Pasadena, as well as the USGS National Strong Motion Program (NSMP), and the California Geological Survey (CGS). The CISN operates two earthquake management centers (the NCEMC and SCEMC) where statewide, real-time earthquake monitoring takes place, and an engineering data center (EDC) for processing strong motion data and making it available in near real-time to the engineering community. These centers employ redundant hardware to minimize disruptions to the earthquake detection and processing systems. At the same time, dual feeds of data from a subset of broadband and strong motion stations are telemetered in real-time directly to both the NCEMC and the SCEMC to ensure the availability of statewide data in the event of a catastrophic failure at one of these two centers. The CISN uses a backbone T1 ring (with automatic backup over the internet) to interconnect the centers and the California Office of Emergency Services. The T1 ring enables real-time exchange of selected waveforms, derived ground motion data, phase arrivals, earthquake parameters, and ShakeMaps. With the goal of operating similar and redundant

  20. Examining Science Teachers' Argumentation in a Teacher Workshop on Earthquake Engineering

    NASA Astrophysics Data System (ADS)

    Cavlazoglu, Baki; Stuessy, Carol

    2018-02-01

    The purpose of this study was to examine changes in the quality of science teachers' argumentation as a result of their engagement in a teacher workshop on earthquake engineering emphasizing distributed learning approaches, which included concept mapping, collaborative game playing, and group lesson planning. The participants were ten high school science teachers from US high schools who elected to attend the workshop. To begin and end the teacher workshop, teachers in small groups engaged in concept mapping exercises with other teachers. Researchers audio-recorded individual teachers' argumentative statements about the inclusion of earthquake engineering concepts in their concept maps, which were then analyzed to reveal the quality of teachers' argumentation. Toulmin's argumentation model formed the framework for designing a classification schema to analyze the quality of participants' argumentative statements. While the analysis of differences in pre- and post-workshop concept mapping exercises revealed that the number of argumentative statements did not change significantly, the quality of participants' argumentation did increase significantly. As these differences occurred concurrently with distributed learning approaches used throughout the workshop, these results provide evidence to support distributed learning approaches in professional development workshop activities to increase the quality of science teachers' argumentation. Additionally, these results support the use of concept mapping as a cognitive scaffold to organize participants' knowledge, facilitate the presentation of argumentation, and as a research tool for providing evidence of teachers' argumentation skills.

  1. Training Course in Geotechnical and Foundation Engineering. Geotechnical Earthquake Engineering: Reference Manual. Chapters 4, Ground Motion Characterization, and 8, Liquefaction and Seismic Settlement.

    DOT National Transportation Integrated Search

    1998-12-01

    This manual was written to provide training on how to apply principles of geotechnical earthquake engineering to the planning, design, and retrofit of highway facilities. Reproduced here are two chapters, 4 and 8, covering ground motion characterization and liquefaction and seismic settlement, respectively. These cha...

  2. Community Seismic Network (CSN)

    NASA Astrophysics Data System (ADS)

    Clayton, R. W.; Heaton, T. H.; Kohler, M. D.; Chandy, M.; Krause, A.

    2010-12-01

    In collaboration with computer scientists and earthquake engineers, we are developing a dense network of low-cost accelerometers that send their data via the Internet to a cloud-based center. The goal is to make block-by-block measurements of ground shaking in urban areas, which will provide emergency response information in the case of large earthquakes, and an unprecedented high-frequency seismic array to study structure and the earthquake process under moderate shaking. When deployed in high-rise buildings, the sensors can be used to monitor the state of health of the structure. The sensors are capable of a resolution of approximately 80 micro-g, connect via USB ports to desktop computers, and cost about $100 each. The network will adapt to its environment by using network-wide machine learning to adjust the picking sensitivity. We are also looking into using other motion-sensing devices such as cell phones. For a pilot project, we plan to deploy more than 1000 sensors in the greater Pasadena area. The system is easily adaptable to other seismically vulnerable urban areas.

  3. Assessment of Simulated Ground Motions in Earthquake Engineering Practice: A Case Study for Duzce (Turkey)

    NASA Astrophysics Data System (ADS)

    Karimzadeh, Shaghayegh; Askan, Aysegul; Yakut, Ahmet

    2017-09-01

    Simulated ground motions can be used in structural and earthquake engineering practice as an alternative to, or to augment, real ground motion data sets. Common engineering applications of simulated motions are linear and nonlinear time history analyses of building structures, where full acceleration records are necessary. Before using simulated ground motions in such applications, it is important to assess them in terms of their frequency and amplitude content as well as their match with the corresponding real records. In this study, a framework is outlined for assessing simulated ground motions in terms of their use in structural engineering. Misfit criteria are determined for both ground motion parameters and structural response by comparing the simulated values against the corresponding real values. For this purpose, as a case study, the 12 November 1999 Duzce earthquake is simulated using a stochastic finite-fault methodology. Simulated records are employed for time history analyses of frame models of typical residential buildings. Next, the relationships between ground motion misfits and structural response misfits are studied. Results show that the seismological misfits around the fundamental period of the selected buildings determine the accuracy of the simulated responses in terms of their agreement with the observed responses.
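    The misfit comparison described here can be sketched, in its simplest normalized form, as |simulated − observed| / |observed| for each scalar parameter. The record does not give the authors' exact misfit definitions, so this is an assumed minimal version:

```python
def relative_misfit(simulated, observed):
    """Normalized misfit |sim - obs| / |obs| for a scalar ground-motion
    or structural-response parameter (e.g. PGA, PGV, spectral
    acceleration at the fundamental period, peak drift). The paper's
    actual misfit criteria may differ; this is the simplest form."""
    return abs(simulated - observed) / abs(observed)

# e.g. simulated spectral acceleration 0.45 g vs. observed 0.50 g:
m = relative_misfit(0.45, 0.50)  # ~0.1, i.e. a 10% mismatch
```

    Computing this both for seismological parameters and for structural responses, then correlating the two sets, mirrors the comparison the study performs.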

  4. Strong motion observations and recordings from the great Wenchuan Earthquake

    USGS Publications Warehouse

    Li, X.; Zhou, Z.; Yu, H.; Wen, R.; Lu, D.; Huang, M.; Zhou, Y.; Cu, J.

    2008-01-01

    The National Strong Motion Observation Network System (NSMONS) of China is briefly introduced in this paper. The NSMONS consists of permanent free-field stations, special observation arrays, mobile observatories and a network management system. During the Wenchuan Earthquake main shock, over 1,400 components of acceleration records were obtained from 460 permanent free-field stations and three arrays for topographic-effect and structural-response observation in the network system, and over 20,000 components of acceleration records from strong aftershocks that occurred before August 1, 2008 were also obtained by permanent free-field stations of the NSMONS and 59 mobile instruments quickly deployed after the main shock. The strong motion recordings from the main shock and strong aftershocks are summarized in this paper. Among the ground motion recordings, there are over 560 components with peak ground acceleration (PGA) over 10 Gal, the largest being 957.7 Gal. The largest PGA recorded during the aftershocks exceeds 300 Gal. © 2008 Institute of Engineering Mechanics, China Earthquake Administration and Springer-Verlag GmbH.
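    The PGA tallies quoted above reduce to taking the maximum absolute amplitude of each acceleration component and counting those over a threshold. A minimal sketch with made-up sample records (1 Gal = 1 cm/s²):

```python
import numpy as np

def pga_gal(acceleration_cm_s2):
    """Peak ground acceleration of one record component in Gal
    (1 Gal = 1 cm/s^2): the maximum absolute amplitude of the trace."""
    return float(np.max(np.abs(np.asarray(acceleration_cm_s2, dtype=float))))

# Count components exceeding a threshold, as in the 10 Gal tally above
# (the two records here are made-up illustrations, not NSMONS data):
records = [np.array([1.0, -12.5, 3.0]), np.array([2.0, 4.0, -6.0])]
strong = sum(pga_gal(r) > 10.0 for r in records)  # -> 1
```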

  5. Earthquake precursors: spatial-temporal gravity changes before the great earthquakes in the Sichuan-Yunnan area

    NASA Astrophysics Data System (ADS)

    Zhu, Yi-Qing; Liang, Wei-Feng; Zhang, Song

    2018-01-01

    Using multiple-scale mobile gravity data in the Sichuan-Yunnan area, we systematically analyzed the relationships between spatial-temporal gravity changes and the 2014 Ludian, Yunnan Province Ms6.5 earthquake and the 2014 Kangding Ms6.3, 2013 Lushan Ms7.0, and 2008 Wenchuan Ms8.0 earthquakes in Sichuan Province. Our main results are as follows. (1) Before the occurrence of large earthquakes, gravity anomalies occur in a large area around the epicenters. The directions of gravity change gradient belts usually agree roughly with the directions of the main fault zones of the study area. Such gravity changes might reflect the increase of crustal stress, as well as the significant active tectonic movements and surface deformations along fault zones, during the period of gestation of great earthquakes. (2) Continuous significant changes of the multiple-scale gravity fields, as well as greater gravity changes with larger time scales, can be regarded as medium-range precursors of large earthquakes. The subsequent large earthquakes always occur in the area where the gravity changes greatly. (3) The spatial-temporal gravity changes are very useful in determining the epicenter of coming large earthquakes. The large gravity networks are useful to determine the general areas of coming large earthquakes. However, the local gravity networks with high spatial-temporal resolution are suitable for determining the location of epicenters. Therefore, denser gravity observation networks are necessary for better forecasts of the epicenters of large earthquakes. (4) Using gravity changes from mobile observation data, we made medium-range forecasts of the Kangding, Ludian, Lushan, and Wenchuan earthquakes, with especially successful forecasts of the location of their epicenters. Based on the above discussions, we emphasize that medium-/long-term potential for large earthquakes might exist nowadays in some areas with significant gravity anomalies in the study region. Thus, the monitoring

  6. Organizational changes at Earthquakes & Volcanoes

    USGS Publications Warehouse

    Gordon, David W.

    1992-01-01

    Primary responsibility for the preparation of Earthquakes & Volcanoes within the Geological Survey has shifted from the Office of Scientific Publications to the Office of Earthquakes, Volcanoes, and Engineering (OEVE). As a consequence of this reorganization, Henry Spall has stepped down as Science Editor for Earthquakes & Volcanoes (E&V).

  7. Earthquake Damage Assessment Using Very High Resolution Satellite Imagery

    NASA Astrophysics Data System (ADS)

    Chiroiu, L.; André, G.; Bahoken, F.; Guillande, R.

    In recent years, various studies have applied satellite imagery to the assessment of natural hazard damage, most of them analyzing floods, hurricanes or landslides. For earthquakes, the medium or small spatial resolution of the data available in the recent past did not allow reliable identification of damage, because the elements at risk (e.g. buildings or other structures) are too small compared with the pixel size. Recent progress in remote sensing, in terms of spatial resolution and data processing, makes reliable detection of damage to the elements at risk possible. Remote sensing techniques applied to IKONOS (1 meter resolution) and IRS (5 meter resolution) imagery were used to evaluate seismic vulnerability and post-earthquake damage. A fast estimation of losses was performed using a multidisciplinary approach based on earthquake engineering and geospatial analysis. The results, integrated into a GIS database, could be transferred via satellite networks to the rescue teams deployed in the affected zone, in order to better coordinate the emergency operations. The methodology was applied to the cities of Bhuj and Anjar after the 2001 Gujarat (India) earthquake.

  8. A New Network-Based Approach for the Earthquake Early Warning

    NASA Astrophysics Data System (ADS)

    Alessandro, C.; Zollo, A.; Colombelli, S.; Elia, L.

    2017-12-01

    Here we propose a new method that allows issuing an early warning based upon the real-time mapping of the Potential Damage Zone (PDZ), i.e. the epicentral area where the peak ground velocity is expected to exceed damaging or strong shaking levels, with no assumption about the earthquake rupture extent and spatial variability of ground motion. The system includes techniques for a refined estimation of the main source parameters (earthquake location and magnitude) and for an accurate prediction of the expected ground shaking level. The system processes the 3-component, real-time ground acceleration and velocity data streams at each station. For stations providing high-quality data, the characteristic P-wave period (τc) and the P-wave displacement, velocity and acceleration amplitudes (Pd, Pv and Pa) are jointly measured on a progressively expanded P-wave time window. The evolutionary estimates of these parameters at stations around the source allow prediction not only of the geometry and extent of the PDZ, but also of the lower shaking intensity regions at larger epicentral distances. This is done by correlating the measured P-wave amplitude with the Peak Ground Velocity (PGV) and Instrumental Intensity (IMM) and by interpolating the measured and predicted P-wave amplitudes on a dense spatial grid, including the nodes of the accelerometer/velocimeter array deployed in the earthquake source area. Depending on the network density and spatial source coverage, this method naturally accounts for effects related to the earthquake rupture extent (e.g. source directivity) and the spatial variability of strong ground motion related to crustal wave propagation and site amplification. We have tested this system by a retrospective analysis of three earthquakes: the 2016 Italy Mw 6.5, the 2008 Iwate-Miyagi Mw 6.9 and the 2011 Tohoku Mw 9.0 events. Source parameter characterization is stable and reliable, and the intensity maps show extended-source effects consistent with kinematic fracture models of
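    The Pd-to-PGV correlation step described above can be sketched as an empirical scaling law of the generic form log10(PGV) = a * log10(Pd) + b. The coefficients and the PGV threshold below are placeholders for illustration, not the values used by the system in this abstract.

```python
import math

# Hedged sketch of mapping P-wave displacement amplitude (Pd) to predicted
# PGV, then flagging Potential Damage Zone (PDZ) grid nodes. Coefficients
# A, B and the threshold are illustrative assumptions.
A, B = 0.9, 1.0  # hypothetical regression coefficients

def predict_pgv(pd_cm):
    """Predict PGV (cm/s) from the P-wave displacement amplitude Pd (cm)."""
    return 10.0 ** (A * math.log10(pd_cm) + B)

def in_potential_damage_zone(pd_cm, pgv_threshold_cms=20.0):
    """Flag a grid node as inside the PDZ when the predicted PGV exceeds a
    damaging-shaking level (threshold value is illustrative)."""
    return predict_pgv(pd_cm) >= pgv_threshold_cms

print(predict_pgv(1.0))            # 10.0 cm/s for Pd = 1 cm with these coefficients
print(in_potential_damage_zone(5.0))
```

    In the real system such a prediction would be made at every node of a dense spatial grid and interpolated together with the direct station measurements.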

  9. An Earthquake Shake Map Routine with Low Cost Accelerometers: Preliminary Results

    NASA Astrophysics Data System (ADS)

    Alcik, H. A.; Tanircan, G.; Kaya, Y.

    2015-12-01

    Vast amounts of high-quality strong motion data are indispensable inputs for analyses in the field of geotechnical and earthquake engineering; however, the high cost of installing strong motion systems constitutes the biggest obstacle to worldwide dissemination. In recent years, MEMS-based (micro-electro-mechanical systems) accelerometers have been used in seismological research-oriented studies as well as earthquake engineering oriented projects, basically due to the precision obtained in downsized instruments. In this research our primary goal is to ensure the usage of these low-cost instruments in the creation of shake maps immediately after a strong earthquake. The second goal is to develop software that will automatically process the real-time data coming from the rapid response network and create the shake map. For those purposes, four MEMS sensors have been set up to deliver real-time data. Data transmission is done through 3G modems. A subroutine was coded in assembler language and embedded into the operating system of each instrument to create MiniSEED files with packages of 1 second instead of 512-byte packages. The Matlab-based software calculates the strong motion (SM) parameters every second, and they are compared with user-defined thresholds. A voting system embedded in the software captures the event if the total vote exceeds the threshold. The user interface of the software enables users to monitor the calculated SM parameters either in a table or in a graph (Figure 1). A small-scale and affordable rapid response network has been created using four MEMS sensors, and the functionality of the software has been tested and validated using shake table tests. The entire system was tested together with a reference sensor under real strong ground motion recordings as well as series of sine waves with varying amplitude and frequency.
The successful realization of this software allowed us to set up a test network at Tekirdağ Province, the closest coastal point to
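    The per-second threshold comparison and voting scheme described in this entry can be sketched as follows. The parameter names, per-parameter thresholds, vote weights and the network-level vote threshold are all illustrative assumptions, not the project's actual settings.

```python
# Toy version of the threshold-voting event capture: each station casts votes
# when a strong-motion (SM) parameter exceeds its threshold; an event is
# declared when the summed votes pass a network-level threshold.

THRESHOLDS = {"PGA": 10.0, "PGV": 2.0}   # trigger levels (Gal, cm/s), assumed
WEIGHTS = {"PGA": 1, "PGV": 2}           # hypothetical vote weights
NETWORK_VOTE_THRESHOLD = 4

def station_vote(sm_params):
    """Votes cast by one station for the current 1-second window."""
    return sum(WEIGHTS[p] for p, v in sm_params.items() if v >= THRESHOLDS[p])

def declare_event(all_station_params):
    total = sum(station_vote(p) for p in all_station_params)
    return total >= NETWORK_VOTE_THRESHOLD

readings = [{"PGA": 15.0, "PGV": 2.5},   # strongly shaken station: 3 votes
            {"PGA": 3.0, "PGV": 0.4},    # quiet station: 0 votes
            {"PGA": 12.0, "PGV": 1.1}]   # moderately shaken: 1 vote
print(declare_event(readings))  # True: 3 + 0 + 1 = 4 votes
```

    Requiring votes from several stations rather than a single exceedance makes the capture robust against spikes at any one low-cost sensor.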

  10. Preparing for a Career as a Network Engineer

    ERIC Educational Resources Information Center

    Morris, Gerard; Fustos, Janos; Haga, Wayne

    2012-01-01

    A network engineer is an Information Technology (IT) professional who designs, implements, maintains, and troubleshoots computer networks. While the United States is still experiencing relatively high unemployment, demand for network engineers remains strong. To determine what skills employers are looking for, data was collected and analyzed from…

  11. Assessing earthquake early warning using sparse networks in developing countries: Case study of the Kyrgyz Republic

    NASA Astrophysics Data System (ADS)

    Parolai, Stefano; Boxberger, Tobias; Pilz, Marco; Fleming, Kevin; Haas, Michael; Pittore, Massimiliano; Petrovic, Bojana; Moldobekov, Bolot; Zubovich, Alexander; Lauterjung, Joern

    2017-09-01

    The first real-time digital strong-motion network in Central Asia has been in operation in the Kyrgyz Republic since 2014. Although this network consists of only 19 strong-motion stations, they are located in near-optimal locations for earthquake early warning and rapid response purposes. In fact, it is expected that this network, which utilizes the GFZ-Sentry software allowing decentralized event assessment calculations, will not only provide strong motion data useful for improving future seismic hazard and risk assessment, but will also serve as the backbone for regional and on-site earthquake early warning operations. Based on the location of these stations, and travel-time estimates for P- and S-waves, we have determined potential lead times for several major urban areas in Kyrgyzstan (i.e., Bishkek, Osh, and Karakol) and Kazakhstan (Almaty), where we find the implementation of an efficient earthquake early warning system would provide lead times outside the blind zone ranging from several seconds up to several tens of seconds. This was confirmed by simulating the shaking (and intensity) that would arise from a series of scenarios based on historical and expected events, and how they would affect the major urban centres. Such lead times would allow the instigation of automatic mitigation procedures, while the system as a whole would support prompt and efficient actions to be undertaken over large areas.
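    The lead-time estimate implied above can be sketched with a simple travel-time budget: the warning time at a target city is roughly the S-wave arrival there minus the time needed to detect the event (P-wave arrival at the nearest station plus processing latency). The velocities and the processing delay below are illustrative assumptions, not the study's values.

```python
# Back-of-the-envelope earthquake early warning lead-time estimate.
# Assumed crustal velocities and alert latency; epicentral distances in km.
VP, VS = 6.0, 3.5          # P and S wave velocities, km/s (assumed)
PROCESSING_DELAY = 3.0     # detection + alert latency, s (assumed)

def lead_time(epi_to_station_km, epi_to_city_km):
    """Seconds of warning at the city; <= 0 means the city is in the blind zone."""
    t_detect = epi_to_station_km / VP + PROCESSING_DELAY
    t_s_at_city = epi_to_city_km / VS
    return t_s_at_city - t_detect

# A station 30 km from the epicenter, a city 150 km away:
print(round(lead_time(30.0, 150.0), 1))  # tens of seconds of warning
```

    With these assumed numbers the example yields roughly 35 s of warning, consistent with the "several seconds up to several tens of seconds" range quoted in the abstract.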

  12. Rapid earthquake characterization using MEMS accelerometers and volunteer hosts following the M 7.2 Darfield, New Zealand, Earthquake

    USGS Publications Warehouse

    Lawrence, J. F.; Cochran, E.S.; Chung, A.; Kaiser, A.; Christensen, C. M.; Allen, R.; Baker, J.W.; Fry, B.; Heaton, T.; Kilb, Debi; Kohler, M.D.; Taufer, M.

    2014-01-01

    We test the feasibility of rapidly detecting and characterizing earthquakes with the Quake‐Catcher Network (QCN) that connects low‐cost microelectromechanical systems accelerometers to a network of volunteer‐owned, Internet‐connected computers. Following the 3 September 2010 M 7.2 Darfield, New Zealand, earthquake we installed over 180 QCN sensors in the Christchurch region to record the aftershock sequence. The sensors are monitored continuously by the host computer and send trigger reports to the central server. The central server correlates incoming triggers to detect when an earthquake has occurred. The location and magnitude are then rapidly estimated from a minimal set of received ground‐motion parameters. Full seismic time series are typically not retrieved for tens of minutes or even hours after an event. We benchmark the QCN real‐time detection performance against the GNS Science GeoNet earthquake catalog. Under normal network operations, QCN detects and characterizes earthquakes within 9.1 s of the earthquake rupture and determines the magnitude within 1 magnitude unit of that reported in the GNS catalog for 90% of the detections.
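    The server-side association step described above (correlating incoming station triggers to detect an event) can be illustrated with a sliding time window: an event is declared when enough independent triggers arrive within a short interval. The window length and minimum trigger count are illustrative assumptions, not QCN's actual parameters.

```python
from collections import deque

# Toy trigger correlator: declare an event when >= MIN_TRIGGERS station
# triggers fall inside a WINDOW_S-second sliding window.
WINDOW_S = 5.0      # association window, s (assumed)
MIN_TRIGGERS = 4    # minimum coincident triggers (assumed)

def detect_events(trigger_times):
    """trigger_times: sorted arrival times (s) of station trigger reports."""
    window = deque()
    events = []
    for t in trigger_times:
        window.append(t)
        # Drop triggers that fell out of the sliding window.
        while window and t - window[0] > WINDOW_S:
            window.popleft()
        if len(window) >= MIN_TRIGGERS:
            events.append(window[0])   # declare event at window start
            window.clear()             # avoid re-declaring the same event
    return events

print(detect_events([0.0, 1.2, 2.0, 3.1, 50.0, 51.0]))  # one event detected
```

    A real implementation would additionally weight triggers by station location and ground-motion amplitude before estimating location and magnitude.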

  13. PBO Southwest Region: Baja Earthquake Response and Network Operations

    NASA Astrophysics Data System (ADS)

    Walls, C. P.; Basset, A.; Mann, D.; Lawrence, S.; Jarvis, C.; Feaux, K.; Jackson, M. E.

    2011-12-01

    The SW region of the Plate Boundary Observatory consists of 455 continuously operating GPS stations located principally along the transform system of the San Andreas fault and the Eastern California Shear Zone. In the past year network uptime exceeded an average of 97% with greater than 99% data acquisition. Communications range from CDMA modem (307), radio (92), VSAT (30), and DSL/T1/other (25) to manual downloads (1). Sixty-three stations stream 1 Hz data over the VRS3Net, typically with <0.5 second latency. Over 620 maintenance activities were performed during 316 onsite visits out of approximately 368 engineer field days. Within the past year there have been 7 incidents of minor (attempted theft) to moderate (solar panel stolen) vandalism, with one total loss of receiver and communications gear. Security was enhanced at these sites through fencing and more secure station configurations. In the past 12 months, 4 new stations were installed to replace removed stations or to augment the network at strategic locations. Following the M7.2 El Mayor-Cucapah earthquake, CGPS station P796, a deep-drilled braced monument, was constructed in San Luis, AZ along the border within 5 weeks of the event. In addition, UNAVCO participated in a successful University of Arizona-led RAPID proposal for the installation of six continuous GPS stations for post-seismic observations. Six stations are installed and telemetered through a UNAM relay at the Sierra San Pedro Martir. Four of these stations have Vaisala WXT520 meteorological sensors. An additional site in the Sierra Cucapah (PTAX) that was built by CICESE, an Associate UNAVCO Member institution in Mexico, and Caltech has been integrated into PBO dataflow. The stations will be maintained as part of the PBO network in coordination with CICESE. UNAVCO is working with NOAA to upgrade PBO stations with WXT520 meteorological sensors and communications systems capable of streaming real-time GPS and met data. The real-time GPS and

  14. S-net project: Construction of large scale seafloor observatory network for tsunamis and earthquakes in Japan

    NASA Astrophysics Data System (ADS)

    Mochizuki, M.; Kanazawa, T.; Uehira, K.; Shimbo, T.; Shiomi, K.; Kunugi, T.; Aoi, S.; Matsumoto, T.; Sekiguchi, S.; Yamamoto, N.; Takahashi, N.; Shinohara, M.; Yamada, T.

    2016-12-01

    The National Research Institute for Earth Science and Disaster Resilience (NIED) has launched a project to construct an observatory network for tsunamis and earthquakes on the seafloor. The observatory network was named "S-net, Seafloor Observation Network for Earthquakes and Tsunamis along the Japan Trench". The S-net consists of 150 seafloor observatories which are connected in line with submarine optical cables. The total length of submarine optical cable is about 5,700 km. The S-net system extends along the Kuril and Japan trenches around the Japanese islands from north to south, covering the area between southeast off the island of Hokkaido and off the Boso Peninsula, Chiba Prefecture. The project has been financially supported by MEXT Japan. An observatory package is 34 cm in diameter and 226 cm long. Each observatory is equipped with two high-sensitivity water-depth sensors serving as tsunami meters and four sets of three-component seismometers. The water-depth sensor has a measurement resolution at the sub-centimeter level. The combination of multiple seismometers secures the wide dynamic range and robustness of observation that are needed for earthquake early warning. The S-net is composed of six segment networks, each consisting of about 25 observatories and 800-1,600 km of submarine optical cable. Five of the six segment networks, all except the one covering the outer rise area of the Japan Trench, have already been installed. The data from the observatories on those five segment networks are being transferred to the data center at NIED on a real-time basis, and verification of data integrity is being carried out at the present moment. Installation of the last segment network of the S-net, that is, the outer rise one, is scheduled to be finished within FY2016. Full-scale operation of the S-net will start in FY2017. We will report on the construction and operation of the S-net submarine cable system, as well as an outline of the obtained data, in this presentation.

  15. S-net : Construction of large scale seafloor observatory network for tsunamis and earthquakes along the Japan Trench

    NASA Astrophysics Data System (ADS)

    Mochizuki, M.; Uehira, K.; Kanazawa, T.; Shiomi, K.; Kunugi, T.; Aoi, S.; Matsumoto, T.; Sekiguchi, S.; Yamamoto, N.; Takahashi, N.; Nakamura, T.; Shinohara, M.; Yamada, T.

    2017-12-01

    NIED has launched the project of constructing a seafloor observatory network for tsunamis and earthquakes after the occurrence of the 2011 Tohoku Earthquake, to enhance the reliability of early warnings of tsunamis and earthquakes. The observatory network was named "S-net". The S-net project has been financially supported by MEXT. The S-net consists of 150 seafloor observatories which are connected in line with submarine optical cables. The total length of submarine optical cable is about 5,500 km. The S-net covers the focal region of the 2011 Tohoku Earthquake and its vicinity. Each observatory is equipped with two high-sensitivity pressure gauges serving as tsunami meters and four sets of three-component seismometers. The S-net is composed of six segment networks. Five of the six segment networks had already been installed. Installation of the last segment network, covering the outer rise area, was finally finished by the end of FY2016. The outer rise segment has special features like no other of the five other segments of the S-net: deep water and long distance. Most of the 25 observatories on the outer rise segment are located at depths greater than 6,000 m WD. Three observatories in particular are set on the seafloor deeper than about 7,000 m WD, so pressure gauges capable of being used even at 8,000 m WD are installed on those three observatories. The total length of the submarine cables of the outer rise segment is about two times longer than those of the other segments. The longer the cable system, the higher the supply voltage needed; thus the observatories on the outer rise segment have high withstand-voltage characteristics.
    For the outer rise segment cable we employ a low-loss dispersion-managed line, formed by combining a plurality of optical fibers, in order to achieve long-distance, high-speed and large-capacity data transmission. Installation of the outer rise segment was finished, and full-scale operation of the S-net has started.

  16. The results of the pilot project in Georgia to install a network of electromagnetic radiation before the earthquake

    NASA Astrophysics Data System (ADS)

    Machavariani, Kakhaber; Khazaradze, Giorgi; Turazashvili, Ioseb; Kachakhidze, Nino; Kachakhidze, Manana; Gogoberidze, Vitali

    2016-04-01

    The world's scientific literature has recently published many important and interesting works on the VLF/LF electromagnetic emissions observed during earthquake preparation. These works are promising with respect to reliable earthquake prediction. Because Georgia is located in the Trans-Asian earthquake zone, a VLF/LF electromagnetic emission network is essential, and in this regard it has been possible to take the first steps. Our university holds Shota Rustaveli National Science Foundation grant № DI/21/9-140/13, which included the installation of a receiver in Georgia, but the purchase of the device failed due to lack of funds. However, European colleagues (Prof. Dr. P. F. Biagi and Prof. Dr. Aydın Büyüksaraç) helped us and made the installation of a receiver possible. An expedition of Turkish scientists to Georgia was organized in August 2015. They brought with them a VLF/LF electromagnetic emission receiver and, together with Georgian scientists, installed it near Tbilisi. The station was named GEO-TUR. It should be noted that Georgia has thereby become involved in the work of the European network. It is now possible to monitor earthquakes in Georgia in terms of electromagnetic radiation. This enables scientists to obtain relevant information not only for the territory of our country, but also for seismically active European countries. In order to maintain and develop this new direction in our country, it is necessary to keep an independent group of scientists who will study electromagnetic radiation ahead of earthquakes in Georgia. At this stage, to remedy this shortcoming, it is necessary and appropriate for specialists in Georgia to engage in joint international research. The work is carried out in the frame of the grant (DI/21/9-140/13, „Pilot project of before earthquake detected Very Low Frequency/Low Frequency electromagnetic emission network installation in Georgia") with the financial support of the Shota Rustaveli National Science Foundation.

  17. Neural Network-Based Sensor Validation for Turboshaft Engines

    NASA Technical Reports Server (NTRS)

    Moller, James C.; Litt, Jonathan S.; Guo, Ten-Huei

    1998-01-01

    Sensor failure detection, isolation, and accommodation using a neural network approach is described. An auto-associative neural network is configured to perform dimensionality reduction on the sensor measurement vector and provide estimated sensor values. The sensor validation scheme is applied in a simulation of the T700 turboshaft engine in closed loop operation. Performance is evaluated based on the ability to detect faults correctly and maintain stable and responsive engine operation. The set of sensor outputs used for engine control forms the network input vector. Analytical redundancy is verified by training networks of successively smaller bottleneck layer sizes. Training data generation and strategy are discussed. The engine maintained stable behavior in the presence of sensor hard failures. With proper selection of fault determination thresholds, stability was maintained in the presence of sensor soft failures.
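    The detection, isolation and accommodation logic described above reduces to a residual check once the auto-associative network has produced its estimates: a sensor whose measurement deviates from its network estimate by more than a fault threshold is isolated and replaced by the estimate. The sketch below assumes illustrative thresholds and values, and stands in for the network with precomputed estimates.

```python
# Sketch of sensor validation via residuals between measured values and the
# auto-associative network's reconstructed estimates. Thresholds and data
# are illustrative; the network itself is not reproduced here.
FAULT_THRESHOLDS = [0.05, 0.05, 0.10]  # per-sensor, normalized units (assumed)

def validate(measured, estimated):
    """Return (failure flags, accommodated sensor vector)."""
    failed, accommodated = [], []
    for m, e, thr in zip(measured, estimated, FAULT_THRESHOLDS):
        bad = abs(m - e) > thr
        failed.append(bad)
        accommodated.append(e if bad else m)  # substitute estimate on failure
    return failed, accommodated

measured = [0.50, 0.90, 0.30]   # sensor 2 has drifted (soft failure)
estimated = [0.51, 0.70, 0.32]  # auto-associative network outputs
print(validate(measured, estimated))
```

    Feeding the accommodated vector, rather than the raw measurements, to the control law is what keeps the engine loop stable after a sensor failure.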

  18. Systems engineering technology for networks

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The report summarizes research pursued within the Systems Engineering Design Laboratory at Virginia Polytechnic Institute and State University between May 16, 1993 and January 31, 1994. The project was proposed in cooperation with the Computational Science and Engineering Research Center at Howard University. Its purpose was to investigate emerging systems engineering tools and their applicability in analyzing the NASA Network Control Center (NCC) on the basis of metrics and measures.

  19. Turkish Compulsory Earthquake Insurance (TCIP)

    NASA Astrophysics Data System (ADS)

    Erdik, M.; Durukal, E.; Sesetyan, K.

    2009-04-01

    Through a World Bank project, the government-sponsored Turkish Catastrophe Insurance Pool (TCIP) was created in 2000 with the essential aim of transferring the government's financial burden of replacing earthquake-damaged housing to international reinsurance and capital markets. Providing coverage to about 2.9 million homeowners, TCIP is the largest insurance program in the country, with about 0.5 billion USD in its own reserves and about 2.3 billion USD in total claims-paying capacity. The total payment for earthquake damage since 2000 (mostly small; 226 earthquakes) amounts to about 13 million USD. The country-wide penetration rate is about 22%, highest in the Marmara region (30%) and lowest in south-east Turkey (9%). TCIP is the sole-source provider of earthquake loss coverage up to 90,000 USD per house. The annual premium, categorized on the basis of earthquake zone and type of structure, is about US$90 for a 100 square meter reinforced concrete building in the most hazardous zone, with a 2% deductible. The earthquake engineering related shortcomings of the TCIP are exemplified by the fact that the average rate of 0.13% (for reinforced concrete buildings) with only a 2% deductible is rather low compared to countries with similar earthquake exposure. From an earthquake engineering point of view, the risk underwriting of the TCIP (the typification of housing units to be insured, the earthquake intensity zonation and the sum insured) needs to be overhauled. Especially for large cities, models can be developed in which the expected earthquake performance of a unit (and consequently its insurance premium) can be assessed on the basis of its location (microzoned earthquake hazard) and basic structural attributes (earthquake vulnerability relationships). With such an approach, the TCIP can in the future contribute to the control of construction through the differentiation of premia on the basis of earthquake vulnerability.
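    The premium and deductible arithmetic quoted above can be sketched directly. The sum-insured figure below is an illustrative assumption chosen so that the 0.13% tariff rate reproduces a premium near the quoted US$90; the deductible is applied to claims, not to the premium.

```python
# Back-of-the-envelope TCIP-style premium and claim arithmetic.
# rate = 0.13% (quoted average for reinforced concrete); sum insured assumed.

def annual_premium(sum_insured_usd, rate=0.0013):
    """Annual premium as sum insured times the tariff rate."""
    return sum_insured_usd * rate

def claim_payment(loss_usd, sum_insured_usd, deductible=0.02):
    """Claim payment: loss capped at the sum insured, minus a 2% deductible
    (deductible expressed as a fraction of the sum insured, an assumption)."""
    deductible_amount = deductible * sum_insured_usd
    return max(0.0, min(loss_usd, sum_insured_usd) - deductible_amount)

print(annual_premium(70_000))          # ~US$91, near the quoted ~US$90
print(claim_payment(20_000, 70_000))   # loss after the 2% deductible
```

    A microzonation-based scheme, as the abstract argues for, would replace the flat per-zone rate with one conditioned on site hazard and building vulnerability.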

  20. Earthquake triggering at alaskan volcanoes following the 3 November 2002 denali fault earthquake

    USGS Publications Warehouse

    Moran, S.C.; Power, J.A.; Stihler, S.D.; Sanchez, J.J.; Caplan-Auerbach, J.

    2004-01-01

    The 3 November 2002 Mw 7.9 Denali fault earthquake provided an excellent opportunity to investigate triggered earthquakes at Alaskan volcanoes. The Alaska Volcano Observatory operates short-period seismic networks on 24 historically active volcanoes in Alaska, 247-2159 km distant from the mainshock epicenter. We searched for evidence of triggered seismicity by examining the unfiltered waveforms for all stations in each volcano network for ∼1 hr after the Mw 7.9 arrival time at each network and for significant increases in located earthquakes in the hours after the mainshock. We found compelling evidence for triggering only at the Katmai volcanic cluster (KVC, 720-755 km southwest of the epicenter), where small earthquakes with distinct P and S arrivals appeared within the mainshock coda at one station and a small increase in located earthquakes occurred for several hours after the mainshock. Peak dynamic stresses of ∼0.1 MPa at Augustine Volcano (560 km southwest of the epicenter) are significantly lower than those recorded in Yellowstone and Utah (>3000 km southeast of the epicenter), suggesting that strong directivity effects were at least partly responsible for the lack of triggering at Alaskan volcanoes. We describe other incidents of earthquake-induced triggering in the KVC, and outline a qualitative magnitude/distance-dependent triggering threshold. We argue that triggering results from the perturbation of magmatic-hydrothermal systems in the KVC and suggest that the comparative lack of triggering at other Alaskan volcanoes could be a result of differences in the nature of magmatic-hydrothermal systems.

  1. GLASS 2.0: An Operational, Multimodal, Bayesian Earthquake Data Association Engine

    NASA Astrophysics Data System (ADS)

    Benz, H.; Johnson, C. E.; Patton, J. M.; McMahon, N. D.; Earle, P. S.

    2015-12-01

    The legacy approach to automated detection and determination of hypocenters is arrival-time stacking. Examples of such algorithms are the associator Binder, which has been in continuous use in many USGS-supported regional seismic networks since the 1980s, and its spherical-earth successor, GLASS 1.0, in service at the USGS National Earthquake Information Center for over 10 years. The principal shortcomings of the legacy approach are that 1) it can only use phase arrival times, 2) it does not adequately address the problem of extreme variations in station density worldwide, 3) it cannot incorporate multiple phase models or statistical attributes of phases with distance, and 4) it cannot incorporate noise-model attributes of individual stations. Previously we introduced a theoretical framework for a new associator using a Bayesian kernel stacking approach to approximate a joint probability density function for hypocenter localization. More recently we added station- and phase-specific Bayesian constraints to the association process. GLASS 2.0 incorporates a multiplicity of earthquake-related data including phase arrival times, back-azimuth and slowness information from array beamforming, arrival times from waveform cross-correlation processing, and geographic constraints from real-time social media reports of ground shaking. We demonstrate its application by modeling an aftershock sequence using dozens of stations that recorded tens of thousands of earthquakes over a period of one month. We also demonstrate GLASS 2.0 performance regionally and teleseismically using the globally distributed real-time monitoring system at the NEIC.
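    The kernel stacking idea at the heart of the associator can be illustrated in one dimension: each arrival contributes a Gaussian kernel over trial source positions, and the stacked density peaks where the arrivals are mutually consistent. The 1-D geometry, constant velocity and kernel width below are simplifying assumptions for illustration, not GLASS's actual model.

```python
import math

# Toy 1-D kernel stacking for hypocenter association. Each station's arrival
# time contributes a Gaussian kernel on the arrival-time residual at every
# trial epicenter; the stack approximates a localization density.
V = 6.0      # km/s, assumed constant P velocity
SIGMA = 1.0  # s, kernel width on arrival-time residuals (assumed)

def stacked_density(x_km, origin_time, stations_km, arrivals_s):
    """Kernel-stacked density at trial epicenter x_km."""
    total = 0.0
    for x_sta, t_arr in zip(stations_km, arrivals_s):
        t_pred = origin_time + abs(x_km - x_sta) / V
        total += math.exp(-0.5 * ((t_arr - t_pred) / SIGMA) ** 2)
    return total

# Synthetic arrivals generated for a source at 55 km, origin time 0 s.
stations = [10.0, 40.0, 90.0]
arrivals = [abs(55.0 - s) / V for s in stations]
grid = [i * 0.5 for i in range(201)]  # trial epicenters, 0-100 km
best = max(grid, key=lambda x: stacked_density(x, 0.0, stations, arrivals))
print(best)  # density peaks at the true epicenter, 55.0 km
```

    Observations of other kinds (back-azimuth, slowness, felt reports) enter the real associator the same way, each as an additional kernel constraining the joint density.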

  2. The Extraction of Post-Earthquake Building Damage Information Based on Convolutional Neural Network

    NASA Astrophysics Data System (ADS)

    Chen, M.; Wang, X.; Dou, A.; Wu, X.

    2018-04-01

    The seismic damage information of buildings extracted from remote sensing (RS) imagery is meaningful for supporting relief efforts and effectively reducing the losses caused by an earthquake. Both traditional pixel-based and object-oriented methods have shortcomings in extracting object information. The pixel-based method cannot make full use of the contextual information of objects. The object-oriented method faces the problems that the segmentation of the image is not ideal and that the choice of feature space is difficult. In this paper, a new strategy is proposed which combines a Convolutional Neural Network (CNN) with image segmentation to extract building damage information from remote sensing imagery. The key idea of this method consists of two steps: first, the CNN is used to predict the damage probability of each pixel; then the probabilities are integrated within each segmentation spot. The method is tested by extracting collapsed and uncollapsed buildings from aerial imagery acquired over Longtoushan Town after the Ms 6.5 Ludian County, Yunnan Province earthquake. The results show that the proposed method is effective in extracting building damage information after an earthquake.
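    The second step described above (integrating per-pixel probabilities within each segmentation spot) can be sketched directly. The per-pixel values below stand in for CNN output, and the 0.5 decision cutoff is an illustrative assumption, not the paper's calibrated threshold.

```python
# Sketch of combining per-pixel CNN collapse probabilities with an image
# segmentation: average the probabilities inside each segmentation spot,
# then label the spot collapsed if the mean exceeds 0.5.

def label_segments(prob, segments):
    """prob, segments: same-shape 2-D lists; segments holds integer spot ids.
    Returns {spot id: True if labeled collapsed}."""
    sums, counts = {}, {}
    for prob_row, seg_row in zip(prob, segments):
        for p, s in zip(prob_row, seg_row):
            sums[s] = sums.get(s, 0.0) + p
            counts[s] = counts.get(s, 0) + 1
    return {s: sums[s] / counts[s] > 0.5 for s in sums}

prob = [[0.9, 0.8, 0.1],   # stand-in for CNN per-pixel output
        [0.7, 0.2, 0.2]]
segs = [[0, 0, 1],         # two segmentation spots, ids 0 and 1
        [0, 1, 1]]
print(label_segments(prob, segs))  # {0: True, 1: False}
```

    Averaging within spots suppresses isolated pixel-level misclassifications, which is exactly the weakness of the pure pixel-based method that the paper targets.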

  3. Large-Scale Earthquake Countermeasures Act and the Earthquake Prediction Council in Japan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rikitake, T.

    1979-08-07

    The Large-Scale Earthquake Countermeasures Act was enacted in Japan in December 1978. This act aims at mitigating earthquake hazards by designating an area as an area under intensified measures against earthquake disaster, such designation being based on long-term earthquake prediction information, and by issuing an earthquake warning statement based on imminent prediction information, when possible. In an emergency case as defined by the law, the prime minister will be empowered to take various actions which cannot be taken at ordinary times. For instance, he may ask the Self-Defense Force to come into the earthquake-threatened area before the earthquake occurrence. A Prediction Council has been formed in order to evaluate premonitory effects that might be observed over the Tokai area, which was designated an area under intensified measures against earthquake disaster in June 1979. An extremely dense observation network has been constructed over the area.

  4. Social Media as Seismic Networks for the Earthquake Damage Assessment

    NASA Astrophysics Data System (ADS)

    Meletti, C.; Cresci, S.; La Polla, M. N.; Marchetti, A.; Tesconi, M.

    2014-12-01

    The growing popularity of online platforms based on user-generated content is gradually creating a digital world that mirrors the physical world. In the paradigm of crowdsensing, the crowd becomes a distributed network of sensors that allows us to understand real-life events at a quasi-real-time rate. The SoS-Social Sensing project [http://socialsensing.it/] exploits opportunistic crowdsensing, involving users in the sensing process in a minimal way, for social media emergency management purposes, in order to obtain a very fast, but still reliable, detection of the dimension of the emergency to face. First of all we designed and implemented a decision support system for the detection and damage assessment of earthquakes. Our system exploits the messages shared in real time on Twitter. In the detection phase, data mining and natural language processing techniques are first adopted to select meaningful and comprehensive sets of tweets. Then we applied a burst detection algorithm in order to promptly identify outbreaking seismic events. Using georeferenced tweets and reported locality names, a rough epicentral determination is also possible. The results, compared to official Italian INGV reports, show that the system is able to detect, within seconds, events of a magnitude in the region of 3.5 with a precision of 75% and a recall of 81.82%. We then focused our attention on the damage assessment phase. We investigated the possibility of exploiting social media data to estimate earthquake intensity. We designed a set of predictive linear models and evaluated their ability to map the intensity of worldwide earthquakes. The models build on a dataset of almost 5 million tweets exploited to compute our earthquake features, and more than 7,000 globally distributed earthquakes, acquired in a semi-automatic way from USGS, serving as ground truth. We extracted 45 distinct features falling into four categories: profile, tweet, time and linguistic. We run diagnostic tests and
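    The burst detection step described above can be illustrated with a simple baseline comparison: the tweet count in the current window is flagged as a burst when it exceeds the recent mean by several standard deviations. The window size and the multiplier k are illustrative assumptions, not the project's actual parameters.

```python
from statistics import mean, stdev

# Toy burst detector over per-window tweet counts: flag a burst when the
# latest count exceeds mean + k * std of the preceding baseline windows.

def is_burst(counts, k=3.0, baseline_len=10):
    """counts: per-window tweet counts, most recent last."""
    baseline, current = counts[-baseline_len - 1:-1], counts[-1]
    return current > mean(baseline) + k * stdev(baseline)

quiet = [4, 5, 6, 5, 4, 5, 6, 5, 4, 5]   # normal background chatter
print(is_burst(quiet + [40]))  # True: sudden jump in tweet rate
print(is_burst(quiet + [6]))   # False: within normal variation
```

    In the real pipeline the counts would come from keyword-filtered, language-processed tweet streams, and a detected burst would then be georeferenced for rough epicentral determination.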

  5. Strong Motion Network of Medellín and Aburrá Valley: technical advances, seismicity records and micro-earthquake monitoring

    NASA Astrophysics Data System (ADS)

    Posada, G.; Trujillo, J. C., Sr.; Hoyos, C.; Monsalve, G.

    2017-12-01

    The tectonic setting of Colombia is determined by the interaction of the Nazca, Caribbean, and South American plates, together with the collision of the Panama-Choco block, making it a seismically active region. Regional seismic monitoring is carried out by the National Seismological Network of Colombia and the National Accelerometer Network of Colombia. Both networks compute locations, magnitudes, depths, accelerations, and other seismic parameters. The Medellín - Aburrá Valley is located in the northern segment of the Central Cordillera of Colombia and, according to the Colombian technical seismic code (NSR-10), is a region of intermediate hazard because of the proximity of seismic sources to the Valley. Seismic monitoring in the Aburrá Valley began in 1996 with an accelerometer network of 38 instruments. Currently, the network consists of 26 stations and is run by the Early Warning System of Medellín and the Aburrá Valley (SIATA). Technical advances have enabled real-time communication for about a year, currently at 10 stations; post-earthquake data are processed in near-real time, yielding quick results for location, acceleration, response spectra, and Fourier analysis; this information is displayed on the SIATA web site. The strong-motion database comprises 280 earthquakes and forms the basis for the estimation of seismic hazard and risk for the region. A basic statistical analysis of the main information was carried out, including the total recorded events per station, natural frequency, maximum accelerations, depths, and magnitudes, which allowed us to identify the main seismic sources and some seismic site parameters. Aiming at more complete seismic monitoring, and in order to identify seismic sources beneath the Valley, we are in the process of installing 10 low-cost shake seismometers for micro-earthquake monitoring. There is no historical record of earthquakes with a magnitude

  6. Making Initial Earthquake Catalogs from a Temporary Seismic Network for Monitoring Aftershocks

    NASA Astrophysics Data System (ADS)

    Park, J.; Kang, T. S.; Kim, K. H.; Rhie, J.; Kim, Y.

    2017-12-01

    The ML 5.1 foreshock and the ML 5.8 mainshock occurred consecutively in Gyeongju, in the southeastern Korean Peninsula, on September 12, 2016. A temporary seismic network was quickly installed in the vicinity of the epicenter to observe the aftershocks following the mainshock. The network initially consisted of 27 stations equipped with broadband sensors and was operated off-line, requiring periodic manual backup of the recorded data. We detected P-wave triggers and associated events using SeisComP3 to rapidly build an initial catalog of aftershocks. Where necessary, manual picking was performed with scolv, a module included in SeisComP3, to obtain precise P- and S-arrival times. To cross-check the identification of seismic phases, the seismic Python package PhasePApy was applied in parallel with SeisComP3. We then obtained precise relocated epicenters and depths of the aftershocks using the velellipse algorithm. The resulting dataset comprises an initial aftershock catalog. The catalog will provide the means to address important questions on seismogenesis in this intraplate seismicity region, including the 2016 Gyeongju earthquake sequence, and to improve seismic hazard estimation for the region.
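    The automatic trigger stage underlying pickers like the one in SeisComP3 is commonly a short-term-average over long-term-average (STA/LTA) ratio on signal energy. A minimal trailing-window sketch (window lengths and the trigger threshold are illustrative, not the network's operational settings):

```python
import numpy as np

def sta_lta(x, nsta, nlta):
    """Trailing-window STA/LTA ratio computed on squared amplitudes."""
    e = np.asarray(x, dtype=float) ** 2
    c = np.concatenate(([0.0], np.cumsum(e)))
    ratio = np.zeros(len(e))
    for i in range(nlta - 1, len(e)):
        sta = (c[i + 1] - c[i + 1 - nsta]) / nsta   # short window energy
        lta = (c[i + 1] - c[i + 1 - nlta]) / nlta   # long window energy
        ratio[i] = sta / max(lta, 1e-12)
    return ratio

# synthetic trace: low-level noise with an arrival at sample 600
rng = np.random.default_rng(0)
trace = 0.1 * rng.standard_normal(1000)
trace[600:700] += 3.0
triggers = np.where(sta_lta(trace, nsta=20, nlta=200) > 5.0)[0]
```

In practice the trigger indices would be passed to an associator that groups picks across stations into candidate events.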

  7. Tweeting Earthquakes using TensorFlow

    NASA Astrophysics Data System (ADS)

    Casarotti, E.; Comunello, F.; Magnoni, F.

    2016-12-01

    The use of social media is emerging as a powerful tool for disseminating trusted information about earthquakes. Since 2009, the Twitter account @INGVterremoti has provided constant and timely details about M2+ seismic events detected by the Italian National Seismic Network, directly connected with the seismologists on duty at the Istituto Nazionale di Geofisica e Vulcanologia (INGV). It currently informs more than 150,000 followers. Nevertheless, because it publishes only manually revised seismic parameters, its latency (approximately 10 to 20 minutes after an event) has come under evaluation. Undeniably, mobile internet, social network sites, and Twitter in particular demand a more rapid, "real-time" reaction. Over the last 36 months, INGV has tested tweeting the automatic detections of M3+ earthquakes, studying the reliability of the information both in terms of seismological accuracy and from the point of view of communication and social research. A set of quality parameters (i.e., number of seismic stations, azimuthal gap, relative error of the location) has been identified to reduce false alarms and the uncertainty of the automatic detection. We present an experiment to further improve the reliability of this process using TensorFlow™ (an open-source software library originally developed by researchers and engineers on the Google Brain Team within Google's Machine Intelligence research organization).
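    The quality gating described above can be sketched as a simple rule-based filter on the detection metadata. The threshold values below are invented placeholders for illustration, not INGV's operational criteria:

```python
def ok_to_tweet(n_stations, azimuthal_gap_deg, rel_location_err):
    """Decide whether an automatic detection is reliable enough to
    publish. All three thresholds are hypothetical examples."""
    return (n_stations >= 10
            and azimuthal_gap_deg <= 180.0
            and rel_location_err <= 0.5)
```

A learned classifier (as in the TensorFlow experiment) would replace these hand-set cutoffs with a decision boundary fitted to a history of confirmed and false automatic detections.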

  8. Chapter D. The Loma Prieta, California, Earthquake of October 17, 1989 - Earth Structures and Engineering Characterization of Ground Motion

    USGS Publications Warehouse

    Holzer, Thomas L.

    1998-01-01

    This chapter contains two papers that summarize the performance of engineered earth structures (dams and stabilized excavations in soil) and two papers that characterize, for engineering purposes, the attenuation of ground motion with distance during the Loma Prieta earthquake. Documenting the field performance of engineered structures and confirming empirically based predictions of ground motion are critical for safe and cost-effective seismic design of future structures, as well as for the retrofitting of existing ones.

  9. Connectivity of earthquake-triggered landslides with the fluvial network: Implications for landslide sediment transport after the 2008 Wenchuan earthquake

    NASA Astrophysics Data System (ADS)

    Li, Gen; West, A. Joshua; Densmore, Alexander L.; Hammond, Douglas E.; Jin, Zhangdong; Zhang, Fei; Wang, Jin; Hilton, Robert G.

    2016-04-01

    Evaluating the influence of earthquakes on erosion, landscape evolution, and sediment-related hazards requires understanding fluvial transport of material liberated in earthquake-triggered landslides. The location of landslides relative to river channels is expected to play an important role in postearthquake sediment dynamics. In this study, we assess the position of landslides triggered by the Mw 7.9 Wenchuan earthquake, aiming to understand the relationship between landslides and the fluvial network of the steep Longmen Shan mountain range. Combining a landslide inventory map and geomorphic analysis, we quantify landslide-channel connectivity in terms of the number of landslides, landslide area, and landslide volume estimated from scaling relationships. We observe a strong spatial variability in landslide-channel connectivity, with volumetric connectivity (ξ) ranging from ~20% to ~90% for different catchments. This variability is linked to topographic effects that set local channel densities, seismic effects (including seismogenic faulting) that regulate landslide size, and substrate effects that may influence both channelization and landslide size. Altogether, we estimate that the volume of landslides connected to channels comprises 43 + 9/-7% of the total coseismic landslide volume. Following the Wenchuan earthquake, fine-grained (<~0.25 mm) suspended sediment yield across the Longmen Shan catchments is positively correlated to catchment-wide landslide density, but this correlation is statistically indistinguishable whether or not connectivity is considered. The weaker-than-expected influence of connectivity on suspended sediment yield may be related to mobilization of fine-grained landslide material that resides in hillslope domains, i.e., not directly connected to river channels. In contrast, transport of the coarser fraction (which makes up >90% of the total landslide volume) may be more significantly affected by landslide locations.
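    The volumetric connectivity ξ described above can be sketched as the channel-connected fraction of total landslide volume, with per-landslide volumes estimated from mapped areas via a power-law scaling V = αA^γ. The scaling coefficients below are generic placeholders, not the paper's calibrated values:

```python
def volume_from_area(area_m2, alpha=0.05, gamma=1.4):
    # landslide volume-area scaling V = alpha * A**gamma (placeholder coefficients)
    return alpha * area_m2 ** gamma

def volumetric_connectivity(landslides):
    """landslides: iterable of (area_m2, connected_to_channel: bool).
    Returns xi, the channel-connected fraction of total landslide volume."""
    total = conn = 0.0
    for area, connected in landslides:
        v = volume_from_area(area)
        total += v
        if connected:
            conn += v
    return conn / total

# equal areas, one connected -> xi = 0.5
xi = volumetric_connectivity([(1000.0, True), (1000.0, False)])
```

Because γ > 1, a few large connected landslides can dominate ξ even when most landslides by count are disconnected, which is why number-, area-, and volume-based connectivity can differ for the same catchment.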

  10. A refined Frequency Domain Decomposition tool for structural modal monitoring in earthquake engineering

    NASA Astrophysics Data System (ADS)

    Pioldi, Fabio; Rizzi, Egidio

    2017-07-01

    Output-only structural identification is developed via a refined Frequency Domain Decomposition (rFDD) approach for assessing the current modal properties of heavily damped buildings (a challenging identification scenario) under strong ground motions. Structural responses to earthquake excitation are taken as input signals for the identification algorithm. A new dedicated computational procedure, based on coupled Chebyshev Type II bandpass filters, is outlined for the effective estimation of natural frequencies, mode shapes, and modal damping ratios. The identification technique is also coupled with a Gabor Wavelet Transform, resulting in an effective and self-contained time-frequency analysis framework. Simulated response signals generated by shear-type frames (with variable structural features) are used for validation. In this context, use is made of a complete set of seismic records taken from the FEMA P695 database, i.e., all 44 "Far-Field" (22 NS, 22 WE) earthquake signals. The modal estimates are statistically compared to their target values, demonstrating the accuracy of the developed algorithm in providing prompt estimates of all modal parameters under strong ground motion. At this stage, the analysis tool may be conveniently applied in the realm of Earthquake Engineering, towards potential Structural Health Monitoring and damage detection purposes.
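    The core of any FDD variant is a singular value decomposition of the output cross-spectral matrix at each frequency line, with natural frequencies picked at peaks of the first singular value. A bare-bones single-segment sketch of that core (the rFDD adds the bandpass filtering, averaging, and damping estimation described above):

```python
import numpy as np

def fdd_first_singular_value(responses, fs):
    """responses: (n_channels, n_samples) array of measured outputs.
    Returns (freqs, s1): the first singular value of the cross-spectral
    matrix at each frequency (single-segment estimate, no averaging)."""
    n = responses.shape[1]
    X = np.fft.rfft(responses, axis=1)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    s1 = np.empty(freqs.size)
    for k in range(freqs.size):
        G = np.outer(X[:, k], np.conj(X[:, k]))   # rank-one CSD estimate
        s1[k] = np.linalg.svd(G, compute_uv=False)[0]
    return freqs, s1

# two-channel response dominated by a single 2 Hz mode, plus noise
fs = 100.0
t = np.arange(0, 20, 1 / fs)
rng = np.random.default_rng(1)
mode = np.sin(2 * np.pi * 2.0 * t)
resp = np.vstack([mode, 0.6 * mode]) + 0.05 * rng.standard_normal((2, t.size))
f, s1 = fdd_first_singular_value(resp, fs)
peak_hz = f[1:][np.argmax(s1[1:])]   # peak-pick, skipping the DC bin
```

At the identified peak, the first left singular vector of G approximates the corresponding mode shape, which is how FDD recovers mode shapes as well as frequencies.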

  11. Seismicity in 2010 and major earthquakes recorded and located in Costa Rica from 1983 until 2012, by the local OVSICORI-UNA seismic network

    NASA Astrophysics Data System (ADS)

    Ronnie, Q.; Segura, J.; Burgoa, B.; Jimenez, W.; McNally, K. C.

    2013-05-01

    This work results from the analysis of existing information in the earthquake database of the Observatorio Sismológico y Vulcanológico de Costa Rica, Universidad Nacional (OVSICORI-UNA), and presents basic seismological information recorded and processed in 2010. That year saw a transition between the software packages used to record, store, and locate earthquakes. During the first three months of 2010 we used Earthworm (http://folkworm.ceri.memphis.edu/ew-doc), SEISAN (Havskov and Ottemöller, 1999), and Hypocenter (Lienert and Havskov, 1995) to capture, store, and locate the earthquakes, respectively; in April 2010, ANTELOPE (http://www.brtt.com/software.html) came into use for recording and storage, with GENLOC (Fan et al., 2006) and LOCSAT (Bratt and Bache, 1988) for locating earthquakes. GENLOC was used for local events and LOCSAT for regional and distant earthquakes. Local earthquakes were located using the 1D velocity model of Quintero and Kissling (2001); for regional and distant earthquakes, IASPEI91 (Kennett and Engdahl, 1991) was used. All the 2010 events shown in this work were rechecked by the authors. We located 3903 earthquakes in and around Costa Rica and recorded 746 regional and distant seismic events (see Figure 1). We also give a summary of major earthquakes recorded and located by the OVSICORI-UNA network between 1983 and 2012.

  12. Seismogeodetic monitoring techniques for tsunami and earthquake early warning and rapid assessment of structural damage

    NASA Astrophysics Data System (ADS)

    Haase, J. S.; Bock, Y.; Saunders, J. K.; Goldberg, D.; Restrepo, J. I.

    2016-12-01

    As part of an effort to promote the use of NASA-sponsored Earth science information for disaster risk reduction, real-time high-rate seismogeodetic data are being incorporated into early warning and structural monitoring systems. Seismogeodesy combines seismic acceleration and GPS displacement measurements using a tightly coupled Kalman filter to provide absolute estimates of seismic acceleration, velocity, and displacement. Traditionally, the monitoring of earthquakes and tsunamis has been based on seismic networks for estimating earthquake magnitude and slip, and on tide gauges and deep-ocean buoys for direct measurement of tsunami waves. Real-time seismogeodetic observations at subduction zones allow more robust and rapid magnitude and slip estimation, increasing warning time in the near-source region. A NASA-funded effort to utilize GPS and seismogeodesy in NOAA's Tsunami Warning Centers in Alaska and Hawaii integrates new modules for picking, locating, and estimating magnitudes and moment tensors for earthquakes into the USGS Earthworm environment at the TWCs. In a related project, NASA supports the transition of this research to seismogeodetic tools for disaster preparedness, specifically by implementing GPS and low-cost MEMS accelerometers for structural monitoring in partnership with earthquake engineers. Real-time high-rate seismogeodetic structural monitoring has been implemented on two structures. The first is a parking garage at the Autonomous University of Baja California Faculty of Medicine in Mexicali, not far from the rupture of the 2010 Mw 7.2 El Mayor-Cucapah earthquake, enabled through a UC MEXUS collaboration. The second is the 8-story Geisel Library at the University of California, San Diego (UCSD). The system has also been installed for several proof-of-concept experiments at the UCSD Network for Earthquake Engineering Simulation (NEES) Large High Performance Outdoor Shake Table. 
We present MEMS-based seismogeodetic observations from the 10 June

  13. Characteristics of strong motions and damage implications of the MS 6.5 Ludian earthquake on August 3, 2014

    NASA Astrophysics Data System (ADS)

    Xu, Peibin; Wen, Ruizhi; Wang, Hongwei; Ji, Kun; Ren, Yefei

    2015-02-01

    The Ludian County of Yunnan Province in southwestern China was struck by an MS 6.5 earthquake on August 3, 2014, another destructive event following the MS 8.0 Wenchuan earthquake in 2008, the MS 7.1 Yushu earthquake in 2010, and the MS 7.0 Lushan earthquake in 2013. The National Strong-Motion Observation Network System of China collected 74 strong-motion recordings; the maximum peak ground acceleration, recorded in the E-W component at station 053LLT in Longtoushan Town, was 949 cm/s². The observed PGAs and spectral ordinates were compared with a ground-motion prediction equation for China and with NGA-West2, developed by the Pacific Earthquake Engineering Research Center. This earthquake is considered the first test case for the applicability of NGA-West2 in China. Results indicate that the observed PGAs and the 5%-damped pseudo-response spectral accelerations are significantly lower than the predicted ones. A field survey around several typical strong-motion stations verified that the earthquake damage was consistent with the official isoseismal map issued by the China Earthquake Administration.

  14. Object-Oriented Analysis of Satellite Images Using Artificial Neural Networks for Post-Earthquake Buildings Change Detection

    NASA Astrophysics Data System (ADS)

    Khodaverdi zahraee, N.; Rastiveis, H.

    2017-09-01

    Earthquakes are among the most devastating natural events that have threatened human life throughout history. After an earthquake, information about the damaged area and the amount and type of damage can be a great help to disaster managers in relief and reconstruction. It is very important that these measures be taken immediately after the earthquake, because any delay can increase the losses. The purpose of this paper is to propose and implement an automatic approach for mapping destroyed buildings after an earthquake using pre- and post-event high-resolution satellite images. In the proposed method, after preprocessing, both images are segmented using the multi-resolution segmentation technique. The segmentation results are then intersected in ArcGIS to obtain equal image objects on both images. After that, appropriate textural features, which better differentiate changed from unchanged areas, are calculated for all image objects. Finally, subtracting the extracted textural features of the pre- and post-event images, the resulting values are used as the input feature vector of an artificial neural network that classifies the area into changed and unchanged classes. The proposed method was evaluated using WorldView-2 satellite images acquired before and after the 2010 Haiti earthquake. The reported overall accuracy of 93% proved the ability of the proposed method for post-earthquake building change detection.
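    The final classification step can be illustrated on synthetic texture-difference features, where changed objects show large pre/post differences and unchanged ones cluster near zero. For a self-contained sketch, a simple logistic classifier trained by gradient descent stands in for the paper's neural network; the feature distributions are invented for illustration:

```python
import numpy as np

# synthetic per-object texture-difference features (2 features per object)
rng = np.random.default_rng(0)
unchanged = rng.normal(0.0, 1.0, size=(100, 2))   # small pre/post difference
changed = rng.normal(4.0, 1.0, size=(100, 2))     # large pre/post difference
X = np.vstack([unchanged, changed])
y = np.concatenate([np.zeros(100), np.ones(100)])

# logistic classifier (linear stand-in for the ANN), batch gradient descent
w, b = np.zeros(2), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # sigmoid activation
    w -= 0.5 * (X.T @ (p - y)) / len(y)
    b -= 0.5 * np.mean(p - y)

pred = (1.0 / (1.0 + np.exp(-(X @ w + b))) > 0.5).astype(float)
accuracy = np.mean(pred == y)
```

With 45 features and less cleanly separated classes, as in the paper, a hidden-layer network earns its keep; the training loop above is otherwise structurally the same.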

  15. Near-field observations of an offshore Mw 6.0 earthquake from an integrated seafloor and subseafloor monitoring network at the Nankai Trough, southwest Japan

    NASA Astrophysics Data System (ADS)

    Wallace, L. M.; Araki, E.; Saffer, D.; Wang, X.; Roesner, A.; Kopf, A.; Nakanishi, A.; Power, W.; Kobayashi, R.; Kinoshita, C.; Toczko, S.; Kimura, T.; Machida, Y.; Carr, S.

    2016-11-01

    An Mw 6.0 earthquake struck 50 km offshore the Kii Peninsula of southwest Honshu, Japan on 1 April 2016. This earthquake occurred directly beneath a cabled offshore monitoring network at the Nankai Trough subduction zone and within 25-35 km of two borehole observatories installed as part of the International Ocean Discovery Program's NanTroSEIZE project. The earthquake's location close to the seafloor and subseafloor network offers a unique opportunity to evaluate dense seafloor geodetic and seismological data in the near field of a moderate-sized offshore earthquake. We use the offshore seismic network to locate the main shock and aftershocks, seafloor pressure sensors, and borehole observatory data to determine the detailed distribution of seafloor and subseafloor deformation, and seafloor pressure observations to model the resulting tsunami. Contractional strain estimated from formation pore pressure records in the borehole observatories (equivalent to 0.37 to 0.15 μstrain) provides a key to narrowing the possible range of fault plane solutions. Together, these data show that the rupture occurred on a landward dipping thrust fault at 9-10 km below the seafloor, most likely on the plate interface. Pore pressure changes recorded in one of the observatories also provide evidence for significant afterslip for at least a few days following the main shock. The earthquake and its aftershocks are located within the coseismic slip region of the 1944 Tonankai earthquake (Mw 8.0), and immediately downdip of swarms of very low frequency earthquakes in this region, illustrating the complex distribution of megathrust slip behavior at a dominantly locked seismogenic zone.

  16. Living on an Active Earth: Perspectives on Earthquake Science

    NASA Astrophysics Data System (ADS)

    Lay, Thorne

    2004-02-01

    The annualized long-term loss due to earthquakes in the United States is now estimated at $4.4 billion per year. A repeat of the 1923 Kanto earthquake, near Tokyo, could cause direct losses of $2-3 trillion. With such grim numbers, which are guaranteed to make you take its work seriously, the NRC Committee on the Science of Earthquakes begins its overview of the emerging multidisciplinary field of earthquake science. An up-to-date and forward-looking survey of scientific investigation of earthquake phenomena and engineering response to associated hazards is presented at a suitable level for a general educated audience. Perspectives from the fields of seismology, geodesy, neo-tectonics, paleo-seismology, rock mechanics, earthquake engineering, and computer modeling of complex dynamic systems are integrated into a balanced definition of earthquake science that has never before been adequately articulated.

  17. Accessing northern California earthquake data via Internet

    NASA Astrophysics Data System (ADS)

    Romanowicz, Barbara; Neuhauser, Douglas; Bogaert, Barbara; Oppenheimer, David

    The Northern California Earthquake Data Center (NCEDC) provides easy access to central and northern California digital earthquake data. It is located at the University of California, Berkeley, is operated jointly with the U.S. Geological Survey (USGS) in Menlo Park, Calif., and is funded by the University of California and the National Earthquake Hazard Reduction Program. It has been accessible to users in the scientific community through the Internet since mid-1992. The data center provides an on-line archive for parametric and waveform data from two regional networks: the Northern California Seismic Network (NCSN), operated by the USGS, and the Berkeley Digital Seismic Network (BDSN), operated by the Seismographic Station at the University of California, Berkeley.

  18. The Road to Total Earthquake Safety

    NASA Astrophysics Data System (ADS)

    Frohlich, Cliff

    Cinna Lomnitz is possibly the most distinguished earthquake seismologist in all of Central and South America. Among many other credentials, Lomnitz has personally experienced the shaking and devastation that accompanied no fewer than five major earthquakes: Chile, 1939; Kern County, California, 1952; Chile, 1960; Caracas, Venezuela, 1967; and Mexico City, 1985. Thus he clearly has much to teach someone like myself, who has never even actually felt a real earthquake. What is this slim book? The Road to Total Earthquake Safety summarizes Lomnitz's May 1999 presentation at the Seventh Mallet-Milne Lecture, sponsored by the Society for Earthquake and Civil Engineering Dynamics. His arguments are motivated by the damage that occurred in three earthquakes: Mexico City, 1985; Loma Prieta, California, 1989; and Kobe, Japan, 1995. All three quakes occurred in regions where earthquakes are common. Yet in all three, some of the worst damage occurred in structures located a significant distance from the epicenter and engineered specifically to resist earthquakes. Some of the damage also indicated that the structures failed because they had experienced considerable rotational or twisting motion. Clearly, Lomnitz argues, there must be fundamental flaws in the usually accepted models explaining how earthquakes generate strong motions, and in how we should design resistant structures.

  19. Engine With Regression and Neural Network Approximators Designed

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Hopkins, Dale A.

    2001-01-01

    At the NASA Glenn Research Center, the NASA engine performance program (NEPP, ref. 1) and the design optimization testbed COMETBOARDS (ref. 2) with regression and neural network analysis-approximators have been coupled to obtain a preliminary engine design methodology. The solution to a high-bypass-ratio subsonic waverotor-topped turbofan engine, which is shown in the preceding figure, was obtained by the simulation depicted in the following figure. This engine is made of 16 components mounted on two shafts with 21 flow stations. The engine is designed for a flight envelope with 47 operating points. The design optimization utilized both neural network and regression approximations, along with the cascade strategy (ref. 3). The cascade used three algorithms in sequence: the method of feasible directions, the sequence of unconstrained minimizations technique, and sequential quadratic programming. The normalized optimum thrusts obtained by the three methods are shown in the following figure: the cascade algorithm with regression approximation is represented by a triangle, a circle is shown for the neural network solution, and a solid line indicates original NEPP results. The solutions obtained from both approximate methods lie within one standard deviation of the benchmark solution for each operating point. The simulation improved the maximum thrust by 5 percent. The performance of the linear regression and neural network methods as alternate engine analyzers was found to be satisfactory for the analysis and operation optimization of air-breathing propulsion engines (ref. 4).

  20. Earthquake Related Variation of Total Electron Content in Ionosphere over Chinese Mainland Derived from Observations of a Nationwide GNSS Network

    NASA Astrophysics Data System (ADS)

    Gan, Weijun

    2016-07-01

    The Crustal Movement Observation Network of China (CMONOC) is a key national scientific infrastructure project carried out during 1997-2012 in two phases. The network is composed of 260 continuously operating GNSS stations (CORS) and 2081 campaign-mode GNSS stations, with the main purpose of monitoring crustal movement, precipitable water vapor (PWV), total electron content (TEC), and many other tectonic and environmental elements around mainland China, mainly using Global Navigation Satellite System (GNSS) technology. Here, based on about 5 years of GNSS data from the 260 CORS of CMONOC, we investigated the characteristics of TEC in the ionosphere over the Chinese Mainland and examined whether there was any abnormal change of TEC before and after big earthquakes. Our preliminary results show that it is hard to find any convincing TEC precursor before a big earthquake. However, the huge energy released by a big earthquake can clearly disturb the TEC over the meizoseismal area.
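    Slant TEC is commonly derived from the differential delay between the two GNSS carrier frequencies. A minimal sketch of the standard dual-frequency pseudorange combination, ignoring interfrequency code biases and measurement noise (which real processing must calibrate out):

```python
F1 = 1575.42e6  # GPS L1 carrier frequency, Hz
F2 = 1227.60e6  # GPS L2 carrier frequency, Hz

def slant_tec_tecu(p1_m, p2_m):
    """Slant TEC in TEC units (1 TECU = 1e16 el/m^2) from dual-frequency
    pseudoranges p1, p2 in metres; differential code biases are ignored."""
    k = (F1**2 * F2**2) / (40.3 * (F1**2 - F2**2))  # electrons/m^2 per metre
    return k * (p2_m - p1_m) / 1e16

# a 3 m L2-L1 range difference corresponds to roughly 28-29 TECU
tec = slant_tec_tecu(20000000.0, 20000003.0)
```

Carrier-phase observations give a much less noisy (but ambiguous) TEC estimate, so operational processing levels the phase-derived TEC to the pseudorange combination above.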

  1. Using Smartphones to Detect Earthquakes

    NASA Astrophysics Data System (ADS)

    Kong, Q.; Allen, R. M.

    2012-12-01

    We are using the accelerometers in smartphones to record earthquakes. In the future, these smartphones may serve as a supplementary network to the current traditional networks for scientific research and real-time applications. Given the potential number of smartphones and the small separation between sensors, this new type of seismic dataset has significant potential, provided that the signal can be separated from the noise. We developed an application for Android phones to record acceleration in real time. These records can be saved on the phone or transmitted back to a server in real time. The accelerometers in the phones were evaluated by comparing their performance with a high-quality accelerometer on controlled shake tables in a variety of tests. The results show that the smartphone accelerometer can reproduce the characteristics of the shaking very well, even when the phone lies freely on the shake table. The nature of these datasets also differs from that of traditional networks, because smartphones move around with their owners; we must therefore distinguish earthquake signals from those of daily use. In addition to the shake table tests that provided earthquake records, we also recorded different human activities such as running, walking, and driving. An artificial neural network based approach was developed to distinguish these different records; it achieves a 99.7% success rate in distinguishing earthquakes from the other typical human activities in our database. We are now ready to develop the basic infrastructure for a smartphone seismic network.
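    Discriminating shaking from daily activities depends on features that separate low-frequency ground motion from sharper human motions. A minimal sketch of one such feature, the dominant frequency of an acceleration record (a full classifier, like the neural network in the abstract, would combine many features of this kind):

```python
import numpy as np

def dominant_frequency(accel, fs):
    """Dominant frequency (Hz) of an acceleration record via the FFT
    amplitude spectrum, with the mean removed to suppress the DC bin."""
    spec = np.abs(np.fft.rfft(accel - np.mean(accel)))
    freqs = np.fft.rfftfreq(len(accel), d=1.0 / fs)
    return freqs[np.argmax(spec)]

fs = 50.0                                   # typical phone sampling rate
t = np.arange(0, 10, 1 / fs)
quake_like = np.sin(2 * np.pi * 1.5 * t)    # low-frequency shaking proxy
step_like = np.sin(2 * np.pi * 9.0 * t)     # footfall-rate energy proxy
```

Real records are broadband rather than pure tones, so in practice such spectral features are combined with amplitude and duration statistics before classification.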

  2. Earthquake Analysis (EA) Software for The Earthquake Observatories

    NASA Astrophysics Data System (ADS)

    Yanik, K.; Tezel, T.

    2009-04-01

    There are many software packages that can be used to observe seismic signals and locate earthquakes, but some of them are commercial and require technical support. For this reason, many seismological observatories develop and use their own seismological software packages suited to their networks. In this study we introduce our software, which can read seismic signals, process them, and locate earthquakes. This software is used by the General Directorate of Disaster Affairs, Earthquake Research Department, Seismology Division (hereafter ERD) and will be improved as new requirements arise. The ERD network consists of 87 seismic stations: 63 equipped with 24-bit digital Guralp CMG-3T seismometers, 16 with analogue short-period S-13 Geometrics seismometers, and 8 with 24-bit digital short-period S-13j-DR-24 Geometrics seismometers. Data are transmitted by satellite from the broadband stations, whereas leased lines are used for the short-period stations. The daily data archive capacity is 4 GB. In big networks it is very important to observe the seismic signals and locate the earthquakes as soon as possible, which is feasible when the software is developed with the network's properties in mind. When we started to develop software for a big network such as ours, we identified several requirements: all known seismic data formats should be read without any conversion; only selected stations should be monitored, directly on the map; seismic files should be added with an import command; relations should be established between P and S phase readings and location solutions; and data should be stored in a database, with users entering the program with a user name and password. In this way we prevent data disorder and repeated phase readings. Storing data in a database brings many advantages: easy access to data from anywhere using ethernet, publication of bulletins and catalogues on a website, easy sending of short messages (SMS) and e

  3. Post-Earthquake Assessment of Nevada Bridges Using ShakeMap/ShakeCast

    DOT National Transportation Integrated Search

    2016-01-01

    Post-earthquake capacity of Nevada highway bridges is examined through a combination of engineering study and scenario earthquake evaluation. The study was undertaken by the University of Nevada Reno Department of Civil and Environmental Engineering ...

  4. Seismogeodesy for rapid earthquake and tsunami characterization

    NASA Astrophysics Data System (ADS)

    Bock, Y.

    2016-12-01

    Rapid estimation of earthquake magnitude and fault mechanism is critical for earthquake and tsunami warning systems. Traditionally, the monitoring of earthquakes and tsunamis has been based on seismic networks for estimating earthquake magnitude and slip, and on tide gauges and deep-ocean buoys for direct measurement of tsunami waves. These methods are well developed for ocean-basin-wide warnings but are not timely enough to protect vulnerable populations and infrastructure from the effects of local tsunamis, whose waves may arrive within 15-30 minutes of earthquake onset. Direct measurement of displacements by GPS networks at subduction zones allows rapid magnitude and slip estimation in the near-source region that is not affected by the instrumental limitations and magnitude saturation experienced by local seismic networks. However, GPS displacements by themselves are too noisy for strict earthquake early warning (P-wave detection). Optimally combining high-rate GPS and seismic data (in particular, accelerometers, which do not clip), referred to as seismogeodesy, provides a broadband instrument that does not clip in the near field, is impervious to magnitude saturation, and provides accurate static and dynamic displacements and velocities in real time. Here we describe a NASA-funded effort to integrate GPS and seismogeodetic observations as part of NOAA's Tsunami Warning Centers in Alaska and Hawaii. It consists of a series of plug-in modules that allow a hierarchy of rapid seismogeodetic products, including automatic P-wave picking, hypocenter estimation, S-wave prediction, magnitude scaling relationships based on P-wave amplitude (Pd) and peak ground displacement (PGD), and finite-source CMT solutions and fault slip models as input for tsunami warnings and models. For the NOAA/NASA project, the modules are being integrated into an existing USGS Earthworm environment, currently limited to traditional seismic data. We are focused on a network of
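    The seismogeodetic combination can be illustrated with a one-dimensional Kalman filter that integrates accelerometer samples in the prediction step and corrects the integration drift with slower, drift-free GPS displacement measurements. This is only a structural sketch of the tightly coupled filter; the noise levels, rates, and synthetic motion below are illustrative assumptions:

```python
import numpy as np

def seismogeodetic_kf(acc, gps, dt, gps_every, q=1e-4, r=2.5e-3):
    """1-D Kalman filter sketch: state [displacement, velocity]; each
    accelerometer sample drives the prediction, and a GPS displacement
    measurement corrects every `gps_every` samples. q (accelerometer
    noise variance) and r (GPS noise variance) are illustrative."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity transition
    B = np.array([0.5 * dt**2, dt])         # acceleration input mapping
    H = np.array([1.0, 0.0])                # GPS observes displacement only
    x, P = np.zeros(2), np.eye(2)
    est = np.empty(len(acc))
    for i, a in enumerate(acc):
        x = F @ x + B * a                   # predict with accel as input
        P = F @ P @ F.T + q * np.outer(B, B)
        if (i + 1) % gps_every == 0:        # GPS displacement update
            s = H @ P @ H + r
            K = P @ H / s
            x = x + K * (gps[i] - H @ x)
            P = P - np.outer(K, H @ P)
        est[i] = x[0]
    return est

# synthetic motion: a(t) = sin(t), so v = 1 - cos(t), d = t - sin(t)
dt, n = 0.01, 2000
rng = np.random.default_rng(2)
t0 = dt * np.arange(n)                 # sample times driving each prediction
t1 = dt * np.arange(1, n + 1)          # state epochs after each prediction
true_disp = t1 - np.sin(t1)
acc_meas = np.sin(t0) + 0.01 * rng.standard_normal(n)    # 100 Hz accel
gps_meas = true_disp + 0.05 * rng.standard_normal(n)     # used once per second
est = seismogeodetic_kf(acc_meas, gps_meas, dt, gps_every=100)
rms = float(np.sqrt(np.mean((est - true_disp) ** 2)))
```

The filtered displacement tracks the truth far more tightly than the 5 cm GPS noise alone, which is the point of the combination: broadband accelerometer detail between epochs, GPS anchoring against drift.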

  5. The 1906 earthquake and a century of progress in understanding earthquakes and their hazards

    USGS Publications Warehouse

    Zoback, M.L.

    2006-01-01

    The 18 April 1906 San Francisco earthquake killed nearly 3000 people and left 225,000 residents homeless. Three days after the earthquake, an eight-member State Earthquake Investigation Commission, working with some 25 geologists, seismologists, geodesists, biologists and engineers, as well as some 300 others, started work under the supervision of Andrew Lawson to collect and document physical phenomena related to the quake. On 31 May 1906, the commission published a preliminary 17-page report titled "The Report of the State Earthquake Investigation Commission". The report included the bulk of the geological and morphological descriptions of the faulting and detailed reports on shaking intensity, as well as an impressive atlas of 40 oversized maps and folios. Nearly 100 years after its publication, the Commission Report remains a model for post-earthquake investigations. Because the diverse data sets were so complete and carefully documented, researchers continue to apply modern analysis techniques to learn from the 1906 earthquake. While the earthquake marked a seminal event in the history of California, it also served as the impetus for the birth of modern earthquake science in the United States.

  6. A reverse engineering algorithm for neural networks, applied to the subthalamopallidal network of basal ganglia.

    PubMed

    Floares, Alexandru George

    2008-01-01

    Modeling neural networks with ordinary differential equations systems is a sensible approach, but also very difficult. This paper describes a new algorithm based on linear genetic programming which can be used to reverse engineer neural networks. The RODES algorithm automatically discovers the structure of the network, including neural connections, their signs and strengths, estimates its parameters, and can even be used to identify the biophysical mechanisms involved. The algorithm is tested on simulated time series data, generated using a realistic model of the subthalamopallidal network of basal ganglia. The resulting ODE system is highly accurate, and results are obtained in a matter of minutes. This is because the problem of reverse engineering a system of coupled differential equations is reduced to one of reverse engineering individual algebraic equations. The algorithm allows the incorporation of common domain knowledge to restrict the solution space. To our knowledge, this is the first time a realistic reverse engineering algorithm based on linear genetic programming has been applied to neural networks.

  7. Real Time Earthquake Information System in Japan

    NASA Astrophysics Data System (ADS)

    Doi, K.; Kato, T.

    2003-12-01

    An early earthquake notification system in Japan was developed by the Japan Meteorological Agency (JMA), the governmental organization responsible for issuing earthquake information and tsunami forecasts. The system was primarily developed for prompt provision of a tsunami forecast to the public, locating an earthquake and estimating its magnitude as quickly as possible. Later, a system for prompt provision of seismic intensity information, as an index of the degree of disaster caused by strong ground motion, was also developed so that the governmental organizations concerned can decide whether to launch an emergency response. At present, JMA issues the following kinds of information successively when a large earthquake occurs. 1) Prompt report of the occurrence of a large earthquake and the major seismic intensities it caused, in about two minutes after the earthquake. 2) Tsunami forecast in around three minutes. 3) Information on expected arrival times and maximum heights of tsunami waves in around five minutes. 4) Information on the hypocenter and magnitude of the earthquake, the seismic intensity at each observation station, and the times of high tides in addition to the expected tsunami arrival times, in 5-7 minutes. To issue the information above, JMA has established: an advanced nationwide seismic network with about 180 stations for seismic wave observation and about 3,400 stations for instrumental seismic intensity observation, including about 2,800 seismic intensity stations maintained by local governments; data telemetry networks via landlines and partly via a satellite communication link; real-time data processing techniques, for example the automatic calculation of earthquake location and magnitude and the database-driven method for quantitative tsunami estimation; and dissemination networks via computer-to-computer communications and facsimile through dedicated telephone lines. JMA operationally

  8. Investigation of Ionospheric Anomalies related to moderate Romanian earthquakes occurred during last decade using VLF/LF INFREP and GNSS Global Networks

    NASA Astrophysics Data System (ADS)

    Moldovan, Iren-Adelina; Oikonomou, Christina; Haralambous, Haris; Nastase, Eduard; Emilian Toader, Victorin; Biagi, Pier Francesco; Colella, Roberto; Toma-Danila, Dragos

    2017-04-01

    Ionospheric TEC (Total Electron Content) variations and Low Frequency (LF) signal amplitude data prior to five moderate earthquakes (Mw≥5) that occurred in Romania, in the Vrancea crustal and subcrustal seismic zones, during the last decade were analyzed using observations from the Global Navigation Satellite System (GNSS) and the European INFREP (International Network for Frontier Research on Earthquake Precursors) networks respectively, aiming to detect potential ionospheric anomalies related to these events and describe their characteristics. For this, spectral analysis was applied to the TEC data and the terminator time method to the VLF/LF data. It was found that TEC perturbations appeared from a few days (1-7) up to a few hours before the events, lasting around 2-3 hours, with periods of 20 and 3-5 minutes, which could be associated with the impending earthquakes. In addition, for three of the events the sunrise terminator times were delayed by approximately 20-40 min a few days prior to and during the earthquake day. Acknowledgments: This work was partially supported by the Partnership in Priority Areas Program - PNII, under MEN-UEFISCDI, DARING Project no. 69/2014 and the Nucleu Program - PN 16-35, Project no. 03 01
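    The spectral-analysis step can be illustrated with a minimal FFT-based search for the dominant oscillation period in a TEC series; the mean-removal detrending here is a deliberate simplification, not the authors' actual processing chain:

```python
import numpy as np

def dominant_period_minutes(tec, dt_seconds):
    """Return the dominant oscillation period (in minutes) of a TEC
    series via the FFT amplitude spectrum.  Mean removal stands in
    for proper detrending; real analyses would also window the data."""
    x = np.asarray(tec, dtype=float)
    x = x - x.mean()                      # crude detrend
    spec = np.abs(np.fft.rfft(x))
    freqs = np.fft.rfftfreq(len(x), d=dt_seconds)
    k = np.argmax(spec[1:]) + 1           # skip the zero-frequency bin
    return 1.0 / freqs[k] / 60.0

# Example: a synthetic 20-minute wave sampled every 30 s for 2 hours
t = np.arange(0, 7200, 30)
print(dominant_period_minutes(np.sin(2 * np.pi * t / 1200.0), 30))
```

A 20-minute period, as reported for some of the pre-earthquake perturbations, is recovered from the synthetic series.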

  9. Earthquake early warning for Romania - most recent improvements

    NASA Astrophysics Data System (ADS)

    Marmureanu, Alexandru; Elia, Luca; Martino, Claudio; Colombelli, Simona; Zollo, Aldo; Cioflan, Carmen; Toader, Victorin; Marmureanu, Gheorghe; Marius Craiu, George; Ionescu, Constantin

    2014-05-01

    The EWS for Vrancea earthquakes uses the time interval (28-32 s) between the moment when an earthquake is detected by the local seismic network installed in the epicentral area (Vrancea) and the arrival time of the seismic waves in the protected area (Bucharest) to send an earthquake warning to users. In recent years, the National Institute for Earth Physics (NIEP) upgraded its seismic network to better cover the seismic zones of Romania. Currently NIEP operates a real-time seismic network designed to monitor the seismic activity on Romanian territory, dominated by the Vrancea intermediate-depth (60-200 km) earthquakes. The NIEP real-time network consists of 102 stations and two seismic arrays equipped with different high-quality digitizers (Kinemetrics K2, Quanterra Q330, Quanterra Q330HR, PS6-26, Basalt), broadband and short-period seismometers (CMG3ESP, CMG40T, KS2000, KS54000, CMG3T, STS2, SH-1, S13, Ranger, GS21, Mark L22) and acceleration sensors (Episensor). Recent improvements of the seismic network and real-time communication technologies allow implementation of a nation-wide EEWS for Vrancea and other seismic sources in Romania. We present a regional approach to Earthquake Early Warning for Romanian earthquakes. The regional approach is based on the PRESTo (Probabilistic and Evolutionary early warning SysTem) software platform: PRESTo processes three-channel acceleration data streams in real time; once the P-wave arrivals have been detected, it provides earthquake location and magnitude estimates, and peak ground motion predictions at target sites. PRESTo has been running in real time at the National Institute for Earth Physics, Bucharest, for several months in parallel with a secondary EEWS. The alert notification is issued only when both systems validate each other. Here we present the results obtained using offline earthquakes originating from the Vrancea area together with several real
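    The 28-32 s warning window comes from the travel-time difference between early detection near the source and the S-wave arrival at the target city. A back-of-the-envelope sketch; the wave velocities and processing delay are illustrative assumptions, not NIEP's operational values:

```python
def warning_lead_time(target_dist_km, station_dist_km,
                      vp=8.0, vs=4.5, processing_s=3.0):
    """Approximate early-warning lead time at a target site:
    S-wave arrival at the target minus (P-wave arrival at the
    closest station plus a fixed processing/telemetry delay).
    vp, vs and processing_s are hypothetical round numbers."""
    s_arrival = target_dist_km / vs
    detection = station_dist_km / vp + processing_s
    return max(0.0, s_arrival - detection)

# Example: target ~150 km from the source, nearest station ~20 km away
print(warning_lead_time(150.0, 20.0))   # on the order of tens of seconds
```

With these placeholder numbers the lead time lands near the 28-32 s window quoted for Vrancea-to-Bucharest warnings, which is the point of the exercise rather than a calibration.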

  10. The NUONCE engine for LEO networks

    NASA Technical Reports Server (NTRS)

    Lo, Martin W.; Estabrook, Polly

    1995-01-01

    Typical LEO networks use constellations which provide uniform coverage. However, the demand for telecom service is dynamic and unevenly distributed around the world. We examine a more efficient and cost-effective design by matching the satellite coverage with the cyclical demand for service around the world. Our approach is to use a non-uniform satellite distribution for the network. We have named this constellation design NUONCE, for Non-Uniform Optimal Network Communications Engine.

  11. QuakeUp: An advanced tool for a network-based Earthquake Early Warning system

    NASA Astrophysics Data System (ADS)

    Zollo, Aldo; Colombelli, Simona; Caruso, Alessandro; Elia, Luca; Brondi, Piero; Emolo, Antonio; Festa, Gaetano; Martino, Claudio; Picozzi, Matteo

    2017-04-01

    The currently developed and operational regional Earthquake Early Warning systems are grounded on the assumption of a point-like earthquake source model and 1-D ground motion prediction equations to estimate the earthquake impact. Here we propose a new network-based method which allows for issuing an alert based upon the real-time mapping of the Potential Damage Zone (PDZ), i.e. the epicentral area where the peak ground velocity is expected to exceed the damaging or strong shaking levels, with no assumption about the earthquake rupture extent and spatial variability of ground motion. The platform includes the most advanced techniques for a refined estimation of the main source parameters (earthquake location and magnitude) and for an accurate prediction of the expected ground shaking level. The new software platform (QuakeUp) is under development at the Seismological Laboratory (RISSC-Lab) of the Department of Physics at the University of Naples Federico II, in collaboration with the academic spin-off company RISS s.r.l., recently gemmated from the research group. The system processes the 3-component, real-time ground acceleration and velocity data streams at each station. The signal quality is preliminarily assessed by checking the signal-to-noise ratio in acceleration, velocity and displacement and through dedicated filtering algorithms. For stations providing high-quality data, the characteristic P-wave period (τ_c) and the P-wave displacement, velocity and acceleration amplitudes (P_d, P_v and P_a) are jointly measured on a progressively expanded P-wave time window. The evolutionary measurements of the early P-wave amplitude and characteristic period at stations around the source allow prediction of the geometry and extent of the PDZ, but also of the lower shaking intensity regions at larger epicentral distances. This is done by correlating the measured P-wave amplitude with the Peak Ground Velocity (PGV) and Instrumental Intensity (I_MM) and by mapping the measured and
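    The characteristic period τ_c mentioned above is conventionally defined as 2π·sqrt(∫u²dt / ∫v²dt) over the P-wave window, where u is displacement and v velocity. A minimal sketch, deriving velocity by finite differences; this illustrates the definition, not the QuakeUp implementation:

```python
import numpy as np

def tau_c(displacement, dt):
    """Characteristic P-wave period tau_c = 2*pi*sqrt(int u^2 dt / int v^2 dt),
    with velocity obtained from the displacement trace by finite
    differences.  The dt factors cancel in the ratio, so plain sums
    are used in place of explicit integration."""
    u = np.asarray(displacement, dtype=float)
    v = np.gradient(u, dt)
    return 2.0 * np.pi * np.sqrt(np.sum(u * u) / np.sum(v * v))

# Example: a pure sinusoid of period 1 s should yield tau_c close to 1 s
t = np.arange(0, 4, 0.001)
print(tau_c(np.sin(2 * np.pi * t), 0.001))
```

Larger earthquakes produce longer-period initial P waves, so τ_c measured on the expanding window grows with magnitude, which is what makes it useful for early warning.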

  12. Post-earthquake building safety inspection: Lessons from the Canterbury, New Zealand, earthquakes

    USGS Publications Warehouse

    Marshall, J.; Jaiswal, Kishor; Gould, N.; Turner, F.; Lizundia, B.; Barnes, J.

    2013-01-01

    The authors discuss some of the unique aspects and lessons of the New Zealand post-earthquake building safety inspection program that was implemented following the Canterbury earthquake sequence of 2010–2011. The post-event safety assessment program was one of the largest and longest programs undertaken in recent times anywhere in the world. The effort engaged hundreds of engineering professionals throughout the country, and also sought expertise from outside, to perform post-earthquake structural safety inspections of more than 100,000 buildings in the city of Christchurch and the surrounding suburbs. While the building safety inspection procedure implemented was analogous to the ATC 20 program in the United States, many modifications were proposed and implemented in order to assess the large number of buildings that were subjected to strong and variable shaking during a period of two years. This note discusses some of the key aspects of the post-earthquake building safety inspection program and summarizes important lessons that can improve future earthquake response.

  13. 77 FR 64314 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-19

    ... is to discuss engineering needs for existing buildings, to review the National Earthquake Hazards... Committee business. The final agenda will be posted on the NEHRP Web site at http://nehrp.gov/ . DATES: The... assesses: Trends and developments in the science and engineering of earthquake hazards reduction; The...

  14. Twitter earthquake detection: Earthquake monitoring in a social world

    USGS Publications Warehouse

    Earle, Paul S.; Bowden, Daniel C.; Guy, Michelle R.

    2011-01-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word "earthquake" clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.
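    The short-term-average/long-term-average detector described above can be sketched directly on a tweet-frequency series; the window lengths and trigger threshold below are illustrative, not the tuned USGS values:

```python
import numpy as np

def sta_lta_triggers(counts, sta_win=2, lta_win=20, threshold=5.0):
    """Flag time bins where the short-term average of tweet counts
    exceeds `threshold` times the long-term average of the preceding
    window -- a minimal STA/LTA sketch.  Window lengths are in bins."""
    x = np.asarray(counts, dtype=float)
    triggers = []
    for i in range(lta_win, len(x) - sta_win + 1):
        lta = x[i - lta_win:i].mean()     # quiet background level
        sta = x[i:i + sta_win].mean()     # current burst level
        if lta > 0 and sta / lta >= threshold:
            triggers.append(i)
    return triggers

# Example: a burst of "earthquake" tweets against a steady background
counts = [1] * 40
counts[25] = counts[26] = 10
print(sta_lta_triggers(counts))
```

The same trade-off the abstract describes applies here: raising `threshold` suppresses false triggers at the cost of missing smaller, less widely felt events.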

  15. Efficient Reverse-Engineering of a Developmental Gene Regulatory Network

    PubMed Central

    Cicin-Sain, Damjan; Ashyraliyev, Maksat; Jaeger, Johannes

    2012-01-01

    Understanding the complex regulatory networks underlying development and evolution of multi-cellular organisms is a major problem in biology. Computational models can be used as tools to extract the regulatory structure and dynamics of such networks from gene expression data. This approach is called reverse engineering. It has been successfully applied to many gene networks in various biological systems. However, to reconstitute the structure and non-linear dynamics of a developmental gene network in its spatial context remains a considerable challenge. Here, we address this challenge using a case study: the gap gene network involved in segment determination during early development of Drosophila melanogaster. A major problem for reverse-engineering pattern-forming networks is the significant amount of time and effort required to acquire and quantify spatial gene expression data. We have developed a simplified data processing pipeline that considerably increases the throughput of the method, but results in data of reduced accuracy compared to those previously used for gap gene network inference. We demonstrate that we can infer the correct network structure using our reduced data set, and investigate minimal data requirements for successful reverse engineering. Our results show that timing and position of expression domain boundaries are the crucial features for determining regulatory network structure from data, while it is less important to precisely measure expression levels. Based on this, we define minimal data requirements for gap gene network inference. Our results demonstrate the feasibility of reverse-engineering with much reduced experimental effort. This enables more widespread use of the method in different developmental contexts and organisms. Such systematic application of data-driven models to real-world networks has enormous potential. Only the quantitative investigation of a large number of developmental gene regulatory networks will allow us to

  16. Earthquake Education and Public Information Centers: A Collaboration Between the Earthquake Country Alliance and Free-Choice Learning Institutions in California

    NASA Astrophysics Data System (ADS)

    Degroot, R. M.; Springer, K.; Brooks, C. J.; Schuman, L.; Dalton, D.; Benthien, M. L.

    2009-12-01

    In 1999 the Southern California Earthquake Center initiated an effort to expand its reach to multiple target audiences through the development of an interpretive trail on the San Andreas fault at Wallace Creek and an earthquake exhibit at Fingerprints Youth Museum in Hemet. These projects and involvement with the San Bernardino County Museum in Redlands beginning in 2007 led to the creation of Earthquake Education and Public Information Centers (EPIcenters) in 2008. The impetus for the development of the network was to broaden participation in The Great Southern California ShakeOut. By 2009 it had grown more comprehensive in scope, evolving into a statewide network. EPIcenters constitute a variety of free-choice learning institutions, representing museums, science centers, libraries, universities, parks, and other places visited by a variety of audiences including families, seniors, and school groups. They share a commitment to demonstrating and encouraging earthquake preparedness. EPIcenters coordinate Earthquake Country Alliance activities in their county or region, lead presentations or organize events in their communities, or in other ways demonstrate leadership in earthquake education and risk reduction. The San Bernardino County Museum (Southern California) and The Tech Museum of Innovation (Northern California) serve as EPIcenter regional coordinating institutions. They interact with over thirty institutional partners who have implemented a variety of activities from displays and talks to earthquake exhibitions. While many activities are focused on the time leading up to and just after the ShakeOut, most EPIcenter members conduct activities year round. Network members at Kidspace Museum in Pasadena and San Diego Natural History Museum have formed EPIcenter focus groups on early childhood education and safety and security.
This presentation highlights the development of the EPIcenter network, synergistic activities resulting from this

  17. The design and implementation of urban earthquake disaster loss evaluation and emergency response decision support systems based on GIS

    NASA Astrophysics Data System (ADS)

    Yang, Kun; Xu, Quan-li; Peng, Shuang-yun; Cao, Yan-bo

    2008-10-01

    Based on a necessity analysis of GIS applications in earthquake disaster prevention, this paper discusses in depth the spatial integration scheme of urban earthquake disaster loss evaluation models and visualization technologies, using network development methods such as COM/DCOM, ActiveX and ASP, as well as spatial database development methods such as OO4O and ArcSDE based on the ArcGIS software packages. Meanwhile, according to software engineering principles, a solution for an Urban Earthquake Emergency Response Decision Support System based on GIS technologies has also been proposed, covering the system's logical structure, technical routes, realization methods and function structure. Finally, the test system's user interfaces are also presented in the paper.

  18. Consistent earthquake catalog derived from changing network configurations: Application to the Rawil Depression in the southwestern Helvetic Alps

    NASA Astrophysics Data System (ADS)

    Lee, Timothy; Diehl, Tobias; Kissling, Edi; Wiemer, Stefan

    2017-04-01

    Earthquake catalogs derived from several decades of observations are often biased by network geometries, location procedures, and data quality changing with time. To study the long-term spatio-temporal behavior of seismogenic fault zones at high resolution, a consistent homogenization and improvement of earthquake catalogs is required. Assuming that data quality and network density generally improve with time, procedures are needed which use the best available data to homogeneously solve the coupled hypocenter - velocity structure problem and can also be applied to earlier network configurations in the same region. A common approach to uniformly relocating earthquake catalogs is the calculation of a so-called "minimum 1D" model, which is derived from the simultaneous inversion for hypocenters and 1D velocity structure, including station-specific delay-time corrections. In this work, we present strategies using the principles of the "minimum 1D" model to consistently relocate hypocenters recorded by the Swiss Seismological Service (SED) in the Swiss Alps over a period of 17 years in a region characterized by significant changes in network configurations. The target region of this study is the Rawil depression, which is located between the Aar and Mont Blanc massifs in southwestern Switzerland. The Rhone-Simplon Fault is located to the south of the Rawil depression and is considered a dextral strike-slip fault representing the dominant tectonic boundary between Helvetic nappes to the north and Penninic nappes to the south. Current strike-slip earthquakes, however, occur predominantly in a narrow, east-west striking cluster located in the Rawil depression north of the Rhone-Simplon Fault. Recent earthquake swarms near Sion and Sierre in 2011 and 2016, on the other hand, indicate seismically active dextral faults close to the Rhone valley. The region north and south of the Rhone-Simplon Fault is one of the most seismically active regions in
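    The hypocenter half of the coupled hypocenter-velocity problem can be illustrated with a toy grid search over epicenter and origin time in a uniform half-space; this is a drastic simplification of the minimum 1D approach (no velocity inversion, no station corrections), intended only to show the residual-minimization idea:

```python
import numpy as np

def locate_epicenter(stations, picks, v=6.0, grid_step=1.0, extent=50.0):
    """Grid-search epicenter (x, y in km) minimizing the RMS of P-pick
    residuals for a uniform half-space of velocity v (km/s).  The
    origin time is eliminated in closed form at each trial point.
    All parameters are illustrative assumptions."""
    st = np.asarray(stations, dtype=float)
    t = np.asarray(picks, dtype=float)
    xs = np.arange(-extent, extent + grid_step, grid_step)
    best_xy, best_rms = None, np.inf
    for x in xs:
        for y in xs:
            tt = np.hypot(st[:, 0] - x, st[:, 1] - y) / v   # travel times
            t0 = np.mean(t - tt)                            # best origin time
            rms = np.sqrt(np.mean((t - tt - t0) ** 2))
            if rms < best_rms:
                best_xy, best_rms = (x, y), rms
    return best_xy
```

In a real minimum 1D inversion the layered velocities and station delays are solved simultaneously with the hypocenters, which is precisely what makes relocated catalogs consistent across network generations.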

  19. WGCEP Historical California Earthquake Catalog

    USGS Publications Warehouse

    Felzer, Karen R.; Cao, Tianqing

    2008-01-01

    This appendix provides an earthquake catalog for California and the surrounding area. Our goal is to provide a listing of all known M > 5.5 earthquakes that occurred from 1850-1932 and all known M > 4.0 earthquakes that occurred from 1932-2006 within the region of 31.0 to 43.0 degrees North and -126.0 to -114.0 degrees West. Magnitudes for pre-1932 earthquakes are based largely on non-instrumental sources, before the Northern California network was online. Some earthquakes from 1900-1932, and particularly from 1910-1932, are also based on instrumental readings, but the quality of the instrumental record and the resulting analysis are much less precise than for later listings. A partial exception is for some of the largest earthquakes, such as the San Francisco earthquake of April 18, 1906, for which global teleseismic records (Wald et al. 1993) and geodetic measurements (Thatcher et al. 1997) have been used to help determine magnitudes.
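    The catalog's stated inclusion rules translate directly into a membership test; this is a sketch of the thresholds quoted above, not code from the WGCEP appendix:

```python
def in_catalog(year, mag, lat, lon):
    """Apply the catalog's stated inclusion rules: M > 5.5 for
    1850-1932, M > 4.0 for 1932-2006, within 31-43 degrees North
    and -126 to -114 degrees West."""
    if not (31.0 <= lat <= 43.0 and -126.0 <= lon <= -114.0):
        return False                      # outside the study region
    if 1850 <= year < 1932:
        return mag > 5.5                  # pre-network completeness level
    if 1932 <= year <= 2006:
        return mag > 4.0                  # instrumental-era threshold
    return False

# Example: the 1906 San Francisco earthquake qualifies
print(in_catalog(1906, 7.8, 37.75, -122.55))
```

The two magnitude thresholds reflect catalog completeness: before the Northern California network, only large events are reliably known.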

  20. Products and Services Available from the Southern California Earthquake Data Center (SCEDC) and the Southern California Seismic Network (SCSN)

    NASA Astrophysics Data System (ADS)

    Yu, E.; Bhaskaran, A.; Chen, S.; Chowdhury, F. R.; Meisenhelter, S.; Hutton, K.; Given, D.; Hauksson, E.; Clayton, R. W.

    2010-12-01

    Currently the SCEDC archives continuous and triggered data from nearly 5000 data channels from 425 SCSN recorded stations, processing and archiving an average of 12,000 earthquakes each year. The SCEDC provides public access to these earthquake parametric and waveform data through its website www.data.scec.org and through client applications such as STP and DHI. This poster will describe the most significant developments at the SCEDC in the past year.
    Updated hardware:
    - The SCEDC has more than doubled its waveform file storage capacity by migrating to 2 TB disks.
    New data holdings:
    - Waveform data: Beginning Jan 1, 2010 the SCEDC began continuously archiving all high-sample-rate strong-motion channels. All seismic channels recorded by SCSN are now continuously archived and available at SCEDC.
    - Portable data from the El Mayor-Cucapah 7.2 sequence: Seismic waveforms from portable stations installed by researchers (contributed by Elizabeth Cochran, Jamie Steidl, and Octavio Lazaro-Mancilla) have been added to the archive and are accessible through STP either as continuous data or associated with events in the SCEDC earthquake catalog. This additional data will help SCSN analysts and researchers improve event locations from the sequence.
    - Real-time GPS solutions from the El Mayor-Cucapah 7.2 event: Three-component 1 Hz seismograms of California Real Time Network (CRTN) GPS stations, from the April 4, 2010, magnitude 7.2 El Mayor-Cucapah earthquake, are available in SAC format at the SCEDC. These time series were created by Brendan Crowell, Yehuda Bock, the project PI, and Mindy Squibb at SOPAC using data from the CRTN. The El Mayor-Cucapah earthquake demonstrated definitively the power of real-time high-rate GPS data: they measure dynamic displacements directly, they do not clip, and they are also able to detect the permanent (coseismic) surface deformation.
    - Triggered data from the Quake Catcher Network (QCN) and Community Seismic Network (CSN): The SCEDC in

  1. Impact of the 2011 Tohoku-oki earthquake on the Tokyo Metropolitan area observed by the Metropolitan Seismic Observation network (MeSO-net)

    NASA Astrophysics Data System (ADS)

    Hirata, N.; Hayashi, H.; Nakagawa, S.; Sakai, S.; Honda, R.; Kasahara, K.; Obara, K.; Aketagawa, T.; Kimura, H.; Sato, H.; Okaya, D. A.

    2011-12-01

    The March 11, 2011 Tohoku-oki earthquake had a great impact on the Tokyo metropolitan area, in both seismology and seismic risk management, although Tokyo is located 340 km from the epicenter. The event generated very strong ground motion even in the metropolitan area and resulted in severe liquefaction in many places in the Kanto district. National and local governments have started to discuss countermeasures for possible seismic risks in the area, taking into account what they learned from the Tohoku-oki event, which was much larger than any earthquake previously experienced in Japan. Risk mitigation strategy for the next great earthquake caused by the Philippine Sea plate (PSP) subducting beneath the Tokyo metropolitan area is of major concern because this plate caused past mega-thrust earthquakes, such as the 1703 Genroku earthquake (M8.0) and the 1923 Kanto earthquake (M7.9). An M7 or greater (M7+) earthquake in this area at present has high potential to produce devastating loss of life and property with even greater global economic repercussions. The Central Disaster Management Council of Japan estimates that an M7+ earthquake will cause 11,000 fatalities and 112 trillion yen (about 1 trillion US$) in economic loss. In order to mitigate the disaster for greater Tokyo, the Special Project for Earthquake Disaster Mitigation in the Tokyo Metropolitan Area was launched in collaboration with scientists, engineers, and social scientists at institutions nationwide. We will discuss the main results obtained in the respective fields, which have been integrated to improve information on strategy assessment for seismic risk mitigation in the Tokyo metropolitan area; the project has been much improved since the Tohoku event. In order to image the seismic structure beneath the metropolitan Tokyo area we have developed the Metropolitan Seismic Observation network (MeSO-net; Hirata et al., 2009). We have installed 296 seismic stations every few km (Kasahara et al., 2011). We conducted seismic

  2. UNLV’s environmentally friendly Science and Engineering Building is monitored for earthquake shaking

    USGS Publications Warehouse

    Kalkan, Erol; Savage, Woody; Reza, Shahneam; Knight, Eric; Tian, Ying

    2013-01-01

    The University of Nevada Las Vegas’ (UNLV) Science and Engineering Building is at the cutting edge of environmentally friendly design. As the result of a recent effort by the U.S. Geological Survey’s National Strong Motion Project in cooperation with UNLV, the building is now also in the forefront of buildings installed with structural monitoring systems to measure response during earthquakes. This is particularly important because this is the first such building in Las Vegas. The seismic instrumentation will provide essential data to better understand the structural performance of buildings, especially in this seismically active region.

  3. Fault model of the M7.1 intraslab earthquake on April 7 following the 2011 Great Tohoku earthquake (M9.0) estimated by the dense GPS network data

    NASA Astrophysics Data System (ADS)

    Miura, S.; Ohta, Y.; Ohzono, M.; Kita, S.; Iinuma, T.; Demachi, T.; Tachibana, K.; Nakayama, T.; Hirahara, S.; Suzuki, S.; Sato, T.; Uchida, N.; Hasegawa, A.; Umino, N.

    2011-12-01

    We propose a source fault model of the large intraslab earthquake with M7.1 deduced from a dense GPS network. The coseismic displacements obtained by GPS data analysis clearly show the spatial pattern specific to intraslab earthquakes, not only in the horizontal components but also in the vertical ones. A rectangular fault with uniform slip was estimated by a non-linear inversion approach. The results indicate that the simple rectangular fault model can explain the overall features of the observations. The amount of moment released is equivalent to Mw 7.17. The hypocenter depth of the main shock estimated by the Japan Meteorological Agency is slightly deeper than the neutral plane between the down-dip compression (DC) and down-dip extension (DE) stress zones of the double-planed seismic zone. This suggests that the depth of the neutral plane was deepened by the huge slip of the 2011 M9.0 Tohoku earthquake, and that the rupture of the thrust M7.1 earthquake was initiated at that depth, although more investigations are required to confirm this idea. The estimated fault plane has an angle of ~60 degrees from the surface of the subducting Pacific plate. It is consistent with the hypothesis that intraslab earthquakes are reactivations of preexisting hydrated weak zones formed during the bending of oceanic plates around outer-rise regions.
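    A moment magnitude like the Mw 7.17 reported here follows from the standard seismic moment relation M0 = mu * area * slip. The fault dimensions and rigidity below are hypothetical round numbers chosen only to give a moment of that order, not the authors' inversion result:

```python
import math

def moment_magnitude(length_km, width_km, slip_m, mu=3.0e10):
    """Seismic moment M0 = mu * A * D (N*m) for a uniform-slip
    rectangular fault, converted to moment magnitude with
    Mw = (2/3) * (log10(M0) - 9.1).  mu = 3e10 Pa is a typical
    crustal rigidity; the fault geometry is illustrative."""
    m0 = mu * (length_km * 1e3) * (width_km * 1e3) * slip_m
    return (2.0 / 3.0) * (math.log10(m0) - 9.1)

# Example: a 60 km x 25 km fault with 1.6 m of uniform slip
print(moment_magnitude(60.0, 25.0, 1.6))
```

Inverting a uniform-slip rectangle for the observed displacements fixes A and D jointly, which is why the released moment (and hence Mw) is well constrained even when the exact geometry is not.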

  4. MX Resident Engineer Networking Guide.

    DTIC Science & Technology

    1982-04-01

    The Network Analysis System (NAS) is not a new planning method. It has been used for more than...

  5. A gene network simulator to assess reverse engineering algorithms.

    PubMed

    Di Camillo, Barbara; Toffolo, Gianna; Cobelli, Claudio

    2009-03-01

    In the context of reverse engineering of biological networks, simulators are helpful to test and compare the accuracy of different reverse-engineering approaches in a variety of experimental conditions. A novel gene-network simulator is presented that resembles some of the main features of transcriptional regulatory networks related to topology, interaction among regulators of transcription, and expression dynamics. The simulator generates network topology according to the current knowledge of biological network organization, including scale-free distribution of the connectivity and clustering coefficient independent of the number of nodes in the network. It uses fuzzy logic to represent interactions among the regulators of each gene, integrated with differential equations to generate continuous data, comparable to real data for variety and dynamic complexity. Finally, the simulator accounts for saturation in the response to regulation and transcription activation thresholds and shows robustness to perturbations. It therefore provides a reliable and versatile test bed for reverse engineering algorithms applied to microarray data. Since the simulator describes regulatory interactions and expression dynamics as two distinct, although interconnected aspects of regulation, it can also be used to test reverse engineering approaches that use both microarray and protein-protein interaction data in the process of learning. A first software release is available at http://www.dei.unipd.it/~dicamill/software/netsim as an R programming language package.
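The scale-free topology generation described above can be sketched with a simple preferential-attachment growth process (a pure-Python illustration of the concept; the simulator itself is distributed as an R package at the URL given):

```python
import random

def preferential_attachment(n: int, m: int, seed: int = 0) -> dict:
    """Grow an undirected graph in which each new node attaches to m existing
    nodes with probability proportional to their degree, yielding the
    heavy-tailed (scale-free-like) degree distribution mentioned above."""
    rng = random.Random(seed)
    # start from a small complete seed graph on m+1 nodes
    adj = {i: set() for i in range(m + 1)}
    for i in range(m + 1):
        for j in range(i + 1, m + 1):
            adj[i].add(j); adj[j].add(i)
    # degree-weighted pool: each node id appears once per incident edge end
    pool = [v for v, nbrs in adj.items() for _ in nbrs]
    for new in range(m + 1, n):
        chosen = set()
        while len(chosen) < m:          # m distinct degree-weighted targets
            chosen.add(rng.choice(pool))
        adj[new] = set()
        for t in chosen:
            adj[new].add(t); adj[t].add(new)
            pool.extend([new, t])       # keep the pool degree-weighted
    return adj

net = preferential_attachment(200, 2)
degrees = sorted((len(nbrs) for nbrs in net.values()), reverse=True)
print(len(net), degrees[0] > degrees[-1])  # hubs emerge above the minimum degree
```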

  6. A Predictive Approach to Network Reverse-Engineering

    NASA Astrophysics Data System (ADS)

    Wiggins, Chris

    2005-03-01

    A central challenge of systems biology is the ``reverse engineering" of transcriptional networks: inferring which genes exert regulatory control over which other genes. Attempting such inference at the genomic scale has only recently become feasible, via data-intensive biological innovations such as DNA microarrays (``DNA chips") and the sequencing of whole genomes. In this talk we present a predictive approach to network reverse-engineering, in which we integrate DNA chip data and sequence data to build a model of the transcriptional network of the yeast S. cerevisiae capable of predicting the response of genes in unseen experiments. The technique can also be used to extract ``motifs,'' sequence elements which act as binding sites for regulatory proteins. We validate by a number of approaches and present comparisons of theoretical predictions vs. experimental data, along with biological interpretations of the resulting model. En route, we will illustrate some basic notions in statistical learning theory (fitting vs. over-fitting; cross-validation; assessing statistical significance), highlighting ways in which physicists can make a unique contribution to data-driven approaches to reverse engineering.
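The cross-validation notion invoked in the abstract can be made concrete with a minimal k-fold index splitter (an illustrative sketch, not the authors' code):

```python
import random

def k_fold_indices(n_samples: int, k: int, seed: int = 0):
    """Yield (train, test) index lists for k-fold cross-validation:
    each sample lands in exactly one test fold, so every model is
    scored on data it never saw during fitting."""
    idx = list(range(n_samples))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    for i in range(k):
        test = folds[i]
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        yield train, test

splits = list(k_fold_indices(10, 5))
print(len(splits), sorted(j for _, test in splits for j in test))
```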

  7. Detecting earthquakes over a seismic network using single-station similarity measures

    NASA Astrophysics Data System (ADS)

    Bergen, Karianne J.; Beroza, Gregory C.

    2018-06-01

    New blind waveform-similarity-based detection methods, such as Fingerprint and Similarity Thresholding (FAST), have shown promise for detecting weak signals in long-duration, continuous waveform data. While blind detectors are capable of identifying similar or repeating waveforms without templates, they can also be susceptible to false detections due to local correlated noise. In this work, we present a set of three new methods that allow us to extend single-station similarity-based detection over a seismic network: event-pair extraction, pairwise pseudo-association, and event resolution complete a post-processing pipeline that combines single-station similarity measures (e.g. the FAST sparse similarity matrix) from each station in a network into a list of candidate events. The core technique, pairwise pseudo-association, leverages the pairwise structure of event detections in its network detection model, which allows it to identify events observed at multiple stations in the network without modeling the expected moveout. Though our approach is general, we apply it to extend FAST over a sparse seismic network. We demonstrate that our network-based extension of FAST is sensitive and maintains a low false detection rate. As a test case, we apply our approach to 2 weeks of continuous waveform data from five stations during the foreshock sequence prior to the 2014 Mw 8.2 Iquique earthquake. Our method identifies nearly five times as many events as the local seismicity catalogue (including 95 per cent of the catalogue events), and less than 1 per cent of these candidate events are false detections.
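The core idea of pseudo-association can be illustrated with a toy sketch: detections at different stations are grouped into candidate events when their times co-occur within a tolerance, without modeling inter-station moveout. This is an illustration of the concept under a simple time-clustering rule, not the published FAST network pipeline:

```python
def pseudo_associate(detections, tol=3.0, min_stations=2):
    """Group single-station detections (station, time_s) into candidate
    events: times within `tol` seconds of their neighbors, seen at
    >= min_stations distinct stations; moveout is deliberately ignored."""
    dets = sorted(detections, key=lambda d: d[1])
    events, current = [], [dets[0]]
    for d in dets[1:]:
        if d[1] - current[-1][1] <= tol:
            current.append(d)            # extend the current time cluster
        else:
            events.append(current); current = [d]
    events.append(current)
    # keep only clusters observed at enough distinct stations
    return [e for e in events if len({s for s, _ in e}) >= min_stations]

dets = [("STA1", 10.0), ("STA2", 11.2), ("STA3", 12.0),  # one candidate event
        ("STA1", 50.0),                                   # isolated, rejected
        ("STA2", 80.0), ("STA1", 81.5)]                   # second candidate
print(len(pseudo_associate(dets)))  # → 2
```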

  9. Dense Ocean Floor Network for Earthquakes and Tsunamis; DONET/ DONET2, Part2 -Development and data application for the mega thrust earthquakes around the Nankai trough-

    NASA Astrophysics Data System (ADS)

    Kaneda, Y.; Kawaguchi, K.; Araki, E.; Matsumoto, H.; Nakamura, T.; Nakano, M.; Kamiya, S.; Ariyoshi, K.; Baba, T.; Ohori, M.; Hori, T.; Takahashi, N.; Kaneko, S.; Donet Research; Development Group

    2010-12-01

    Yoshiyuki Kaneda, Katsuyoshi Kawaguchi*, Eiichiro Araki*, Shou Kaneko*, Hiroyuki Matsumoto*, Takeshi Nakamura*, Masaru Nakano*, Shinichirou Kamiya*, Keisuke Ariyoshi*, Toshitaka Baba*, Michihiro Ohori*, Narumi Takahashi*, and Takane Hori** (*Earthquake and Tsunami Research Project for Disaster Prevention, Leading Project, Japan Agency for Marine-Earth Science and Technology (JAMSTEC); **Institute for Research on Earth Evolution, JAMSTEC). DONET (Dense Ocean Floor Network for Earthquakes and Tsunamis) is a real-time monitoring system for the Tonankai seismogenic zones around the Nankai trough, southwestern Japan. We began developing DONET to perform real-time monitoring of crustal activity there and to support an advanced early warning system. DONET will provide important and useful data for understanding the Nankai trough mega-thrust earthquake seismogenic zones and for improving the accuracy of earthquake recurrence cycle simulations. The main elements of the DONET concept are as follows. 1) Redundancy, extendability, and advanced maintenance, achieved with a looped cable system, junction boxes, and the ROV/AUV; DONET has 20 observatories incorporated in a double land station concept, and we have developed an ROV for 10 km cable extensions and heavy-weight operations. 2) Multiple kinds of sensors to observe broadband phenomena such as long-period tremors, very low frequency earthquakes, and strong motions of mega-thrust earthquakes over M8; each DONET observatory is therefore equipped with a broadband seismometer, an accelerometer, a hydrophone, a precise pressure gauge, a differential pressure gauge, and a thermometer. 3) Speedy detection, evaluation, and notification of earthquakes and tsunamis; the DONET system will be deployed around the Tonankai seismogenic zone. 4) Data on ocean floor crustal deformation derived from pressure sensors. Simultaneously, the development of data

  10. Research on response spectrum of dam based on scenario earthquake

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaoliang; Zhang, Yushan

    2017-10-01

    Taking a large hydropower station as an example, the response spectrum based on a scenario earthquake is determined. First, the potential source zone contributing most to the site hazard is determined on the basis of the results of probabilistic seismic hazard analysis (PSHA). Second, the magnitude and epicentral distance of the scenario earthquake are calculated according to the main faults and historical earthquakes of the potential seismic source zone. Finally, the response spectrum of the scenario earthquake is calculated using the Next Generation Attenuation (NGA) relations. The response spectrum based on the scenario-earthquake method is lower than the probability-consistent response spectrum obtained by the PSHA method. The empirical analysis shows that the scenario-earthquake response spectrum accounts for both the probability level and structural factors, combining the advantages of the deterministic and probabilistic seismic hazard analysis methods. It is easy to accept and provides a basis for the seismic engineering of hydraulic structures.
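The final step, predicting a spectral ordinate from the scenario magnitude and distance, can be sketched with a deliberately simplified ground-motion relation. This is NOT one of the actual NGA models; the functional form and coefficients below are invented purely to illustrate the magnitude and distance scaling:

```python
import math

def spectral_accel(mag: float, dist_km: float,
                   a: float = -1.0, b: float = 1.2, c: float = 1.3) -> float:
    """Toy ground-motion relation (illustrative, not an NGA model):
    ln SA = a + b*(M - 6) - c*ln(R + 10), SA in g."""
    return math.exp(a + b * (mag - 6.0) - c * math.log(dist_km + 10.0))

# Larger magnitude raises, and larger distance lowers, the predicted ordinate
print(spectral_accel(7.0, 20.0) > spectral_accel(6.0, 20.0),
      spectral_accel(7.0, 20.0) > spectral_accel(7.0, 80.0))
```

A real NGA relation adds period-dependent coefficients plus site and fault-mechanism terms, but the scenario-spectrum workflow around it is the same.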

  11. Adaptive Optimization of Aircraft Engine Performance Using Neural Networks

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Long, Theresa W.

    1995-01-01

    Preliminary results are presented on the development of an adaptive neural network based control algorithm to enhance aircraft engine performance. This work builds upon a previous National Aeronautics and Space Administration (NASA) effort known as Performance Seeking Control (PSC). PSC is an adaptive control algorithm which contains a model of the aircraft's propulsion system that is updated on-line to match the operation of the aircraft's actual propulsion system. Information from the on-line model is used to adapt the control system during flight to allow optimal operation of the aircraft's propulsion system (inlet, engine, and nozzle) and to improve aircraft engine performance without compromising reliability or operability. Performance Seeking Control has been shown to yield reductions in fuel flow, increases in thrust, and reductions in engine fan turbine inlet temperature. The neural network based adaptive control, like PSC, will contain a model of the propulsion system which will be used to calculate optimal control commands on-line, and it is hoped that it will provide additional benefits beyond those of PSC. The PSC algorithm is computationally intensive, is valid only at near-steady-state flight conditions, and has no way to adapt or learn on-line. These issues are being addressed in the development of the optimal neural controller: specialized neural network processing hardware is being developed to run the software, the algorithm will be valid at steady-state and transient conditions, and it will take advantage of the on-line learning capability of neural networks. Future plans include testing the neural network software and hardware prototype against an aircraft engine simulation. In this paper, the proposed neural network software and hardware are described and preliminary neural network training results are presented.

  12. National Earthquake Hazards Reduction Program; time to expand

    USGS Publications Warehouse

    Steinbrugge, K.V.

    1990-01-01

    All of us in earthquake engineering, seismology, and many related disciplines have been directly or indirectly affected by the National Earthquake Hazards Reduction Program (NEHRP). This program was the result of the Earthquake Hazards Reduction Act of 1977 (Public Law 95-124). With well over a decade of experience, should this expression of public policy now take a different or expanded role? 

  13. Feasibility study of earthquake early warning (EEW) in Hawaii

    USGS Publications Warehouse

    Thelen, Weston A.; Hotovec-Ellis, Alicia J.; Bodin, Paul

    2016-09-30

    The effects of earthquake shaking on the population and infrastructure across the State of Hawaii could be catastrophic, and the high seismic hazard in the region emphasizes the likelihood of such an event. Earthquake early warning (EEW) has the potential to give several seconds of warning before strong shaking starts, and thus reduce loss of life and damage to property. The two approaches to EEW are (1) a network approach (such as ShakeAlert or ElarmS) where the regional seismic network is used to detect the earthquake and distribute the alarm and (2) a local approach where a critical facility has a single seismometer (or small array) and a warning system on the premises. The network approach, also referred to here as ShakeAlert or ElarmS, uses the closest stations within a regional seismic network to detect and characterize an earthquake. Most parameters used for a network approach require observations on multiple stations (typically 3 or 4), which slows down the alarm time slightly, but the alarms are generally more reliable than with single-station EEW approaches. The network approach also benefits from having stations closer to the source of any potentially damaging earthquake, so that alarms can be sent ahead to anyone who subscribes to receive the notification. Thus, a fully implemented ShakeAlert system can provide seconds of warning for both critical facilities and general populations ahead of damaging earthquake shaking. The cost to implement and maintain a fully operational ShakeAlert system is high compared to a local approach or single-station solution, but the benefits of a ShakeAlert system would be felt statewide: the warning times for strong shaking are potentially longer for most sources at most locations. The local approach, referred to herein as “single station,” uses measurements from a single seismometer to assess whether strong earthquake shaking can be expected. Because of the reliance on a single station, false alarms are more common than
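The "several seconds of warning" comes from the lag between issuing an alert (based on the first-arriving P waves) and the arrival of strong S-wave shaking at a site. A back-of-envelope sketch, assuming an illustrative S-wave speed of 3.5 km/s and a 5 s detection-plus-alert latency (both values are assumptions, not figures from the report):

```python
def warning_time_s(epicentral_dist_km: float,
                   detect_latency_s: float = 5.0,
                   vs_km_s: float = 3.5) -> float:
    """Rough EEW warning time: S-wave travel time to the site minus the
    time needed to detect the earthquake and issue the alert.
    Negative margins (site inside the blind zone) are clipped to zero."""
    return max(0.0, epicentral_dist_km / vs_km_s - detect_latency_s)

for d in (20, 60, 120):
    print(d, "km ->", round(warning_time_s(d), 1), "s")
```

The clipped region near the source is the "blind zone" where no useful warning is possible, which is why station density near likely sources matters so much.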

  14. Foreshocks and aftershocks locations of the 2014 Pisagua, N. Chile earthquake: history of a megathrust earthquake nucleation

    NASA Astrophysics Data System (ADS)

    Fuenzalida Velasco, Amaya; Rietbrock, Andreas; Tavera, Hernando; Ryder, Isabelle; Ruiz, Sergio; Thomas, Reece; De Angelis, Silvio; Bondoux, Francis

    2015-04-01

    The April 2014 Mw 8.1 Pisagua earthquake occurred in the northern Chile seismic gap: a region of the South American subduction zone lying between the city of Arica and the Mejillones Peninsula. It is believed that this part of the subduction zone has not experienced a large earthquake since 1877. Thanks to the identification of this seismic gap, northern Chile was well instrumented before the Pisagua earthquake, including the Integrated Plate boundary Observatory Chile (IPOC) network and the Chilean local network installed by the Centro Sismologico Nacional (CSN). These instruments were able to record the full foreshock and aftershock sequences, providing a unique opportunity to study the nucleation process of large megathrust earthquakes. To improve azimuthal coverage of the Pisagua seismic sequence, after the earthquake, in collaboration with the Instituto Geofisico del Peru (IGP), we installed a temporary seismic network in southern Peru. The network comprised 12 short-period stations located in the coastal area between Moquegua and Tacna, operative from 1 May 2014. We also installed three stations on the slopes of the Ticsiani volcano to monitor any possible change in volcanic activity following the Pisagua earthquake. In this work we analysed the continuous seismic data recorded by the CSN and IPOC networks from 1 March to 30 June to obtain a catalogue of the sequence, including foreshocks and aftershocks. Using an automatic algorithm based on STA/LTA, we obtained picks for P and S waves. Association in time and space defined the events, and initial locations were computed using Hypo71 and a 1D local velocity model. More than 11,000 events were identified with this method for the whole period; we then selected the best-resolved events, those with more than 7 observed arrivals including at least 2 S picks, and relocated them using the NonLinLoc software. For the main events of the sequence we carefully estimated event locations and we obtained
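The STA/LTA detection principle used for the automatic picking can be sketched in a few lines: a short-term average (STA) of signal amplitude jumps relative to a long-term average (LTA) when a phase arrives. A minimal sketch with illustrative window lengths (not the authors' picker):

```python
def sta_lta(trace, n_sta=3, n_lta=30):
    """Classic short-term/long-term average amplitude ratio used for
    automatic phase picking; a trigger is declared where the ratio
    exceeds a chosen threshold."""
    ratios = []
    for i in range(n_lta, len(trace)):
        sta = sum(abs(x) for x in trace[i - n_sta:i]) / n_sta
        lta = sum(abs(x) for x in trace[i - n_lta:i]) / n_lta
        ratios.append(sta / lta if lta > 0 else 0.0)
    return ratios

# quiet background, then a sudden arrival at sample 30
trace = [0.1] * 30 + [5.0] * 10
r = sta_lta(trace, n_sta=3, n_lta=30)
print(max(r) > 5.0)  # the arrival produces a large STA/LTA peak
```

Production pickers (e.g. recursive STA/LTA variants) add tapering, de-triggering thresholds, and bandpass filtering, but the ratio logic is the same.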

  15. Neural network application to comprehensive engine diagnostics

    NASA Technical Reports Server (NTRS)

    Marko, Kenneth A.

    1994-01-01

    We have previously reported on the use of neural networks for detection and identification of faults in complex microprocessor controlled powertrain systems. The data analyzed in those studies consisted of the full spectrum of signals passing between the engine and the real-time microprocessor controller. The specific task of the classification system was to classify system operation as nominal or abnormal and to identify the fault present. The primary concern in earlier work was the identification of faults in sensors or actuators in the powertrain system as it was exercised over its full operating range. The use of data from a variety of sources, each contributing some potentially useful information to the classification task, is commonly referred to as sensor fusion and typifies the type of problems successfully addressed using neural networks. In this work we explore the application of neural networks to a different diagnostic problem, the diagnosis of faults in newly manufactured engines, and the utility of neural networks for process control.

  16. Initiatives to Reduce Earthquake Risk of Developing Countries

    NASA Astrophysics Data System (ADS)

    Tucker, B. E.

    2008-12-01

    The seventeen-year-and-counting history of the Palo Alto-based nonprofit organization GeoHazards International (GHI) is the story of many initiatives within a larger initiative to increase the societal impact of geophysics and civil engineering. GHI's mission is to reduce death and suffering due to earthquakes and other natural hazards in the world's most vulnerable communities through preparedness, mitigation and advocacy. GHI works by raising awareness in these communities about their risk and about affordable methods to manage it, identifying and strengthening institutions in these communities to manage their risk, and advocating improvement in natural disaster management. Some of GHI's successful initiatives include: (1) creating an earthquake scenario for Quito, Ecuador that describes in lay terms the consequences for that city of a probable earthquake; (2) improving the curricula of Pakistani university courses about seismic retrofitting; (3) training employees of the Public Works Department of Delhi, India on assessing the seismic vulnerability of critical facilities such as a school, a hospital, a police headquarters, and city hall; (4) assessing the vulnerability of the Library of Tibetan Works and Archives in Dharamsala, India; (5) developing a seismic hazard reduction plan for a nonprofit organization in Kathmandu, Nepal that works to manage Nepal's seismic risk; and (6) assisting in the formulation of a resolution by the Council of the Organization for Economic Cooperation and Development (OECD) to promote school earthquake safety among OECD member countries. GHI's most important resource, in addition to its staff and Board of Trustees, is its members and volunteer advisors, who include some of the world's leading earth scientists, earthquake engineers, urban planners and architects, from the academic, public, private and nonprofit sectors. GHI is planning several exciting initiatives in the near future. 
One would oversee the design and construction of

  17. Earthquakes in Arkansas and vicinity 1699-2010

    USGS Publications Warehouse

    Dart, Richard L.; Ausbrooks, Scott M.

    2011-01-01

    This map summarizes approximately 300 years of earthquake activity in Arkansas. It is one in a series of similar State earthquake history maps. Work on the Arkansas map was done in collaboration with the Arkansas Geological Survey. The earthquake data plotted on the map are from several sources: the Arkansas Geological Survey, the Center for Earthquake Research and Information, the National Center for Earthquake Engineering Research, and the Mississippi Department of Environmental Quality. In addition to earthquake locations, other materials presented include seismic hazard and isoseismal maps and related text. Earthquakes are a legitimate concern in Arkansas and parts of adjacent states. Arkansas has undergone a number of significant felt earthquakes since 1811. At least two of these events caused property damage: a magnitude 4.7 earthquake in 1931, and a magnitude 4.3 earthquake in 1967. The map shows all historical and instrumentally located earthquakes in Arkansas and vicinity between 1811 and 2010. The largest historic earthquake in the vicinity of the State was an intensity XI event, on December 16, 1811; the first earthquake in the New Madrid sequence. This violent event and the earthquakes that followed caused considerable damage to the then sparsely settled region.

  18. Chapter A. The Loma Prieta, California, Earthquake of October 17, 1989 - Lifelines

    USGS Publications Warehouse

    Schiff, Anshel J.

    1998-01-01

    To the general public who had their televisions tuned to watch the World Series, the 1989 Loma Prieta earthquake was a lifelines earthquake. It was the images seen around the world of the collapsed Cypress Street viaduct, with the frantic and heroic efforts to pull survivors from the structure that was billowing smoke; the collapsed section of the San Francisco-Oakland Bay Bridge and subsequent home video of a car plunging off the open span; and the spectacular fire in the Marina District of San Francisco fed by a broken gas line. To many of the residents of the San Francisco Bay region, the relation of lifelines to the earthquake was characterized by sitting in the dark because of power outages, the inability to make telephone calls because of network congestion, and slow and snarled traffic. Had the public been aware of the actions of the engineers and tradespeople working for the utilities and other lifeline organizations on the emergency response and restoration of lifelines, the lifeline characteristics of this earthquake would have been even more significant. Unobserved by the public were the warlike devastation in several electrical-power substations, the 13 miles of gas-distribution lines that had to be replaced in several communities, and the more than 1,200 leaks and breaks in water mains and service connections that had to be excavated and repaired. Like the 1971 San Fernando, Calif., earthquake, which was a seminal event for activity to improve the earthquake performance of lifelines, the 1989 Loma Prieta earthquake demonstrated that the tasks of preparing lifelines in 'earthquake country' were incomplete; indeed, new lessons had to be learned.

  19. Landslide mobility and connectivity with fluvial networks during earthquakes

    NASA Astrophysics Data System (ADS)

    Clark, M. K.; West, A. J.; Li, G.; Roback, K.; Zekkos, D.

    2016-12-01

    In some tectonically active mountain belts, coseismic landslide events displace sediment volumes equal to long-term erosion rates when averaged over typical seismic cycles. However, the contribution of landsliding to total erosional budgets depends critically on the export of landslide debris, which in turn is thought to depend on the connectivity of landslides with fluvial channels and the sediment transport capacity of fluvial systems. From the 2015 Mw 7.8 Gorkha event in central Nepal, we present connectivity data based on a mapped inventory of nearly 25,000 landslides and compare these results to those from the 2008 Mw 7.9 Wenchuan earthquake in China. Landslide runout length in Nepal scales with landslide volume and has a strong association with slope, elevation, and relief. Connectivity is greatest for larger landslides in the high-relief, high-elevation part of the High Himalaya, suggesting that these slope failures may have the most immediate impact on sediment dynamics and cascading hazards, such as landslide reactivation by monsoon rainfall and outburst floods that pose an immediate threat to communities far downstream. Although rarer than landslides at lower elevation, large high-elevation landslides that cause outburst flooding due to failure of landslide dams in the upper reaches of large Himalayan rivers may also enhance river incision downstream. The overall high fluvial connectivity (i.e. the high percentage of landslide volumes directly intersecting the stream network) of coseismic landsliding in the Gorkha event suggests coupling between the earthquake cycle and the sediment/geochemical budgets of fluvial systems in the steep topography of the Himalaya.

  20. Earthquakes in Mississippi and vicinity 1811-2010

    USGS Publications Warehouse

    Dart, Richard L.; Bograd, Michael B.E.

    2011-01-01

    This map summarizes two centuries of earthquake activity in Mississippi. Work on the Mississippi map was done in collaboration with the Mississippi Department of Environmental Quality, Office of Geology. The earthquake data plotted on the map are from several sources: the Mississippi Department of Environmental Quality, the Center for Earthquake Research and Information, the National Center for Earthquake Engineering Research, and the Arkansas Geological Survey. In addition to earthquake locations, other materials include seismic hazard and isoseismal maps and related text. Earthquakes are a legitimate concern in Mississippi and parts of adjacent States. Mississippi has undergone a number of felt earthquakes since 1811. At least two of these events caused property damage: a magnitude 4.7 earthquake in 1931, and a magnitude 4.3 earthquake in 1967. The map shows all historical and instrumentally located earthquakes in Mississippi and vicinity between 1811 and 2010. The largest historic earthquake in the vicinity of the State was an intensity XI event, on December 16, 1811; the first earthquake in the New Madrid sequence. This violent event and the earthquakes that followed caused considerable damage to the then sparsely settled region.

  1. Historical earthquake research in Austria

    NASA Astrophysics Data System (ADS)

    Hammerl, Christa

    2017-12-01

    Austria has moderate seismicity, and on average the population feels 40 earthquakes per year, or approximately three earthquakes per month. A severe earthquake with light building damage is expected roughly every 2 to 3 years in Austria. Severe damage to buildings (I0 > 8° EMS) occurs significantly less frequently; the average recurrence period is about 75 years. For this reason historical earthquake research has been of special importance in Austria. The interest in historical earthquakes in the Austro-Hungarian Empire is outlined, beginning with an initiative of the Austrian Academy of Sciences and the development of historical earthquake research as an independent research field after the 1978 "Zwentendorf plebiscite" on whether the nuclear power plant would start up. The applied methods are introduced briefly along with the most important studies, and, as an example of a recently carried out case study, one of the strongest earthquakes in Austria's past, the earthquake of 17 July 1670, is presented. Research into historical earthquakes in Austria concentrates on seismic events of the pre-instrumental period. The investigations are not only of historical interest but also contribute to the completeness and correctness of the Austrian earthquake catalogue, which is the basis for seismic hazard analysis and as such benefits the public, communities, civil engineers, architects, civil protection, and many others.

  2. Seismic Strong Motion Array Project (SSMAP) to Record Future Large Earthquakes in the Nicoya Peninsula area, Costa Rica

    NASA Astrophysics Data System (ADS)

    Simila, G.; McNally, K.; Quintero, R.; Segura, J.

    2006-12-01

    The Seismic Strong Motion Array Project (SSMAP) for the Nicoya Peninsula in northwestern Costa Rica is composed of 10-13 sites including Geotech A900/A800 accelerographs (three-component), Ref-Teks (three-component velocity), and Kinemetrics EpiSensors. The main objectives of the array are to: 1) record and locate strong subduction zone mainshocks [and foreshocks, "early aftershocks", and preshocks] in the Nicoya Peninsula, at the entrance of the Nicoya Gulf, and in the Papagayo Gulf regions of Costa Rica, and 2) record and locate any moderate to strong upper-plate earthquakes triggered by a large subduction zone earthquake in the above regions. Our digital accelerograph array has been deployed as part of our ongoing research on large earthquakes in conjunction with the Earthquake and Volcano Observatory (OVSICORI) at the Universidad Nacional in Costa Rica. The country-wide seismographic network has been operating continuously since the 1980s, with the first earthquake bulletin published more than 20 years ago, in 1984. The recording of seismicity and strong motion data for large earthquakes along the Middle America Trench (MAT) has been a major research priority over these years, and this network now spans nearly half the time of a "repeat cycle" (50 years) for large (Ms 7.5-7.7) earthquakes beneath the Nicoya Peninsula, the last event having occurred in 1950. Our long-time co-collaborators include the seismology group at OVSICORI, with coordination for this project by Dr. Ronnie Quintero and Mr. Juan Segura. Numerous international investigators are also studying this region with GPS and seismic stations (US, Japan, Germany, Switzerland, etc.). In addition, various strong motion instruments are operated by local engineers for building purposes, mainly concentrated in the population centers of the Central Valley. The major goal of our project is to contribute unique scientific information pertaining to a large subduction zone earthquake and its related seismic activity when

  3. Applying Model Based Systems Engineering to NASA's Space Communications Networks

    NASA Technical Reports Server (NTRS)

    Bhasin, Kul; Barnes, Patrick; Reinert, Jessica; Golden, Bert

    2013-01-01

    Systems engineering practices for complex systems and networks now require that requirements, architecture, and concept-of-operations product development teams simultaneously harmonize their activities to provide timely, useful, and cost-effective products. When dealing with complex systems of systems, traditional systems engineering methodology quickly falls short of achieving project objectives, encumbered as it is by the use of a number of disparate hardware and software tools, spreadsheets, and documents to grasp the concept of the network design and operation. In the case of NASA's space communication networks, since the networks are geographically distributed, and so are their subject matter experts, the team is challenged to create a common language and tools to produce its products. Using Model Based Systems Engineering methods and tools allows for a unified representation of the system in a model that captures a high level of detail and the relationships among its elements. To date, the Program System Engineering (PSE) team has been able to model each network from its top-level operational activities and system functions down to the atomic level through relational modeling decomposition. These models allow for a better understanding of the relationships between NASA's stakeholders and internal organizations, and of the impacts to all related entities due to integration and sustainment of existing systems. Understanding the existing systems is essential to an accurate and detailed study of the integration options being considered. In this paper, we identify the challenges the PSE team faced in its quest to unify complex legacy space communications networks and their operational processes. We describe the initial approaches undertaken and the evolution toward model based systems engineering applied to produce Space Communication and Navigation (SCaN) PSE products. We will demonstrate the practice of Model Based Systems Engineering applied to integrating space communication networks and the summary of its

  4. Assessing Urban Streets Network Vulnerability against Earthquake Using GIS - Case Study: 6TH Zone of Tehran

    NASA Astrophysics Data System (ADS)

    Rastegar, A.

    2017-09-01

    Great earthquakes cause huge damage to human life. Street network vulnerability makes rescue operations encounter serious difficulties, especially in the first 72 hours after the incident. Today, the physical expansion and high density of great cities, with narrow access roads, large distances from medical care centers, and locations in areas of high seismic risk, lead to a perilous and unpredictable situation in the event of an earthquake. Zone #6 of Tehran, with a population of 229,980 (3.6% of the city's population) and an area of 20 km2 (3.2% of the city's area), is one of the main municipal zones of Tehran (Iran Center of Statistics, 2006). Major land uses, such as ministries, embassies, universities, general hospitals and medical centers, and big financial firms, manifest the high importance of this region on the local and national scale. In this paper, by employing indexes such as access to medical centers, street inclusion, building and population density, land use, PGA, and building quality, the vulnerability degree of the street network in Zone #6 against earthquakes is calculated by overlaying maps and data in combination with the IHWP method and GIS. This article concludes that buildings alongside streets with high population and building density, low building quality, long distances from rescue centers, and a high level of inclusion show a high rate of vulnerability compared with other buildings. Also, moving from the north to the south of the zone, vulnerability increases. Likewise, highways and streets with substantial width and low building and population density hold little vulnerability.
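    The index-overlay step described above can be sketched as a weighted sum of normalized raster layers. This is an illustrative sketch only: the index names, weights, and grid values below are hypothetical, not the study's data or its exact IHWP weighting.

```python
# Hedged sketch of a weighted-overlay vulnerability score: normalize each
# index layer to [0, 1], then combine with weights that sum to 1.
# All layer contents and weights here are invented for illustration.

def normalize(grid):
    """Rescale a 2-D grid of raw index values to [0, 1]."""
    flat = [v for row in grid for v in row]
    lo, hi = min(flat), max(flat)
    span = (hi - lo) or 1.0  # avoid division by zero on a flat layer
    return [[(v - lo) / span for v in row] for row in grid]

def overlay(layers, weights):
    """Weighted sum of equally shaped, normalized index layers."""
    assert abs(sum(weights) - 1.0) < 1e-9, "weights should sum to 1"
    rows, cols = len(layers[0]), len(layers[0][0])
    norm = [normalize(layer) for layer in layers]
    return [[sum(w * norm[k][i][j] for k, w in enumerate(weights))
             for j in range(cols)] for i in range(rows)]

# Three toy 2x2 index grids (higher normalized value = more vulnerable):
# distance to medical centers, building density, building quality index.
distance = [[1.0, 4.0], [2.0, 3.0]]
density  = [[10.0, 40.0], [20.0, 30.0]]
quality  = [[0.2, 0.8], [0.4, 0.6]]

score = overlay([distance, density, quality], [0.4, 0.35, 0.25])
```

    In a real GIS workflow the layers would be co-registered rasters and the weights would come from the IHWP analysis rather than being chosen by hand.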

  5. Towards Integrated Marmara Strong Motion Network

    NASA Astrophysics Data System (ADS)

    Durukal, E.; Erdik, M.; Safak, E.; Ansal, A.; Ozel, O.; Alcik, H.; Mert, A.; Kafadar, N.; Korkmaz, A.; Kurtulus, A.

    2009-04-01

    Istanbul has a 65% chance of experiencing a magnitude 7 or greater earthquake within the next 30 years. As part of the preparations for this future earthquake, strong motion networks have been installed in and around Istanbul. The Marmara Strong Motion Network, operated by the Department of Earthquake Engineering of the Kandilli Observatory and Earthquake Research Institute, encompasses the permanent systems outlined below. It is envisaged that the networks will be run by a single entity responsible for technical management and maintenance, as well as for data management, archiving, and dissemination through dedicated web-based interfaces. • Istanbul Earthquake Rapid Response and Early Warning System - IERREWS (one hundred 18-bit accelerometers for rapid response; ten 24-bit accelerometers for early warning) • IGDAŞ Gas Shutoff Network (100 accelerometers to be installed in 2010 and integrated with IERREWS) • Structural Monitoring Arrays - Fatih Sultan Mehmet Suspension Bridge (1200 m long suspension bridge across the Bosphorus, five 3-component accelerometers + GPS sensors) - Hagia Sophia Array (1500-year-old historical edifice, 9 accelerometers) - Süleymaniye Mosque Array (450-year-old historical edifice, 9 accelerometers) - Fatih Mosque Array (237-year-old historical edifice, 9 accelerometers) - Kanyon Building Array (high-rise office building, 5 accelerometers) - Isbank Tower Array (high-rise office building, 5 accelerometers) - ENRON Array (power generation facility, 4 accelerometers) - Mihrimah Sultan Mosque Array (450-year-old historical edifice, 9 accelerometers + tiltmeters, to be installed in 2009) - Sultanahmet Mosque Array (390-year-old historical edifice, 9 accelerometers + tiltmeters, to be installed in 2009) • Special Arrays - Atakoy Vertical Array (four 3-component accelerometers at 25, 50, 75, and 150 m depths) - Marmara Tube Tunnel (1400 m long submerged tunnel, 128 ch. accelerometric data, 24 ch. strain data, to be installed in 2010) - Air-Force Academy

  6. Modeling, Forecasting and Mitigating Extreme Earthquakes

    NASA Astrophysics Data System (ADS)

    Ismail-Zadeh, A.; Le Mouel, J.; Soloviev, A.

    2012-12-01

    Recent earthquake disasters have highlighted the importance of multi- and trans-disciplinary studies of earthquake risk. A major component of earthquake disaster risk analysis is hazards research, which should cover not only a traditional assessment of ground shaking, but also studies of geodetic, paleoseismic, geomagnetic, hydrological, deep drilling, and other geophysical and geological observations, together with comprehensive modeling of earthquakes and forecasting of extreme events. Extreme earthquakes (large-magnitude, rare events) are manifestations of the complex behavior of the lithosphere, which is structured as a hierarchical system of blocks of different sizes. Understanding of the physics and dynamics of extreme events comes from observations, measurements, and modeling. A quantitative approach to simulating earthquakes in models of fault dynamics will be presented. The models reproduce basic features of the observed seismicity (e.g., the frequency-magnitude relationship, clustering of earthquakes, occurrence of extreme seismic events). They provide a link between geodynamic processes and seismicity; allow the study of extreme events and of the influence of fault network properties on seismic patterns and seismic cycles; and assist, in a broader sense, in earthquake forecast modeling. Some aspects of the predictability of large earthquakes (how well can large earthquakes be predicted today?) will also be discussed, along with possibilities for mitigating earthquake disasters (e.g., 'inverse' forensic investigations of earthquake disasters).

  7. The Quake-Catcher Network: An Innovative Community-Based Seismic Network

    NASA Astrophysics Data System (ADS)

    Saltzman, J.; Cochran, E. S.; Lawrence, J. F.; Christensen, C. M.

    2009-12-01

    The Quake-Catcher Network (QCN) is a volunteer computing seismic network that engages citizen scientists, teachers, and museums in the detection of earthquakes. In less than two years, the network has grown to over 1000 participants globally and continues to expand. QCN utilizes Micro-Electro-Mechanical Systems (MEMS) accelerometers, in laptops and external to desktop computers, to detect moderate to large earthquakes. One goal of the network is to involve K-12 classrooms and museums by providing sensors and software to introduce participants to seismology and community-based scientific data collection. The Quake-Catcher Network provides a unique opportunity to engage participants directly in the scientific process, through hands-on activities that link outcomes to their daily lives. Partnerships with teachers and museum staff are critical to the growth of the Quake-Catcher Network. Each participating institution receives a MEMS accelerometer to connect, via USB, to a computer that can be used for hands-on activities and to record earthquakes through a distributed computing system. We developed interactive software (QCNLive) that allows participants to view sensor readings in real time. Participants can also record earthquakes and download earthquake data collected by their sensor or other QCN sensors. The Quake-Catcher Network combines research and outreach to improve seismic networks and to increase awareness of and participation in science-based research in K-12 schools.

  8. Rumours about the Po Valley earthquakes of 20th and 29th May 2012

    NASA Astrophysics Data System (ADS)

    La Longa, Federica; Crescimbene, Massimo; Camassi, Romano; Nostro, Concetta

    2013-04-01

    The history of rumours is as old as human history. Even in remote antiquity, rumours, gossip, and hoaxes were always in circulation, in good or bad faith, to influence human affairs. Today, with the development of mass media and the rise of the internet and social networks, rumours are ubiquitous. Earthquakes, because of their strong emotional impact and unpredictability, are among the natural events that most readily give birth to and spread rumours. For this reason, the earthquakes that occurred in the Po Valley on 20 and 29 May 2012 generated, and still continue to generate, a wide variety of rumours regarding issues related to the earthquake, its effects, its possible causes, and future predictions. As during the L'Aquila earthquake sequence in 2009, following the events of May 2012 in Emilia Romagna a complex training and information initiative was created that, at various stages between May and September 2012, involved the population, partly present in the camps, and then the school staff of the municipalities affected by the earthquake. This experience was organized and managed by the Department of Civil Protection (DPC), the National Institute of Geophysics and Volcanology (INGV), and the Emilia Romagna region, in collaboration with the Network of University Laboratories for Earthquake Engineering (RELUIS), the Emilia Romagna Regional Health Service, and voluntary civil protection organizations in the area. Within this initiative, over 240 rumours were collected and catalogued in the period June-September 2012. In this work the rumours of the Po Valley are studied in their specific characteristics, and strategies and methods to fight them are discussed. This work of collection and discussion was particularly important to promote good communication strategies and to fight the spreading of rumours. Only in this way was it possible to create a full intervention able to support both the local institutions and

  9. The Southern Kansas Seismic Network

    NASA Astrophysics Data System (ADS)

    Terra, F. M.

    2015-12-01

    Historically aseismic Harper and Sumner counties in southern Kansas experienced a dramatic increase in seismicity beginning in early 2014, coincident with the development of new oil production in the Mississippi Lime Play. In order to better understand the potential relationships between seismicity and oil development, the USGS installed a real-time telemetered seismic network in cooperation with the Kansas Geological Survey, the Kansas Corporation Commission, the Kansas Department of Health and Environment, Harper County, and the Oklahoma Geological Survey. The network began operation in March 2014 with an initial deployment of 5 NetQuakes accelerometers and by July 2014 had expanded to include 10 broadband sites. The network currently has 14 stations, all with accelerometers and 12 with broadband seismometers. The network has interstation spacing of 15-25 km and a typical azimuthal gap of 80° for well-located events. Data are continuously streamed to IRIS at 200 samples per second from most sites. Earthquake locations are augmented with additional stations from the USGS National Network, the Oklahoma Geological Survey Seismic Network, the Kansas Seismic Monitoring Network, and the Enid Oklahoma Network. Since the spring of 2014, over 7500 earthquakes have been identified with data from this network, 1400 of which have been manually timed and cataloged. Focal depths for earthquakes typically range between 2 and 7 km. The catalog is available at earthquake.usgs.gov/earthquakes/search/ under network code 'Ismpkansas'. The network recorded the largest known earthquake in Harper County, Mw 4.3, on October 2, 2014, and in Sumner County, Mw 4.9, on November 12, 2014. Recorded ground motions at the epicenter of the October earthquake were 0.70 g (PGA) and 12 cm/s (PGV). These high ground motion values agree with near-source recordings made by other USGS temporary deployments in the U.S. midcontinent, indicating a significant shaking hazard from such shallow, moderate
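    The azimuthal gap quoted above is the largest angular gap, seen from an epicenter, between azimuths to recording stations; smaller gaps mean better-constrained locations. A minimal sketch (station azimuths below are invented for illustration):

```python
# Compute the azimuthal gap: sort station azimuths, take the largest gap
# between consecutive azimuths, including the wrap-around past 360 degrees.

def azimuthal_gap(azimuths_deg):
    """Largest gap (degrees) between consecutive station azimuths."""
    az = sorted(a % 360.0 for a in azimuths_deg)
    gaps = [az[i + 1] - az[i] for i in range(len(az) - 1)]
    gaps.append(360.0 - az[-1] + az[0])  # wrap-around gap
    return max(gaps)

# A well-surrounded event (gap of 80 degrees, comparable to the network's
# typical gap) versus an event recorded on one side only.
good = azimuthal_gap([10, 90, 170, 250, 290])
bad = azimuthal_gap([0, 20, 40])
```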

  10. Modeling complexity in engineered infrastructure system: Water distribution network as an example

    NASA Astrophysics Data System (ADS)

    Zeng, Fang; Li, Xiang; Li, Ke

    2017-02-01

    The complex topology and adaptive behavior of infrastructure systems are driven by both self-organization of demand and rigid engineering solutions. Therefore, engineering complex systems requires a method balancing holism and reductionism. To model the growth of water distribution networks, a complex network model was developed combining local optimization rules and engineering considerations. Demand node generation is dynamic and follows the scaling law of urban growth. The proposed model can generate a water distribution network (WDN) similar to reported real-world WDNs in some structural properties. Comparison with different modeling approaches indicates that a realistic demand node distribution and the co-evolution of demand nodes and network are important for the simulation of real complex networks. The simulation results indicate that the efficiency of water distribution networks is exponentially affected by the urban growth pattern. In contrast, the improvement in efficiency from engineering optimization is limited and relatively insignificant. Redundancy and robustness, on the other hand, can be significantly improved through engineering methods.
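    The growth-by-local-optimization idea can be caricatured in a few lines. This is not the authors' model (which couples demand scaling laws with engineering constraints); it is a toy sketch in which each new demand node attaches to the existing node minimizing an invented cost that trades pipe length against node degree.

```python
# Toy network-growth sketch: nodes appear at random positions and each new
# node connects to the existing node with minimal (distance - alpha*degree).
# alpha = 0 gives pure nearest-neighbor attachment; alpha > 0 favors hubs.
import math
import random

def grow_network(n, alpha=0.0, seed=7):
    """Grow a spanning tree of n nodes by local cost minimization."""
    random.seed(seed)
    pos = [(random.random(), random.random()) for _ in range(n)]
    degree = [0] * n
    edges = []
    for i in range(1, n):  # node 0 seeds the network
        best = min(range(i),
                   key=lambda j: math.dist(pos[i], pos[j]) - alpha * degree[j])
        edges.append((i, best))
        degree[i] += 1
        degree[best] += 1
    return edges

edges = grow_network(30)
```

    Because each new node adds exactly one edge, the result is always a tree; real WDN models add redundant loops on top of such a backbone.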

  11. Real-time seismic monitoring of the integrated cape girardeau bridge array and recorded earthquake response

    USGS Publications Warehouse

    Celebi, M.

    2006-01-01

    This paper introduces the state-of-the-art, real-time, broadband seismic monitoring network implemented for the 1206 m [3956 ft] long, cable-stayed Bill Emerson Memorial Bridge in Cape Girardeau (MO), a new Mississippi River crossing approximately 80 km from the epicentral region of the 1811-1812 New Madrid earthquakes. The bridge was designed to withstand a strong earthquake (magnitude 7.5 or greater) during its design life. The monitoring network comprises a total of 84 channels of accelerometers deployed on the superstructure, pier foundations, and surface and downhole free-field arrays of the bridge. The paper also presents the high-quality response data obtained from the network. These data are intended for use by the owner, researchers, and engineers to assess the performance of the bridge, to check design parameters, including the comparison of dynamic characteristics with actual response, and to better design future similar bridges. Preliminary analyses of ambient and low-amplitude small-earthquake data reveal specific response characteristics of the bridge and the free field. There is evidence of coherent tower, cable, and deck interaction that sometimes results in amplified ambient motions. Motions at the lowest triaxial downhole accelerometers on both the MO and IL sides are practically free from any feedback from the bridge. Motions at the mid-level and surface downhole accelerometers are influenced significantly by feedback due to amplified ambient motions of the bridge. Copyright ASCE 2006.

  12. Sizing up earthquake damage: Differing points of view

    USGS Publications Warehouse

    Hough, S.; Bolen, A.

    2007-01-01

    When a catastrophic event strikes an urban area, many different professionals hit the ground running. Emergency responders respond, reporters report, and scientists and engineers collect and analyze data. Journalists and scientists may share interest in these events, but they have very different missions. To a journalist, earthquake damage is news. To a scientist or engineer, earthquake damage represents a valuable source of data that can help us understand how strongly the ground shook as well as how particular structures responded to the shaking.

  13. The University of Michigan's Computer-Aided Engineering Network.

    ERIC Educational Resources Information Center

    Atkins, D. E.; Olsen, Leslie A.

    1986-01-01

    Presents an overview of the Computer-Aided Engineering Network (CAEN) of the University of Michigan. Describes its arrangement of workstations, communication networks, and servers. Outlines the factors considered in hardware and software decision making. Reviews the program's impact on students. (ML)

  14. MyShake: Smartphone-based detection and analysis of Oklahoma earthquakes

    NASA Astrophysics Data System (ADS)

    Kong, Q.; Allen, R. M.; Schreier, L.

    2016-12-01

    MyShake is a global smartphone seismic network that harnesses the power of crowdsourcing (myshake.berkeley.edu). It uses the accelerometer data from phones to detect earthquake-like motion, and then uploads triggers and waveform data to a server for aggregation of the results. Since the public release in February 2016, more than 200,000 Android phone owners have installed the app, and the global network has recorded more than 300 earthquakes. In Oklahoma, about 200 active users each day provide enough data for the network to detect earthquakes and for us to perform analysis of the events. MyShake has recorded waveform data for M2.6 to M5.8 earthquakes in the state. For the September 3, 2016, M5.8 earthquake, 14 phones detected the event, and we can use the waveforms to determine event characteristics. MyShake data provide a location 3.95 km from the ANSS location and a magnitude of 5.7. We can also use MyShake data to estimate a stress drop of 7.4 MPa. MyShake is a rapidly expanding network that can grow by thousands of stations/phones in a matter of hours as public interest increases. These initial results suggest that the data will be useful for a variety of scientific studies of induced seismicity phenomena in Oklahoma, as well as having the potential to provide earthquake early warning in the future.
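    The 3.95 km offset quoted above is a great-circle distance between two epicenter estimates. A standard haversine sketch of that comparison (the coordinates below are hypothetical, not the actual MyShake or ANSS locations):

```python
# Great-circle distance between two (lat, lon) epicenter estimates using the
# haversine formula; suitable for comparing catalog locations a few km apart.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Distance in km between two points on a spherical Earth."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

# Two invented epicenter estimates 0.025 degrees of latitude apart
# (roughly 2.8 km), standing in for a crowdsourced vs. network location.
d = haversine_km(36.425, -96.929, 36.450, -96.929)
```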

  15. Earthquake: Game-based learning for 21st century STEM education

    NASA Astrophysics Data System (ADS)

    Perkins, Abigail Christine

    To play is to learn. A lack of empirical research within the game-based learning literature, however, has hindered educational stakeholders from making informed decisions about game-based learning for 21st century STEM education. In this study, I modified a research and development (R&D) process to create a collaborative-competitive educational board game illuminating elements of earthquake engineering. I oriented instruction- and game-design principles around 21st century science education to adapt the R&D process to develop the educational game, Earthquake. As part of the R&D, I evaluated Earthquake for empirical evidence to support the claim that game-play results in student gains in critical thinking, scientific argumentation, metacognitive abilities, and earthquake engineering content knowledge. I developed Earthquake with the aid of eight focus groups with varying levels of expertise in science education research, teaching, administration, and game design. After developing a functional prototype, I pilot-tested Earthquake with teacher-participants (n=14) who engaged in semi-structured interviews after their game-play. I analyzed the teacher interviews with constant comparison methodology. I used teachers' comments and feedback from content knowledge experts to integrate game modifications and improve Earthquake. I added player roles, simplified phrasing on cards, and produced an introductory video. I then administered the modified Earthquake game to two groups of high school student-participants (n=6), who played twice. To seek evidence supporting my knowledge claim, I analyzed videotapes of the students' game-play using a game-based learning checklist. My assessment of learning gains revealed increases in all categories of students' performance: critical thinking, metacognition, scientific argumentation, and earthquake engineering content knowledge acquisition. Players in both student groups improved mostly in critical thinking, having

  16. 78 FR 7464 - Large Scale Networking (LSN) ; Joint Engineering Team (JET)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-01

    ... NATIONAL SCIENCE FOUNDATION Large Scale Networking (LSN) ; Joint Engineering Team (JET) AGENCY: The Networking and Information Technology Research and Development (NITRD) National Coordination...://www.nitrd.gov/nitrdgroups/index.php?title=Joint_Engineering_Team_ (JET)#title. SUMMARY: The JET...

  17. Awareness and understanding of earthquake hazards at school

    NASA Astrophysics Data System (ADS)

    Saraò, Angela; Peruzza, Laura; Barnaba, Carla; Bragato, Pier Luigi

    2014-05-01

    Schools have a fundamental role in broadening the understanding of natural hazards and risks and in building awareness in the community. Recent earthquakes in Italy and worldwide have clearly demonstrated that poor perception of seismic hazard diminishes the effectiveness of mitigation countermeasures. For years the Seismology department of OGS has been involved in education projects and public activities to raise awareness about earthquakes. Working together with teachers, we aim at developing age-appropriate curricula to improve students' knowledge about earthquakes, seismic safety, and seismic risk reduction. Some examples of the education activities we performed during the last years are presented here. We show our experience with primary and intermediate schools where, through hands-on activities, we explain the earthquake phenomenon and its effects to kids, and we also illustrate some teaching interventions for high school students. During the past years we lectured classes, led laboratory and field activities, and organized summer stages for selected students. In the current year we are leading a project aimed at training high school students on seismic safety through a multidisciplinary approach that involves seismologists, engineers, and experts in safety procedures. To combine the objective of disseminating earthquake culture, including knowledge of past seismicity, with that of a safety culture, we use innovative educational techniques and multimedia resources. Students and teachers, under the guidance of an expert seismologist, organize a combination of hands-on activities for understanding earthquakes in the lab through cheap tools and instrumentation. At selected schools we provided the low-cost seismometers of the QuakeCatcher network (http://qcn.stanford.edu) for recording earthquakes, and we trained teachers to use these instruments in the lab and to analyze the recorded data. Within the same project we are going to train

  18. Strong ground motion from the michoacan, Mexico, earthquake.

    PubMed

    Anderson, J G; Bodin, P; Brune, J N; Prince, J; Singh, S K; Quaas, R; Onate, M

    1986-09-05

    The network of strong-motion accelerographs in Mexico includes instruments that were installed, under an international cooperative research program, at sites selected for their high potential for a large earthquake. The 19 September 1985 earthquake (magnitude 8.1) occurred in a seismic gap where an earthquake was expected. As a result, there is an excellent description of the ground motions that caused the disaster.

  19. Populating the Advanced National Seismic System Comprehensive Earthquake Catalog

    NASA Astrophysics Data System (ADS)

    Earle, P. S.; Perry, M. R.; Andrews, J. R.; Withers, M. M.; Hellweg, M.; Kim, W. Y.; Shiro, B.; West, M. E.; Storchak, D. A.; Pankow, K. L.; Huerfano Moreno, V. A.; Gee, L. S.; Wolfe, C. J.

    2016-12-01

    The U.S. Geological Survey maintains a repository of earthquake information produced by networks in the Advanced National Seismic System, with additional data from the ISC-GEM catalog and many non-U.S. networks through their contributions to the National Earthquake Information Center PDE bulletin. This Comprehensive Catalog (ComCat) provides a unified earthquake product while preserving attribution and contributor information. ComCat contains hypocenter and magnitude information with supporting phase arrival-time and amplitude measurements (when available). Higher-level products such as focal mechanisms, earthquake slip models, "Did You Feel It?" reports, ShakeMaps, PAGER impact estimates, earthquake summary posters, and tectonic summaries are also included. ComCat is updated as new events are processed, and the catalog can be accessed at http://earthquake.usgs.gov/earthquakes/search/. Throughout the past few years, a concentrated effort has been underway to expand ComCat by integrating global and regional historic catalogs. The number of earthquakes in ComCat has more than doubled in the past year, and it presently contains over 1.6 million earthquake hypocenters. We will provide an overview of the catalog contents and a detailed description of numerous tools and semi-automated quality-control procedures developed to uncover errors, including systematic magnitude biases, missing time periods, duplicate postings for the same events, and incorrectly associated events.
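    One of the quality-control checks mentioned above, flagging duplicate postings of the same event by different contributors, can be sketched as a time-and-distance window screen. The thresholds and catalog entries below are invented for illustration and are not ComCat's actual association rules.

```python
# Flag candidate duplicate events: pairs of catalog entries whose origin
# times and epicenters both fall within small windows of each other.
import math

def close_pairs(events, dt_s=16.0, dd_km=100.0):
    """Return index pairs of events within dt_s seconds and dd_km km."""
    pairs = []
    for i in range(len(events)):
        for j in range(i + 1, len(events)):
            t1, lat1, lon1 = events[i]
            t2, lat2, lon2 = events[j]
            if abs(t1 - t2) > dt_s:
                continue
            # crude flat-earth distance; adequate for a coarse screen
            dx = (lon1 - lon2) * 111.19 * math.cos(math.radians((lat1 + lat2) / 2))
            dy = (lat1 - lat2) * 111.19
            if math.hypot(dx, dy) <= dd_km:
                pairs.append((i, j))
    return pairs

# (origin time in seconds, lat, lon): events 0 and 1 mimic the same quake
# reported by two networks; event 2 is an unrelated later event.
catalog = [(1000.0, 37.00, -95.00), (1004.0, 37.05, -95.02), (9000.0, 37.00, -95.00)]
dups = close_pairs(catalog)
```

    A production check would also compare magnitudes and contributor IDs before merging or deleting either posting.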

  20. Environmentally Friendly Solution to Ground Hazards in Design of Bridges in Earthquake Prone Areas Using Timber Piles

    NASA Astrophysics Data System (ADS)

    Sadeghi, H.

    2015-12-01

    Bridges are major elements of infrastructure in all societies. Their safety and continued serviceability guarantee transportation and emergency access in urban and rural areas. However, these important structures are subject to earthquake-induced damage to their structures and foundations. The basic approaches to the proper support of foundations are (a) distribution of imposed loads to the foundation in a way that it can resist those loads without excessive settlement or failure; (b) modification of the foundation ground with various available methods; and (c) a combination of (a) and (b). Engineers face the task of designing foundations that meet all safety and serviceability criteria, but sometimes, when there are numerous environmental and financial constraints, the use of some traditional methods becomes inevitable. This paper explains the application of timber piles to improve ground resistance to liquefaction and to secure the abutments of short- to medium-length bridges in an earthquake- and liquefaction-prone area on Bohol Island, Philippines. The limitations of the common ground improvement methods (i.e., injection, dynamic compaction), because of either environmental or financial concerns, along with the abundance of timber in the area, led the engineers to use a network of timber piles behind the backwalls of the bridge abutments. The suggested timber pile network is simulated by numerical methods and its safety is examined. The results show that the compaction caused by driving the piles and the bearing capacity provided by the timbers reduce the settlement and lateral movements due to service and earthquake-induced loads.

  1. The effects of earthquake measurement concepts and magnitude anchoring on individuals' perceptions of earthquake risk

    USGS Publications Warehouse

    Celsi, R.; Wolfinbarger, M.; Wald, D.

    2005-01-01

    The purpose of this research is to explore earthquake risk perceptions in California. Specifically, we examine the risk beliefs, feelings, and experiences of lay, professional, and expert individuals to explore how risk is perceived and how risk perceptions are formed relative to earthquakes. Our results indicate that individuals tend to perceptually underestimate the degree to which earthquake (EQ) events may affect them. This occurs in large part because individuals' personal felt experience of EQ events is generally overestimated relative to the experienced magnitudes. An important finding is that individuals engage in a process of "cognitive anchoring" of their felt EQ experience toward the reported earthquake magnitude. The anchoring effect is moderated by the degree to which individuals comprehend EQ magnitude measurement and EQ attenuation. Overall, the results of this research provide us with a deeper understanding of EQ risk perceptions, especially as they relate to individuals' understanding of EQ measurement and attenuation concepts. © 2005, Earthquake Engineering Research Institute.

  2. 77 FR 58415 - Large Scale Networking (LSN); Joint Engineering Team (JET)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-20

    ... NATIONAL SCIENCE FOUNDATION Large Scale Networking (LSN); Joint Engineering Team (JET) AGENCY: The Networking and Information Technology Research and Development (NITRD) National Coordination Office (NCO..._Engineering_Team_ (JET). SUMMARY: The JET, established in 1997, provides for information sharing among Federal...

  3. 78 FR 70076 - Large Scale Networking (LSN)-Joint Engineering Team (JET)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-22

    ... NATIONAL SCIENCE FOUNDATION Large Scale Networking (LSN)--Joint Engineering Team (JET) AGENCY: The Networking and Information Technology Research and Development (NITRD) National Coordination Office (NCO..._Engineering_Team_ (JET)#title. SUMMARY: The JET, established in 1997, provides for information sharing among...

  4. The initial subevent of the 1994 Northridge, California, earthquake: Is earthquake size predictable?

    USGS Publications Warehouse

    Kilb, Debi; Gomberg, J.

    1999-01-01

    We examine the initial subevent (ISE) of the M 6.7, 1994 Northridge, California, earthquake in order to discriminate between two end-member rupture initiation models: the 'preslip' and 'cascade' models. Final earthquake size may be predictable from an ISE's seismic signature in the preslip model but not in the cascade model. In the cascade model ISEs are simply small earthquakes that can be described as purely dynamic ruptures. In this model a large earthquake is triggered by smaller earthquakes; there is no size scaling between triggering and triggered events and a variety of stress transfer mechanisms are possible. Alternatively, in the preslip model, a large earthquake nucleates as an aseismically slipping patch in which the patch dimension grows and scales with the earthquake's ultimate size; the byproduct of this loading process is the ISE. In this model, the duration of the ISE signal scales with the ultimate size of the earthquake, suggesting that nucleation and earthquake size are determined by a more predictable, measurable, and organized process. To distinguish between these two end-member models we use short-period seismograms recorded by the Southern California Seismic Network. We address questions regarding the similarity in hypocenter locations and focal mechanisms of the ISE and the mainshock. We also compare the ISE's waveform characteristics to those of small earthquakes and to the beginnings of earthquakes with a range of magnitudes. We find that the focal mechanisms of the ISE and mainshock are indistinguishable, and both events may have nucleated on and ruptured the same fault plane. These results satisfy the requirements for both models and thus do not discriminate between them. However, further tests show the ISE's waveform characteristics are similar to those of typical small earthquakes in the vicinity and, more importantly, do not scale with the mainshock magnitude. These results are more consistent with the cascade model.

  5. GIS-Based System for Post-Earthquake Crisis Management Using Cellular Network

    NASA Astrophysics Data System (ADS)

    Raeesi, M.; Sadeghi-Niaraki, A.

    2013-09-01

    Earthquakes are among the most destructive natural disasters. They happen mainly near the edges of tectonic plates, but they may happen just about anywhere, and they cannot be predicted. Quick response after a disaster such as an earthquake decreases loss of life and costs. Massive earthquakes often cause structures to collapse, trapping victims under dense rubble for long periods of time. After an earthquake destroys an area, several teams are sent to find the locations of the destroyed areas. The search and rescue phase usually continues for many days, so reducing the time needed to find survivors is very important. A Geographical Information System (GIS) can be used to decrease response time and to manage critical situations, and position estimation in a short period of time is essential. This paper proposes a GIS-based system for post-earthquake disaster management. The system relies on several mobile positioning methods, such as the cell-ID and timing advance (TA) method, the signal strength method, the angle of arrival method, the time of arrival method, and the time difference of arrival method. For quick positioning, the system can be helped by any person who has a mobile device. After positioning and identifying the critical points, the points are sent to a central site that manages the quick-response procedure. This solution establishes a quick way to manage the post-earthquake crisis.
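    The time-of-arrival idea above reduces to geometry: ranges from several base stations (range = signal travel time times propagation speed) intersect at the handset. A minimal trilateration sketch under invented local coordinates, not the paper's actual method:

```python
# Solve for a handset position (x, y) from three anchor points and measured
# ranges by linearizing the three circle equations pairwise:
# 2(xi - x1)x + 2(yi - y1)y = d1^2 - di^2 + xi^2 - x1^2 + yi^2 - y1^2

def trilaterate(p1, p2, p3, d1, d2, d3):
    """Return (x, y) consistent with the three range circles."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = d1**2 - d2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = d1**2 - d3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1  # nonzero when anchors are not collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Towers at three corners of a local metric grid; the true position (30, 40)
# produces ranges 50, sqrt(6500), and sqrt(4500).
x, y = trilaterate((0, 0), (100, 0), (0, 100), 50.0, 6500**0.5, 4500**0.5)
```

    With noisy ranges or more than three towers, the same linear system is solved in a least-squares sense instead.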

  6. Unraveling earthquake stresses: Insights from dynamically triggered and induced earthquakes

    NASA Astrophysics Data System (ADS)

    Velasco, A. A.; Alfaro-Diaz, R. A.

    2017-12-01

    Induced seismicity, earthquakes caused by anthropogenic activity, has more than doubled in the last several years as a result of practices related to oil and gas production. Furthermore, large earthquakes have been shown to promote the triggering of other events within two fault lengths (static triggering), due to static stresses caused by physical movement along the fault, and also remotely from the passage of seismic waves (dynamic triggering). Thus, in order to understand the mechanisms of earthquake failure, we investigate regions where natural, induced, and dynamically triggered events occur, and specifically target Oklahoma. We first analyze data from EarthScope's USArray Transportable Array (TA) and local seismic networks, implementing an optimized short-term/long-term average (STA/LTA) detector to develop local detection and earthquake catalogs. We then identify triggered events through statistical analysis and perform a stress analysis to gain insight into the stress states leading to triggered earthquake failure. We use our observations to determine the role of different transient stresses in contributing to natural and induced seismicity by comparing these stresses to regional stress orientations. We also delineate critically stressed regions of triggered seismicity that may indicate areas susceptible to earthquake hazards associated with sustained fluid injection in provinces of induced seismicity. Anthropogenic injection and extraction activity can alter the stress state and fluid flow within production basins. By analyzing the stress release of these ancient faults caused by dynamic stresses, we may be able to determine whether fluids are solely responsible for increased seismic activity in induced regions.
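    The STA/LTA detector mentioned above declares an event when the ratio of short-term to long-term average signal energy exceeds a threshold. A minimal sketch of the classic version (not the authors' optimized implementation; window lengths, threshold, and the synthetic trace are illustrative):

```python
import numpy as np

def sta_lta(trace, dt, sta_win=1.0, lta_win=30.0):
    """Ratio of short-term to long-term average signal energy,
    computed with cumulative sums for efficiency."""
    energy = np.asarray(trace, dtype=float) ** 2
    csum = np.concatenate(([0.0], np.cumsum(energy)))
    nsta, nlta = int(sta_win / dt), int(lta_win / dt)
    sta = (csum[nsta:] - csum[:-nsta]) / nsta
    lta = (csum[nlta:] - csum[:-nlta]) / nlta
    n = min(len(sta), len(lta))  # align windows ending at the same sample
    return sta[-n:] / np.maximum(lta[-n:], 1e-20)

# Synthetic test: 60 s of background noise with a 3 s burst 40 s in.
rng = np.random.default_rng(0)
dt = 0.01
trace = 0.1 * rng.standard_normal(6000)
trace[4000:4300] += 2.0 * rng.standard_normal(300)  # the "earthquake"
ratio = sta_lta(trace, dt)
print(ratio.max() > 5.0)  # trigger threshold of 5 is comfortably exceeded
```

    Production detectors (e.g. recursive STA/LTA with trigger-off logic and per-station tuning) elaborate on this ratio but keep the same core idea.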

  7. OMG Earthquake! Can Twitter improve earthquake response?

    NASA Astrophysics Data System (ADS)

    Earle, P. S.; Guy, M.; Ostrum, C.; Horvath, S.; Buckmaster, R. A.

    2009-12-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public, text messages, can augment its earthquake response products and the delivery of hazard information. The goal is to gather near real-time, earthquake-related messages (tweets) and provide geo-located earthquake detections and rough maps of the corresponding felt areas. Twitter and other social Internet technologies are providing the general public with anecdotal earthquake hazard information before scientific information has been published from authoritative sources. People local to an event often publish information within seconds via these technologies. In contrast, depending on the location of the earthquake, scientific alerts take between 2 to 20 minutes. Examining the tweets following the March 30, 2009, M4.3 Morgan Hill earthquake shows it is possible (in some cases) to rapidly detect and map the felt area of an earthquake using Twitter responses. Within a minute of the earthquake, the frequency of “earthquake” tweets rose above the background level of less than 1 per hour to about 150 per minute. Using the tweets submitted in the first minute, a rough map of the felt area can be obtained by plotting the tweet locations. Mapping the tweets from the first six minutes shows observations extending from Monterey to Sacramento, similar to the perceived shaking region mapped by the USGS “Did You Feel It” system. The tweets submitted after the earthquake also provided (very) short first-impression narratives from people who experienced the shaking. Accurately assessing the potential and robustness of a Twitter-based system is difficult because only tweets spanning the previous seven days can be searched, making a historical study impossible. We have, however, been archiving tweets for several months, and it is clear that significant limitations do exist. The main drawback is the lack of quantitative information
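    The detection idea described, tweet frequency rising from a background of under 1 per hour to about 150 per minute, amounts to simple rate-spike detection on binned timestamps. A hedged sketch (not the USGS system; the bin size, background rate, and threshold factor are illustrative assumptions):

```python
from collections import Counter

def detect_spike(times_s, bin_s=60, background_per_s=1 / 3600.0, factor=100):
    """Bin message timestamps and return the start time (s) of the first
    bin whose count exceeds `factor` times the expected background count,
    or None if no bin qualifies."""
    counts = Counter(int(t // bin_s) for t in times_s)
    threshold = factor * background_per_s * bin_s
    for b in sorted(counts):
        if counts[b] > threshold:
            return b * bin_s
    return None

# Background of ~1 "earthquake" tweet per hour, then 150 tweets in the
# minute starting at t = 7200 s (numbers mirror the Morgan Hill example).
tweets = [0.0, 3600.0] + [7200.0 + 0.4 * i for i in range(150)]
print(detect_spike(tweets))  # 7200
```

    Plotting the geolocations of the tweets in the triggering bin would then give the rough felt-area map the abstract describes.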

  8. Introduction to the special issue on the 2004 Parkfield earthquake and the Parkfield earthquake prediction experiment

    USGS Publications Warehouse

    Harris, R.A.; Arrowsmith, J.R.

    2006-01-01

    The 28 September 2004 M 6.0 Parkfield earthquake, a long-anticipated event on the San Andreas fault, is the world's best recorded earthquake to date, with state-of-the-art data obtained from geologic, geodetic, seismic, magnetic, and electrical field networks. This has allowed the preearthquake and postearthquake states of the San Andreas fault in this region to be analyzed in detail. Analyses of these data provide views into the San Andreas fault that show a complex geologic history, fault geometry, rheology, and response of the nearby region to the earthquake-induced ground movement. Although aspects of San Andreas fault zone behavior in the Parkfield region can be modeled simply over geological time frames, the Parkfield Earthquake Prediction Experiment and the 2004 Parkfield earthquake indicate that predicting the fine details of future earthquakes is still a challenge. Instead of a deterministic approach, forecasting future damaging behavior, such as that caused by strong ground motions, will likely continue to require probabilistic methods. However, the Parkfield Earthquake Prediction Experiment and the 2004 Parkfield earthquake have provided ample data to understand most of what did occur in 2004, culminating in significant scientific advances.

  9. Computer systems for automatic earthquake detection

    USGS Publications Warehouse

    Stewart, S.W.

    1974-01-01

    U.S. Geological Survey seismologists in Menlo Park, California, are utilizing the speed, reliability, and efficiency of minicomputers to monitor seismograph stations and to automatically detect earthquakes. An earthquake detection computer system, believed to be the only one of its kind in operation, automatically reports about 90 percent of all local earthquakes recorded by a network of over 100 central California seismograph stations. The system also monitors the stations for signs of malfunction or abnormal operation. Before the automatic system was put in operation, all of the earthquakes recorded had to be detected by manually searching the records, a time-consuming process. With the automatic detection system, the stations are efficiently monitored continuously.

  10. DEFINING THE PLAYERS IN HIGHER-ORDER NETWORKS: PREDICTIVE MODELING FOR REVERSE ENGINEERING FUNCTIONAL INFLUENCE NETWORKS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDermott, Jason E.; Costa, Michelle N.; Stevens, S.L.

    A difficult problem, currently growing rapidly due to the sharp increase in the amount of high-throughput data available for many systems, is that of determining useful and informative causative influence networks. These networks can be used to predict behavior given observation of a small number of components, predict behavior at a future time point, or identify components that are critical to the functioning of the system under particular conditions. In these endeavors, incorporating observations of systems from a wide variety of viewpoints can be particularly beneficial, but this has often been undertaken with the objective of inferring networks that are generally applicable. The focus of the current work is to integrate both general observations and measurements taken for a particular pathology, that of ischemic stroke, to provide improved ability to produce useful predictions of systems behavior. A number of hybrid approaches have recently been proposed for network generation in which the Gene Ontology is used to filter or enrich network links inferred from gene expression data through reverse engineering methods. These approaches have been shown to improve the biological plausibility of the inferred relationships, but still treat knowledge-based and machine-learning inferences as incommensurable inputs. In this paper, we explore how further improvements may be achieved through a full integration of network inference insights achieved through application of the Gene Ontology and reverse engineering methods, with specific reference to the construction of dynamic models of transcriptional regulatory networks. We show that probabilistically integrating three approaches to network construction, one based on reverse-engineering from conditional transcriptional data, one based on reverse-engineering from in situ hybridization data, and another based on functional associations derived from the Gene Ontology, can improve results of clustering as

  11. The next new Madrid earthquake

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Atkinson, W.

    1988-01-01

    Scientists who specialize in the study of Mississippi Valley earthquakes say that the region is overdue for a powerful tremor that will cause major damage and undoubtedly some casualties. The inevitability of a future quake and the lack of preparation by both individuals and communities provided the impetus for this book. It brings together applicable information from many disciplines: history, geology and seismology, engineering, zoology, politics and community planning, economics, environmental science, sociology, and psychology and mental health, to provide a perspective of the myriad impacts of a major earthquake on the Mississippi Valley. The author addresses such basic questions as: What, actually, are earthquakes? How do they occur? Can they be predicted, perhaps even prevented? He also addresses those steps that individuals can take to improve their chances for survival both during and after an earthquake.

  12. Earthquake locations determined by the Southern Alaska seismograph network for October 1971 through May 1989

    USGS Publications Warehouse

    Fogleman, Kent A.; Lahr, John C.; Stephens, Christopher D.; Page, Robert A.

    1993-01-01

    This report describes the instrumentation and evolution of the U.S. Geological Survey's regional seismograph network in southern Alaska, provides phase and hypocenter data for seismic events from October 1971 through May 1989, reviews the location methods used, and discusses the completeness of the catalog and the accuracy of the computed hypocenters. Included are arrival time data for explosions detonated under the Trans-Alaska Crustal Transect (TACT) in 1984 and 1985. The U.S. Geological Survey (USGS) operated a regional network of seismographs in southern Alaska from 1971 to the mid-1990s. The principal purpose of this network was to record seismic data to be used to precisely locate earthquakes in the seismic zones of southern Alaska, delineate seismically active faults, assess seismic risks, document potential premonitory earthquake phenomena, investigate current tectonic deformation, and study the structure and physical properties of the crust and upper mantle. A task fundamental to all of these goals was the routine cataloging of parameters for earthquakes located within and adjacent to the seismograph network. The initial network of 10 stations, 7 around Cook Inlet and 3 near Valdez, was installed in 1971. In subsequent summers, additions or modifications to the network were made. By the fall of 1973, 26 stations extended from western Cook Inlet to eastern Prince William Sound, and 4 stations were located to the east between Cordova and Yakutat. A year later 20 additional stations were installed. Thirteen of these were placed along the eastern Gulf of Alaska with support from the National Oceanic and Atmospheric Administration (NOAA) under the Outer Continental Shelf Environmental Assessment Program to investigate the seismicity of the outer continental shelf, a region of interest for oil exploration. Since then the region covered by the network remained relatively fixed while efforts were made to make the stations more reliable through improved electronic

  13. Did we really #prayfornepal? Instagram posts as a massive digital funeral in Nepal earthquake aftermath

    NASA Astrophysics Data System (ADS)

    Kamil, P. I.; Pratama, A. J.; Hidayatulloh, A.

    2016-05-01

    Social media has been part of our daily life for years, and it has become a treasure trove of data for social scientists to mine. Using our own data-mining engine we downloaded 1500 Instagram posts related to the Nepal earthquake of April 2015, a disaster that caused tremendous loss of life and infrastructure. We expected that social media would be a place where people respond and express themselves emotionally in light of a disaster of such massive scale, a "megadeath" event. We ended up with data on 1017 posts tracked with the hashtag #prayfornepal, consisting of each post's date, time, geolocation, image, post ID, username and ID, caption, and hashtags. We categorized the posts into 7 categories and found that most of the photos (30.29%) are related to Nepal but not directly related to the disaster, which reflects death imprint, one of the psychosocial responses to a megadeath event. Further analyses compared each photo category, including geolocation, the hashtag network, and the caption network, visualized with ArcGIS, NodeXL, Gephi, and our own word-cloud engine, to examine other digital reactions to the Nepal earthquake on Instagram. This study gives an overview of how a community reacts to a disaster in the digital world and how that reaction can be utilized for disaster response and awareness.

  14. Exploring geological and socio-demographic factors associated with under-five mortality in the Wenchuan earthquake using neural network model.

    PubMed

    Hu, Yi; Wang, Jinfeng; Li, Xiaohong; Ren, Dan; Driskell, Luke; Zhu, Jun

    2012-01-01

    On 12 May 2008, a devastating earthquake occurred in Sichuan Province, China, taking tens of thousands of lives and destroying the homes of millions of people. Among the large number of dead or missing were children, particularly children aged less than five years old, a fact that drew significant media attention. To obtain relevant information to aid further studies and future preventive measures, a neural network model was proposed to explore geological and socio-demographic factors associated with earthquake-related child mortality. Sensitivity analysis showed that topographic slope (mean 35.76%), geomorphology (mean 24.18%), earthquake intensity (mean 13.68%), and average income (mean 11%) contributed most to child mortality. These findings could provide clues to researchers for further studies and to policy makers in deciding how and where preventive measures and corresponding policies should be implemented in the reconstruction of communities.

  15. Earthquakes and Volcanic Processes at San Miguel Volcano, El Salvador, Determined from a Small, Temporary Seismic Network

    NASA Astrophysics Data System (ADS)

    Hernandez, S.; Schiek, C. G.; Zeiler, C. P.; Velasco, A. A.; Hurtado, J. M.

    2008-12-01

    The San Miguel volcano lies within the Central American volcanic chain in eastern El Salvador. The volcano has experienced at least 29 eruptions with a Volcanic Explosivity Index (VEI) of 2. Since 1970, however, eruptions have decreased in intensity to an average of VEI 1, with the most recent eruption occurring in 2002. Eruptions at San Miguel volcano consist mostly of central vent and phreatic eruptions. A critical challenge related to the explosive nature of this volcano is to understand the relationships between precursory surface deformation, earthquake activity, and volcanic activity. In this project, we seek to determine sub-surface structures within and near the volcano, relate the local deformation to these structures, and better understand the hazard that the volcano presents in the region. To accomplish these goals, we deployed a six-station, broadband seismic network around San Miguel volcano in collaboration with researchers from Servicio Nacional de Estudios Territoriales (SNET). This network operated continuously from 23 March 2007 to 15 January 2008 and had a high data recovery rate. The data were processed to determine earthquake locations, magnitudes, and, for some of the larger events, focal mechanisms. We obtained high-precision locations using a double-difference approach and identified at least 25 events near the volcano. Ongoing analysis will seek to identify earthquake types (e.g., long period, tectonic, and hybrid events) that occurred in the vicinity of San Miguel volcano. These results will be combined with radar interferometric measurements of surface deformation in order to determine the relationship between surface and subsurface processes at the volcano.

  16. Earthquake location in island arcs

    USGS Publications Warehouse

    Engdahl, E.R.; Dewey, J.W.; Fujita, K.

    1982-01-01

    A comprehensive data set of selected teleseismic P-wave arrivals and local-network P- and S-wave arrivals from large earthquakes occurring at all depths within a small section of the central Aleutians is used to examine the general problem of earthquake location in island arcs. Reference hypocenters for this special data set are determined for shallow earthquakes from local-network data and for deep earthquakes from combined local and teleseismic data by joint inversion for structure and location. The high-velocity lithospheric slab beneath the central Aleutians may displace hypocenters that are located using spherically symmetric Earth models; the amount of displacement depends on the position of the earthquakes with respect to the slab and on whether local or teleseismic data are used to locate the earthquakes. Hypocenters for trench and intermediate-depth events appear to be minimally biased by the effects of slab structure on rays to teleseismic stations. However, locations of intermediate-depth events based on only local data are systematically displaced southwards, the magnitude of the displacement being proportional to depth. Shallow-focus events along the main thrust zone, although well located using only local-network data, are severely shifted northwards and deeper, with displacements as large as 50 km, by slab effects on teleseismic travel times. Hypocenters determined by a method that utilizes seismic ray tracing through a three-dimensional velocity model of the subduction zone, derived by thermal modeling, are compared to results obtained by the method of joint hypocenter determination (JHD) that formally assumes a laterally homogeneous velocity model over the source region and treats all raypath anomalies as constant station corrections to the travel-time curve. The ray-tracing method has the theoretical advantage that it accounts for variations in travel-time anomalies within a group of events distributed over a sizable region of a dipping, high

  17. Earthquake casualty models within the USGS Prompt Assessment of Global Earthquakes for Response (PAGER) system

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David J.; Earle, Paul S.; Porter, Keith A.; Hearne, Mike

    2011-01-01

    Since the launch of the USGS’s Prompt Assessment of Global Earthquakes for Response (PAGER) system in fall of 2007, the time needed for the U.S. Geological Survey (USGS) to determine and comprehend the scope of any major earthquake disaster anywhere in the world has been dramatically reduced to less than 30 min. PAGER alerts consist of estimated shaking hazard from the ShakeMap system, estimates of population exposure at various shaking intensities, and a list of the most severely shaken cities in the epicentral area. These estimates help government, scientific, and relief agencies to guide their responses in the immediate aftermath of a significant earthquake. To account for wide variability and uncertainty associated with inventory, structural vulnerability and casualty data, PAGER employs three different global earthquake fatality/loss computation models. This article describes the development of the models and demonstrates the loss estimation capability for earthquakes that have occurred since 2007. The empirical model relies on country-specific earthquake loss data from past earthquakes and makes use of calibrated casualty rates for future prediction. The semi-empirical and analytical models are engineering-based and rely on complex datasets including building inventories, time-dependent population distributions within different occupancies, the vulnerability of regional building stocks, and casualty rates given structural collapse.

  18. Detailed source process of the 2007 Tocopilla earthquake.

    NASA Astrophysics Data System (ADS)

    Peyrat, S.; Madariaga, R.; Campos, J.; Asch, G.; Favreau, P.; Bernard, P.; Vilotte, J.

    2008-05-01

    We investigated the detailed rupture process of the 14 November 2007 Tocopilla earthquake (Mw 7.7) and of the main aftershocks that occurred in the southern part of the North Chile seismic gap using strong-motion data. The earthquake happened in the middle of the permanent broadband and strong-motion network IPOC, newly installed by GFZ and IPGP, and of a digital strong-motion network operated by the University of Chile. The Tocopilla earthquake is the latest large subduction thrust earthquake to occur in this gap since the major 1877 Iquique earthquake, which produced a destructive tsunami. The Arequipa (2001) and Antofagasta (1995) earthquakes already ruptured the northern and southern parts of the gap, and the intraplate intermediate-depth Tarapaca earthquake (2005) may have changed the tectonic loading of this part of the Peru-Chile subduction zone. For large earthquakes, the depth of the seismic rupture is bounded by the depth of the seismogenic zone. What controls the horizontal extent of the rupture for large earthquakes is less clear. Factors that influence the extent of the rupture include fault geometry, variations of material properties, and stress heterogeneities inherited from the previous rupture history. For subduction zones where structures are not well known, what may have stopped the rupture is not obvious. One crucial problem raised by the Tocopilla earthquake is to understand why this earthquake did not extend farther north and, to the south, what role is played by the Mejillones Peninsula, which seems to act as a barrier. The focal mechanism was determined using teleseismic waveform inversion and a geodetic analysis (cf. Campos et al.; Bejarpi et al., in the same session). We studied the detailed source process using the available strong-motion data. This earthquake ruptured the interplate seismic zone over more than 150 km and generated several large aftershocks, mainly located south of the rupture area. The strong-motion data clearly show two S

  19. Recorded earthquake responses from the integrated seismic monitoring network of the Atwood Building, Anchorage, Alaska

    USGS Publications Warehouse

    Celebi, M.

    2006-01-01

    An integrated seismic monitoring system with a total of 53 channels of accelerometers is now operating in and at the nearby free-field site of the 20-story steel-framed Atwood Building in highly seismic Anchorage, Alaska. The building has a single-story basement and a reinforced concrete foundation without piles. The monitoring system comprises a 32-channel structural array and a 21-channel site array. Accelerometers are deployed on 10 levels of the building to assess translational, torsional, and rocking motions, interstory drift (displacement) between selected pairs of adjacent floors, and average drift between floors. The site array, located approximately a city block from the building, comprises seven triaxial accelerometers, one at the surface and six in boreholes ranging in depth from 15 to 200 feet (~5-60 meters). The arrays have already recorded low-amplitude shaking responses of the building and the site caused by numerous earthquakes at distances ranging from tens to a couple of hundred kilometers. Data from an earthquake that occurred 186 km away traces the propagation of waves from the deepest borehole to the roof of the building in approximately 0.5 seconds. Fundamental structural frequencies [0.58 Hz (NS) and 0.47 Hz (EW)], low damping percentages (2-4%), mode coupling, and beating effects are identified. The fundamental site frequency at approximately 1.5 Hz is close to the second modal frequencies (1.83 Hz NS and 1.43 Hz EW) of the building, which may cause resonance of the building. Additional earthquakes prove repeatability of these characteristics; however, stronger shaking may alter these conclusions. © 2006, Earthquake Engineering Research Institute.

  20. Application of τc*Pd for identifying damaging earthquakes for earthquake early warning

    NASA Astrophysics Data System (ADS)

    Huang, P. L.; Lin, T. L.; Wu, Y. M.

    2014-12-01

    Earthquake early warning systems (EEWS) are an effective approach to mitigating earthquake damage. In this study, we used seismic records from the Kiban Kyoshin network (KiK-net), because it has dense station coverage and borehole strong-motion seismometers co-located with free-surface strong-motion seismometers. We used inland earthquakes with moment magnitude (Mw) from 5.0 to 7.3 between 1998 and 2012. We chose 135 events and 10,950 strong-motion accelerograms recorded by 696 accelerographs. Both the free-surface and the borehole data are used to calculate τc and Pd. The results show that τc*Pd correlates well with PGV and is a robust parameter for assessing the damage potential of an earthquake. We propose that the value of τc*Pd determined within seconds of the P-wave arrival could serve as a threshold for the on-site type of EEW.
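    Using the standard definitions from the early warning literature, τc = 2π·sqrt(∫u² dt / ∫u̇² dt) over the initial post-P window and Pd is the peak absolute displacement in that window. A minimal sketch of the computation; the 3 s window, sampling interval, and sinusoidal test signal are illustrative assumptions, not this study's processing parameters:

```python
import numpy as np

def tau_c_and_pd(disp, vel):
    """tau_c = 2*pi*sqrt(sum(u^2) / sum(v^2)) over the initial P-wave
    window (the dt factors cancel in the ratio); Pd is the peak absolute
    displacement in the same window."""
    disp, vel = np.asarray(disp), np.asarray(vel)
    r = np.sum(vel ** 2) / np.sum(disp ** 2)
    return 2.0 * np.pi / np.sqrt(r), np.max(np.abs(disp))

# Sanity check on a 2 Hz sinusoid: tau_c should recover the 0.5 s period
# and Pd the displacement amplitude.
dt = 0.01
t = np.arange(0.0, 3.0, dt)                 # a 3 s post-P window
u = 0.02 * np.sin(2.0 * np.pi * 2.0 * t)    # displacement, metres
v = np.gradient(u, dt)                      # velocity by finite differences
tau_c, pd = tau_c_and_pd(u, v)
print(tau_c, pd)  # ~0.5 s and ~0.02 m
```

    An on-site trigger of the kind proposed here would then compare the product tau_c * pd against an empirically calibrated damage threshold.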

  1. Remote Imaging of Earthquake Characteristics Along Oceanic Transforms

    NASA Astrophysics Data System (ADS)

    Cleveland, M.; Ammon, C. J.

    2014-12-01

    Compared with subduction and continental transform systems, many characteristics of oceanic transform faults (OTFs) are better defined (first-order structure and composition, thermal properties, etc.). Still, many aspects of earthquake behavior along OTFs remain poorly understood as a result of their relative remoteness. But the substantial aseismic deformation (averaging roughly 85%) that occurs along OTFs, and the implied interaction of aseismic with seismic deformation, offer an opportunity to explore fundamental earthquake nucleation and rupture processes. However, the study of OTF earthquake properties is not easy because these faults are often located in remote regions lacking nearby seismic networks. Thus, many standard network-based seismic approaches are infeasible, but some can be adapted to the effort. For example, double-difference methods applied to cross-correlation-measured Rayleigh-wave time shifts are an effective tool for obtaining greatly improved relative epicentroid locations, origin-time shifts, and relative event magnitudes for earthquakes in remote regions. The same comparative waveform measurements can provide insight into the rupture directivity of the larger OTF events. In this study, we calculate improved relative earthquake locations and magnitudes of earthquakes along the Blanco Fracture Zone in the northeast Pacific Ocean and compare and contrast that work with a study of the more remote Menard Transform Fault (MTF), located in the southeast Pacific Ocean. For the Blanco, we work exclusively with Rayleigh (R1) observations, exploiting the dense networks in the northern hemisphere. For the MTF, we combine R1 with Love (G1) observations to map and analyze the distribution of strong asperities along this remote, 200-km-long fault. Specifically, we attempt to better define the relationship between observed near-transform normal and vertical strike-slip earthquakes in the vicinity of the MTF. We test our ability to use distant observations (the

  2. G-FAST Early Warning Potential for Great Earthquakes in Chile

    NASA Astrophysics Data System (ADS)

    Crowell, B.; Schmidt, D. A.; Baker, B. I.; Bodin, P.; Vidale, J. E.

    2016-12-01

    The importance of GNSS-based earthquake early warning for modeling large earthquakes has been studied extensively over the past decade, and several such systems are currently under development. In the Pacific Northwest, we have developed G-FAST, a GNSS-based earthquake early warning module for eventual inclusion in the U.S. West Coast-wide ShakeAlert system. We have also created a test system that allows us to replay past and synthetic earthquakes to identify problems with both the network architecture and the algorithms. Between 2010 and 2016, there were seven M > 8 earthquakes across the globe, of which three struck offshore Chile: the 27 February 2010 Mw 8.8 Maule, the 1 April 2014 Mw 8.2 Iquique, and the 16 September 2015 Mw 8.3 Illapel. Subsequent to these events, the Chilean national GNSS network operated by the Centro Sismologico Nacional (http://www.sismologia.cl/) greatly expanded to over 150 continuous GNSS stations, providing the best GNSS recordings of great earthquakes outside of Japan. Here we report on retrospective G-FAST performance for those three great earthquakes in Chile. We discuss the interplay of location errors, latency, and data completeness with respect to the precision and timing of G-FAST earthquake source alerts, as well as the computational demands of the system.

  3. CellNet: network biology applied to stem cell engineering.

    PubMed

    Cahan, Patrick; Li, Hu; Morris, Samantha A; Lummertz da Rocha, Edroaldo; Daley, George Q; Collins, James J

    2014-08-14

    Somatic cell reprogramming, directed differentiation of pluripotent stem cells, and direct conversions between differentiated cell lineages represent powerful approaches to engineer cells for research and regenerative medicine. We have developed CellNet, a network biology platform that more accurately assesses the fidelity of cellular engineering than existing methodologies and generates hypotheses for improving cell derivations. Analyzing expression data from 56 published reports, we found that cells derived via directed differentiation more closely resemble their in vivo counterparts than products of direct conversion, as reflected by the establishment of target cell-type gene regulatory networks (GRNs). Furthermore, we discovered that directly converted cells fail to adequately silence expression programs of the starting population and that the establishment of unintended GRNs is common to virtually every cellular engineering paradigm. CellNet provides a platform for quantifying how closely engineered cell populations resemble their target cell type and a rational strategy to guide enhanced cellular engineering. Copyright © 2014 Elsevier Inc. All rights reserved.

  4. CellNet: Network Biology Applied to Stem Cell Engineering

    PubMed Central

    Cahan, Patrick; Li, Hu; Morris, Samantha A.; da Rocha, Edroaldo Lummertz; Daley, George Q.; Collins, James J.

    2014-01-01

    SUMMARY Somatic cell reprogramming, directed differentiation of pluripotent stem cells, and direct conversions between differentiated cell lineages represent powerful approaches to engineer cells for research and regenerative medicine. We have developed CellNet, a network biology platform that more accurately assesses the fidelity of cellular engineering than existing methodologies and generates hypotheses for improving cell derivations. Analyzing expression data from 56 published reports, we found that cells derived via directed differentiation more closely resemble their in vivo counterparts than products of direct conversion, as reflected by the establishment of target cell-type gene regulatory networks (GRNs). Furthermore, we discovered that directly converted cells fail to adequately silence expression programs of the starting population, and that the establishment of unintended GRNs is common to virtually every cellular engineering paradigm. CellNet provides a platform for quantifying how closely engineered cell populations resemble their target cell type and a rational strategy to guide enhanced cellular engineering. PMID:25126793

  5. Absolute earthquake locations using 3-D versus 1-D velocity models below a local seismic network: example from the Pyrenees

    NASA Astrophysics Data System (ADS)

    Theunissen, T.; Chevrot, S.; Sylvander, M.; Monteiller, V.; Calvet, M.; Villaseñor, A.; Benahmed, S.; Pauchet, H.; Grimaud, F.

    2018-03-01

    Local seismic networks are usually designed so that earthquakes are located inside them (primary azimuthal gap <180°) and close to the seismic stations (0-100 km). With these local or near-regional networks (0°-5°), many seismological observatories still routinely locate earthquakes using 1-D velocity models. Moving towards 3-D location algorithms requires robust 3-D velocity models. This work takes advantage of seismic monitoring spanning more than 30 yr in the Pyrenean region. We investigate the influence on earthquake locations of a well-designed 3-D model with station corrections, including basin structure and the geometry of the Mohorovicic discontinuity. In the most favourable cases (GAP < 180° and distance to the first station lower than 15 km), results using 1-D velocity models are very similar to 3-D results. The horizontal accuracy in the 1-D case can be higher than in the 3-D case if lateral variations in the structure are not properly resolved. Depth is systematically better resolved in the 3-D model, even on the boundaries of the seismic network (GAP > 180° and distance to the first station greater than 15 km). Errors on velocity models and the accuracy of absolute earthquake locations are assessed against a reference data set made of active seismic experiments, quarry blasts and passive temporary experiments. Solutions and uncertainties are estimated using the probabilistic approach of the NonLinLoc (NLLoc) software based on Equal Differential Time. Some updates have been added to NLLoc to better focus on the final solution (outlier exclusion, multiscale grid search, S-phase weighting). Errors in the probabilistic approach are defined to take into account errors on velocity models and on arrival times. The seismicity in the final 3-D catalogue is located with a horizontal uncertainty of about 2.0 ± 1.9 km and a vertical uncertainty of about 3.0 ± 2.0 km.
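
    The Equal Differential Time (EDT) criterion mentioned above can be sketched as a toy grid search. This is illustrative only: the station names, travel times, and the plain L1 pair sum are made up, while NonLinLoc's actual implementation uses 3-D travel-time grids and a probabilistic EDT likelihood.

```python
import numpy as np

def edt_misfit(grid_tt, picks):
    """Equal Differential Time misfit over candidate source nodes.
    At the true source, observed arrival-time differences between
    station pairs equal predicted travel-time differences, so the
    origin time cancels out of the location problem.
    grid_tt: station -> predicted travel time at each grid node
    picks:   station -> observed arrival time"""
    stations = list(picks)
    misfit = np.zeros(len(next(iter(grid_tt.values()))))
    for i in range(len(stations)):
        for j in range(i + 1, len(stations)):
            a, b = stations[i], stations[j]
            obs = picks[a] - picks[b]        # observed difference (scalar)
            pred = grid_tt[a] - grid_tt[b]   # predicted difference per node
            misfit += np.abs(obs - pred)     # L1 sum over station pairs
    return misfit

# Toy example: 3 grid nodes, 3 stations; the source sits at node 1
# (the origin time, 10 s here, cancels and never has to be estimated).
grid_tt = {"STA": np.array([1.0, 2.0, 3.0]),
           "STB": np.array([2.0, 2.5, 3.0]),
           "STC": np.array([4.0, 3.0, 2.0])}
picks = {"STA": 12.0, "STB": 12.5, "STC": 13.0}
m = edt_misfit(grid_tt, picks)
print(int(np.argmin(m)))  # 1
```

    Because only arrival-time differences enter the misfit, an outlier pick degrades only the pairs it participates in, which is what makes the EDT approach robust to bad picks.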

  6. The Role of Computer Networks in Aerospace Engineering.

    ERIC Educational Resources Information Center

    Bishop, Ann Peterson

    1994-01-01

    Presents selected results from an empirical investigation into the use of computer networks in aerospace engineering based on data from a national mail survey. The need for user-based studies of electronic networking is discussed, and a copy of the questionnaire used in the survey is appended. (Contains 46 references.) (LRW)

  7. A Viscoelastic earthquake simulator with application to the San Francisco Bay region

    USGS Publications Warehouse

    Pollitz, Fred F.

    2009-01-01

    Earthquake simulation on synthetic fault networks carries great potential for characterizing the statistical patterns of earthquake occurrence. I present an earthquake simulator based on elastic dislocation theory. It accounts for the effects of interseismic tectonic loading, static stress steps at the time of earthquakes, and postearthquake stress readjustment through viscoelastic relaxation of the lower crust and mantle. Earthquake rupture initiation and termination are determined with a Coulomb failure stress criterion and the static cascade model. The simulator is applied to interacting multifault systems: one, a synthetic two-fault network, and the other, a fault network representative of the San Francisco Bay region. The faults are discretized both along strike and along dip and can accommodate both strike slip and dip slip. Stress and seismicity functions are evaluated over 30,000 yr trial time periods, resulting in a detailed statistical characterization of the fault systems. Seismicity functions such as the coefficient of variation and a- and b-values exhibit systematic patterns with respect to simple model parameters. This suggests that reliable estimation of the controlling parameters of an earthquake simulator is a prerequisite to the interpretation of its output in terms of seismic hazard.
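
    The Coulomb failure stress criterion that governs rupture initiation and cascading in such simulators can be sketched as follows. This is a minimal illustration; the friction coefficient and stress values are hypothetical, not parameters from the paper.

```python
def coulomb_failure_stress(shear, normal, mu=0.6, pore_pressure=0.0):
    """Coulomb failure stress: CFS = tau - mu * (sigma_n - p).
    Normal stress is taken positive in compression; failure is
    promoted as CFS rises toward the fault patch's strength."""
    return shear - mu * (normal - pore_pressure)

def patch_fails(shear, normal, strength, mu=0.6):
    """A fault patch slips when its Coulomb stress reaches its strength;
    in a static cascade model the resulting stress step is transferred
    to neighbouring patches, which may then fail in turn."""
    return coulomb_failure_stress(shear, normal, mu) >= strength

print(coulomb_failure_stress(10.0, 5.0))  # 7.0
```

    In the simulator described above, interseismic loading slowly raises CFS on each patch, coseismic stress steps raise or lower it abruptly, and viscoelastic relaxation adjusts it over the following years.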

  8. Conversion of Local and Surface-Wave Magnitudes to Moment Magnitude for Earthquakes in the Chinese Mainland

    NASA Astrophysics Data System (ADS)

    Li, X.; Gao, M.

    2017-12-01

    The magnitude of an earthquake is one of its basic parameters and is a measure of its scale. It plays a significant role in seismology and earthquake engineering research, particularly in calculations of the seismic rate and b value in earthquake prediction and seismic hazard analysis. However, several magnitude types currently used in seismology research, such as local magnitude (ML), surface-wave magnitude (MS), and body-wave magnitude (MB), share a common limitation: the magnitude saturation phenomenon. The problem of magnitude saturation was solved by a formula for calculating the moment magnitude (MW) from the seismic moment, which describes the seismic source strength, and the moment magnitude is now very commonly used in seismology research. In China, however, the earthquake scale is primarily based on local and surface-wave magnitudes. In the present work, we studied the empirical relationships between moment magnitude (MW) and local magnitude (ML) as well as surface-wave magnitude (MS) in the Chinese Mainland. The China Earthquake Networks Center (CENC) ML catalog, China Seismograph Network (CSN) MS catalog, ANSS Comprehensive Earthquake Catalog (ComCat), and Global Centroid Moment Tensor (GCMT) catalog are adopted to regress the relationships using the orthogonal regression method. The obtained relationships are as follows: MW=0.64+0.87MS; MW=1.16+0.75ML. Therefore, in China, if the moment magnitude of an earthquake is not reported by any agency in the world, the equations above can be used to convert ML and MS to MW. These relationships are important because they will allow the China earthquake catalogs to be used more effectively for seismic hazard analysis, earthquake prediction, and other seismology research. We also computed the relationships of log Mo to MS and ML (where Mo is the seismic moment) by linear regression using the Global Centroid Moment Tensor catalog. The obtained relationships are as follows: logMo=18
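
    The reported conversion relations can be applied directly; a minimal sketch (the function names are ours):

```python
def mw_from_ms(ms):
    """Moment magnitude from surface-wave magnitude,
    MW = 0.64 + 0.87 MS (orthogonal regression, Chinese Mainland)."""
    return 0.64 + 0.87 * ms

def mw_from_ml(ml):
    """Moment magnitude from local magnitude,
    MW = 1.16 + 0.75 ML (orthogonal regression, Chinese Mainland)."""
    return 1.16 + 0.75 * ml

print(round(mw_from_ms(6.0), 2))  # 5.86
print(round(mw_from_ml(4.0), 2))  # 4.16
```

    Note that the two relations apply to catalog magnitudes determined with the Chinese agencies' practices; they should not be reused for other regions without checking.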

  9. Repeating Earthquakes Following an Mw 4.4 Earthquake Near Luther, Oklahoma

    NASA Astrophysics Data System (ADS)

    Clements, T.; Keranen, K. M.; Savage, H. M.

    2015-12-01

    An Mw 4.4 earthquake on April 16, 2013 near Luther, OK was one of the earliest M4+ earthquakes in central Oklahoma, following the Prague sequence in 2011. A network of four local broadband seismometers deployed within a day of the Mw 4.4 event, along with six Oklahoma NetQuakes stations, recorded more than 500 aftershocks in the two weeks following the Luther earthquake. Here we use HypoDD (Waldhauser & Ellsworth, 2000) and waveform cross-correlation to obtain precise aftershock locations. The location uncertainty, calculated using the SVD method in HypoDD, is ~15 m horizontally and ~35 m vertically. The earthquakes define a near-vertical, NE-SW striking fault plane. Events occur at depths from 2 km to 3.5 km within the granitic basement, with a small fraction of events shallower, near the sediment-basement interface. Earthquakes occur within a zone of ~200 m thickness on either side of the best-fitting fault surface. We use an equivalency-class algorithm to identify clusters of repeating events, defined as event pairs with median three-component correlation > 0.97 across common stations (Aster & Scott, 1993). Repeating events occur as doublets of only two events in over 50% of cases; overall, 41% of recorded earthquakes occur as repeating events. The recurrence intervals for the repeating events range from minutes to days, with common recurrence intervals of less than two minutes. While individual clusters are tightly confined, commonly within ~80 m x 200 m, aftershocks occur in three distinct ~2 km x 2 km patches along the fault. Our analysis suggests that, with rapidly deployed local arrays, the plethora of ~Mw 4 earthquakes occurring in Oklahoma and southern Kansas can be used to investigate the earthquake rupture process and the role of damage zones.
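
    The equivalency-class grouping of repeating events amounts to taking connected components over correlated pairs, which can be sketched with a small union-find. Event names and correlation values below are made up; the real analysis uses median three-component waveform correlations across common stations.

```python
from itertools import combinations

def repeating_clusters(events, corr, threshold=0.97):
    """Group events into repeating-event clusters: link any pair whose
    correlation exceeds the threshold, then take connected components
    (equivalence classes) via union-find."""
    parent = {e: e for e in events}

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    for a, b in combinations(events, 2):
        if corr.get((a, b), corr.get((b, a), 0.0)) > threshold:
            parent[find(a)] = find(b)      # union the two components

    clusters = {}
    for e in events:
        clusters.setdefault(find(e), []).append(e)
    return [c for c in clusters.values() if len(c) > 1]

# Hypothetical median correlations: three linked events, one weak pair.
pairs = {("ev1", "ev2"): 0.98, ("ev2", "ev3"): 0.99, ("ev4", "ev5"): 0.50}
print(repeating_clusters(["ev1", "ev2", "ev3", "ev4", "ev5"], pairs))
# [['ev1', 'ev2', 'ev3']]
```

    Note that transitivity is assumed: ev1 and ev3 land in the same cluster even though their direct correlation was never measured.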

  10. Response of a 14-story Anchorage, Alaska, building in 2002 to two close earthquakes and two distant Denali fault earthquakes

    USGS Publications Warehouse

    Celebi, M.

    2004-01-01

    The recorded responses of an Anchorage, Alaska, building during four significant earthquakes that occurred in 2002 are studied. Two earthquakes, including the 3 November 2002 M7.9 Denali fault earthquake, with epicenters approximately 275 km from the building, generated long trains of long-period (>1 s) surface waves. The other two smaller earthquakes occurred at subcrustal depths practically beneath Anchorage and produced higher frequency motions. These two pairs of earthquakes have different impacts on the response of the building. Higher modes are more pronounced in the building response during the smaller nearby events. The building responses indicate that the close-coupling of translational and torsional modes causes a significant beating effect. It is also possible that there is some resonance occurring due to the site frequency being close to the structural frequency. Identification of dynamic characteristics and behavior of buildings can provide important lessons for future earthquake-resistant designs and retrofit of existing buildings. ?? 2004, Earthquake Engineering Research Institute.

  11. The 2012 Mw5.6 earthquake in Sofia seismogenic zone - is it a slow earthquake?

    NASA Astrophysics Data System (ADS)

    Raykova, Plamena; Solakov, Dimcho; Slavcheva, Krasimira; Simeonova, Stela; Aleksandrova, Irena

    2017-04-01

    very low rupture velocity. The low rupture velocity can indicate slow faulting, which leads to a slow release of the accumulated seismic energy; such a slow release of energy generally causes only little to moderate damage. Additionally, the waveform of the earthquake shows a low-frequency content of P-waves (the maximum P-wave energy is at 1.19 Hz), and the P-wave displacement spectrum is characterized by a poorly expressed spectral plateau and corner frequency. These and other signs lead us to the conclusion that the 2012 Mw5.6 earthquake can be considered a type of slow earthquake, namely a low-frequency earthquake. The study is based on data from the Bulgarian seismological network (NOTSSI), the local network (LSN) deployed around the Kozloduy NPP, and the System of Accelerographs for Seismic Monitoring of Equipment and Structures (SASMES) installed in the Kozloduy NPP. NOTSSI, jointly with the LSN and SASMES, provides reliable information for multiple studies of seismicity at regional scale.

  12. Crowd-Sourced Global Earthquake Early Warning

    NASA Astrophysics Data System (ADS)

    Minson, S. E.; Brooks, B. A.; Glennie, C. L.; Murray, J. R.; Langbein, J. O.; Owen, S. E.; Iannucci, B. A.; Hauser, D. L.

    2014-12-01

    Although earthquake early warning (EEW) has shown great promise for reducing loss of life and property, it has only been implemented in a few regions due, in part, to the prohibitive cost of building the required dense seismic and geodetic networks. However, many cars and consumer smartphones, tablets, laptops, and similar devices contain low-cost versions of the same sensors used for earthquake monitoring. If a workable EEW system could be implemented based on either crowd-sourced observations from consumer devices or very inexpensive networks of instruments built from consumer-quality sensors, EEW coverage could potentially be expanded worldwide. Controlled tests of several accelerometers and global navigation satellite system (GNSS) receivers typically found in consumer devices show that, while they are significantly noisier than scientific-grade instruments, they are still accurate enough to capture displacements from moderate and large magnitude earthquakes. The accuracy of these sensors varies greatly depending on the type of data collected. Raw coarse acquisition (C/A) code GPS data are relatively noisy. These observations have a surface displacement detection threshold approaching ~1 m and would thus only be useful in large Mw 8+ earthquakes. However, incorporating either satellite-based differential corrections or using a Kalman filter to combine the raw GNSS data with low-cost acceleration data (such as from a smartphone) decreases the noise dramatically. These approaches allow detection thresholds as low as 5 cm, potentially enabling accurate warnings for earthquakes as small as Mw 6.5. Simulated performance tests show that, with data contributed from only a very small fraction of the population, a crowd-sourced EEW system would be capable of warning San Francisco and San Jose of a Mw 7 rupture on California's Hayward fault and could have accurately issued both earthquake and tsunami warnings for the 2011 Mw 9 Tohoku-oki, Japan earthquake.
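
    The GNSS-plus-accelerometer fusion idea can be sketched with a minimal 1-D Kalman filter. This is an illustration under assumed noise levels (q, r) and a made-up state model and data, not the authors' implementation.

```python
import numpy as np

def kalman_gnss_accel(disp_gnss, accel, dt, q=1e-4, r=0.25):
    """Minimal 1-D Kalman filter fusing noisy GNSS displacement
    observations with accelerometer data used as a control input.
    State: [displacement, velocity]. q is process-noise variance,
    r is GNSS observation-noise variance (illustrative values)."""
    F = np.array([[1.0, dt], [0.0, 1.0]])  # constant-velocity transition
    B = np.array([0.5 * dt**2, dt])        # acceleration input mapping
    H = np.array([[1.0, 0.0]])             # GNSS observes displacement only
    x, P = np.zeros(2), np.eye(2)
    out = []
    for z, a in zip(disp_gnss, accel):
        # predict, driving the state with the accelerometer sample
        x = F @ x + B * a
        P = F @ P @ F.T + q * np.eye(2)
        # update with the GNSS displacement observation
        y = z - H @ x
        S = H @ P @ H.T + r
        K = P @ H.T / S
        x = x + (K * y).ravel()
        P = (np.eye(2) - K @ H) @ P
        out.append(x[0])
    return out
```

    With the accelerometer driving the prediction step, the filter smooths the noisy GNSS displacements far below their raw scatter, which is the mechanism behind the quoted drop in detection threshold from ~1 m to ~5 cm.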

  13. 76 FR 11821 - Submission for OMB Review; Comment Request Survey of Principal Investigators on Earthquake...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-03

    ...: Survey of Principal Investigators on Earthquake Engineering Research Awards Made by the National Science... survey of Principal Investigators on NSF earthquake engineering research awards, including but not... NATIONAL SCIENCE FOUNDATION Submission for OMB Review; Comment Request Survey of Principal...

  14. Engineering microbial phenotypes through rewiring of genetic networks

    PubMed Central

    Rodrigues, Rui T.L.; Lee, Sangjin; Haines, Matthew

    2017-01-01

    Abstract The ability to program cellular behaviour is a major goal of synthetic biology, with applications in health, agriculture and chemicals production. Despite efforts to build ‘orthogonal’ systems, interactions between engineered genetic circuits and the endogenous regulatory network of a host cell can have a significant impact on desired functionality. We have developed a strategy to rewire the endogenous cellular regulatory network of yeast to enhance compatibility with synthetic protein and metabolite production. We found that introducing novel connections in the cellular regulatory network enabled us to increase the production of heterologous proteins and metabolites. This strategy is demonstrated in yeast strains that show significantly enhanced heterologous protein expression and higher titers of terpenoid production. Specifically, we found that the addition of transcriptional regulation between free radical induced signalling and nitrogen regulation provided robust improvement of protein production. Assessment of rewired networks revealed the importance of key topological features such as high betweenness centrality. The generation of rewired transcriptional networks, selection for specific phenotypes, and analysis of resulting library members is a powerful tool for engineering cellular behavior and may enable improved integration of heterologous protein and metabolite pathways. PMID:28369627
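
    The betweenness centrality highlighted above can be illustrated with a brute-force computation on a toy graph (a sketch only; real network analyses use efficient algorithms such as Brandes', e.g. via networkx):

```python
from itertools import combinations
from collections import deque

def betweenness(adj):
    """Brute-force betweenness centrality for a small undirected graph:
    for each pair of nodes, count the fraction of shortest paths that
    pass through each intermediate node."""
    def shortest_paths(s, t):
        best, paths, queue = None, [], deque([[s]])
        while queue:
            path = queue.popleft()
            if best is not None and len(path) > best:
                continue  # longer than the shortest route already found
            node = path[-1]
            if node == t:
                if best is None:
                    best = len(path)
                if len(path) == best:
                    paths.append(path)
                continue
            for nxt in adj[node]:
                if nxt not in path:
                    queue.append(path + [nxt])
        return paths

    score = {v: 0.0 for v in adj}
    for s, t in combinations(adj, 2):
        sps = shortest_paths(s, t)
        for p in sps:
            for v in p[1:-1]:            # interior nodes only
                score[v] += 1.0 / len(sps)
    return score

# The bridging node "b", connecting all others, scores highest:
adj = {"a": ["b"], "b": ["a", "c", "d"], "c": ["b"], "d": ["b"]}
bc = betweenness(adj)
print(max(bc, key=bc.get))  # b
```

    A regulatory node with high betweenness sits on many of the shortest paths between other genes, which is why rewiring around such nodes can have outsized effects on network behaviour.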

  15. Social Network Theory in Engineering Education

    NASA Astrophysics Data System (ADS)

    Simon, Peter A.

    Collaborative groups are important both in the learning environment of engineering education and, in the real world, the business of engineering design. Selecting appropriate individuals to form an effective group and monitoring a group's progress are important aspects of successful task performance. This exploratory study looked at using the concepts of cognitive social structures, structural balance, and centrality from social network analysis as well as the measures of emotional intelligence. The concepts were used to analyze potential team members to examine if an individual's ability to perceive emotion in others and the self and to use, understand, and manage those emotions are a factor in a group's performance. The students from a capstone design course in computer engineering were used as volunteer subjects. They were formed into groups and assigned a design exercise to determine whether and which of the above-mentioned tools would be effective in both selecting teams and predicting the quality of the resultant design. The results were inconclusive with the exception of an individual's ability to accurately perceive emotions. The instruments that were successful were the Self-Monitoring scale and the accuracy scores derived from cognitive social structures and Level IV of network levels of analysis.

  16. Services supporting collaborative alignment of engineering networks

    NASA Astrophysics Data System (ADS)

    Jansson, Kim; Uoti, Mikko; Karvonen, Iris

    2015-08-01

    Large-scale facilities such as power plants, process factories, ships and communication infrastructures are often engineered and delivered through geographically distributed operations. The competencies required are usually distributed across several contributing organisations. In these complicated projects, it is of key importance that all partners work coherently towards a common goal. VTT and a number of industrial organisations in the marine sector have participated in a national collaborative research programme addressing these needs. The main output of this programme was development of the Innovation and Engineering Maturity Model for Marine-Industry Networks. The recently completed European Union Framework Programme 7 project COIN developed innovative solutions and software services for enterprise collaboration and enterprise interoperability. One area of focus in that work was services for collaborative project management. This article first addresses a number of central underlying research themes and previous research results that have influenced the development work mentioned above. This article presents two approaches for the development of services that support distributed engineering work. Experience from use of the services is analysed, and potential for development is identified. This article concludes with a proposal for consolidation of the two above-mentioned methodologies. This article outlines the characteristics and requirements of future services supporting collaborative alignment of engineering networks.

  17. Understanding Earthquake Hazard & Disaster in Himalaya - A Perspective on Earthquake Forecast in Himalayan Region of South Central Tibet

    NASA Astrophysics Data System (ADS)

    Shanker, D.; Paudyal, ,; Singh, H.

    2010-12-01

    Not only the basic understanding of the earthquake phenomenon and the resistance offered by designed structures, but also an understanding of socio-economic factors, the engineering properties of indigenous materials, local skill, and technology-transfer models are of vital importance. It is important that the engineering aspects of mitigation be made a part of public policy documents. Earthquakes, therefore, have long been thought of as one of the worst enemies of mankind. Due to the very nature of the energy release, damage is evident; it will not, however, culminate in a disaster unless the earthquake strikes a populated area. The word mitigation may be defined as the reduction in severity of something. Earthquake disaster mitigation, therefore, implies taking measures that help reduce the severity of damage caused by earthquakes to life, property and environment. While “earthquake disaster mitigation” usually refers primarily to interventions to strengthen the built environment, “earthquake protection” is now considered to include the human, social and administrative aspects of reducing earthquake effects. It should, however, be noted that reduction of earthquake hazards through prediction is considered one of the effective measures, and much effort is spent on prediction strategies. Even when an earthquake is predicted correctly, prediction does not guarantee safety, and potential damage to life and property on such a large scale warrants the use of other aspects of mitigation; while earthquake prediction may be of some help, mitigation remains the main focus of attention of civil society. The present study suggests that anomalous seismic activity/earthquake swarms existed prior to the medium-size earthquakes in the Nepal Himalaya. The mainshocks were preceded by a quiescence period, which is an indication of the occurrence of future seismic activity. In all the cases, the identified episodes of anomalous seismic activity were

  18. Journal of the Chinese Institute of Engineers. Special Issue: Commemoration of Chi-Chi Earthquake (II)

    NASA Astrophysics Data System (ADS)

    2002-09-01

    Contents include the following: Deep Electromagnetic Images of Seismogenic Zone of the Chi-Chi (Taiwan) Earthquake; New Techniques for Stress-Forecasting Earthquakes; Aspects of Characteristics of Near-Fault Ground Motions of the 1999 Chi-Chi (Taiwan) Earthquake; Liquefaction Damage and Related Remediation in Wufeng after the Chi-Chi Earthquake; Fines Content Effects on Liquefaction Potential Evaluation for Sites Liquefied during Chi-Chi Earthquake 1999; Damage Investigation and Liquefaction Potential Analysis of Gravelly Soil; Dynamic Characteristics of Soils in Yuan-Lin Liquefaction Area; A Preliminary Study of Earthquake Building Damage and Life Loss Due to the Chi-Chi Earthquake; Statistical Analyses of Relation between Mortality and Building Type in the 1999 Chi-Chi Earthquake; Development of an After Earthquake Disaster Shelter Evaluation Model; Posttraumatic Stress Reactions in Children and Adolescents One Year after the 1999 Taiwan Chi-Chi Earthquake; Changes or Not is the Question: the Meaning of Posttraumatic Stress Reactions One Year after the Taiwan Chi-Chi Earthquake.

  19. GEM - The Global Earthquake Model

    NASA Astrophysics Data System (ADS)

    Smolka, A.

    2009-04-01

    Over 500,000 people died in the last decade due to earthquakes and tsunamis, mostly in the developing world, where the risk is increasing due to rapid population growth. In many seismic regions, no hazard and risk models exist, and even where models do exist, they are intelligible only by experts, or available only for commercial purposes. The Global Earthquake Model (GEM) answers the need for an openly accessible risk management tool. GEM is an internationally sanctioned public private partnership initiated by the Organisation for Economic Cooperation and Development (OECD) which will establish an authoritative standard for calculating and communicating earthquake hazard and risk, and will be designed to serve as the critical instrument to support decisions and actions that reduce earthquake losses worldwide. GEM will integrate developments on the forefront of scientific and engineering knowledge of earthquakes, at global, regional and local scale. The work is organized in three modules: hazard, risk, and socio-economic impact. The hazard module calculates probabilities of earthquake occurrence and resulting shaking at any given location. The risk module calculates fatalities, injuries, and damage based on expected shaking, building vulnerability, and the distribution of population and of exposed values and facilities. The socio-economic impact module delivers tools for making educated decisions to mitigate and manage risk. GEM will be a versatile online tool, with open source code and a map-based graphical interface. The underlying data will be open wherever possible, and its modular input and output will be adapted to multiple user groups: scientists and engineers, risk managers and decision makers in the public and private sectors, and the public-at-large. GEM will be the first global model for seismic risk assessment at a national and regional scale, and aims to achieve broad scientific participation and independence. Its development will occur in a

  20. The Alaska earthquake, March 27, 1964: lessons and conclusions

    USGS Publications Warehouse

    Eckel, Edwin B.

    1970-01-01

    One of the greatest earthquakes of all time struck south-central Alaska on March 27, 1964. Strong motion lasted longer than for most recorded earthquakes, and more land surface was dislocated, vertically and horizontally, than by any known previous temblor. Never before were so many effects on earth processes and on the works of man available for study by scientists and engineers over so great an area. The seismic vibrations, which directly or indirectly caused most of the damage, were but surface manifestations of a great geologic event-the dislocation of a huge segment of the crust along a deeply buried fault whose nature and even exact location are still subjects for speculation. Not only was the land surface tilted by the great tectonic event beneath it, with resultant seismic sea waves that traversed the entire Pacific, but an enormous mass of land and sea floor moved several tens of feet horizontally toward the Gulf of Alaska. Downslope mass movements of rock, earth, and snow were initiated. Subaqueous slides along lake shores and seacoasts, near-horizontal movements of mobilized soil (“landspreading”), and giant translatory slides in sensitive clay did the most damage and provided the most new knowledge as to the origin, mechanics, and possible means of control or avoidance of such movements. The slopes of most of the deltas that slid in 1964, and that produced destructive local waves, are still as steep or steeper than they were before the earthquake and hence would be unstable or metastable in the event of another great earthquake. Rockslide avalanches provided new evidence that such masses may travel on cushions of compressed air, but a widely held theory that glaciers surge after an earthquake has not been substantiated. Innumerable ground fissures, many of them marked by copious emissions of water, caused much damage in towns and along transportation routes. Vibration also consolidated loose granular materials. In some coastal areas, local

  1. Modeling fast and slow earthquakes at various scales

    PubMed Central

    IDE, Satoshi

    2014-01-01

    Earthquake sources represent dynamic rupture within rocky materials at depth and often can be modeled as propagating shear slip controlled by friction laws. These laws provide boundary conditions on fault planes embedded in elastic media. Recent developments in observation networks, laboratory experiments, and methods of data analysis have expanded our knowledge of the physics of earthquakes. Newly discovered slow earthquakes are qualitatively different phenomena from ordinary fast earthquakes and provide independent information on slow deformation at depth. Many numerical simulations have been carried out to model both fast and slow earthquakes, but problems remain, especially with scaling laws. Some mechanisms are required to explain the power-law nature of earthquake rupture and the lack of characteristic length. Conceptual models that include a hierarchical structure over a wide range of scales would be helpful for characterizing diverse behavior in different seismic regions and for improving probabilistic forecasts of earthquakes. PMID:25311138

  2. Modeling fast and slow earthquakes at various scales.

    PubMed

    Ide, Satoshi

    2014-01-01

    Earthquake sources represent dynamic rupture within rocky materials at depth and often can be modeled as propagating shear slip controlled by friction laws. These laws provide boundary conditions on fault planes embedded in elastic media. Recent developments in observation networks, laboratory experiments, and methods of data analysis have expanded our knowledge of the physics of earthquakes. Newly discovered slow earthquakes are qualitatively different phenomena from ordinary fast earthquakes and provide independent information on slow deformation at depth. Many numerical simulations have been carried out to model both fast and slow earthquakes, but problems remain, especially with scaling laws. Some mechanisms are required to explain the power-law nature of earthquake rupture and the lack of characteristic length. Conceptual models that include a hierarchical structure over a wide range of scales would be helpful for characterizing diverse behavior in different seismic regions and for improving probabilistic forecasts of earthquakes.

  3. Did you feel it? : citizens contribute to earthquake science

    USGS Publications Warehouse

    Wald, David J.; Dewey, James W.

    2005-01-01

    Since the early 1990s, the magnitude and location of an earthquake have been available within minutes on the Internet. Now, as a result of work by the U.S. Geological Survey and with the cooperation of various regional seismic networks, people who experience an earthquake can go online and share information about its effects to help create a map of shaking intensities and damage. Such “Community Internet Intensity Maps” (CIIMs) contribute greatly toward the quick assessment of the scope of an earthquake emergency and provide valuable data for earthquake research.
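
    The map-building idea behind CIIMs can be sketched as binning felt reports into geographic cells and averaging the reported intensity per cell. This is a toy illustration: the actual "Did You Feel It?" system derives a Community Decimal Intensity from weighted questionnaire answers, and the coordinates and values below are made up.

```python
from collections import defaultdict

def community_intensity_map(reports, cell_deg=0.1):
    """Toy Community Internet Intensity Map: bin (lat, lon, intensity)
    felt reports into cells of cell_deg degrees and average the
    intensities reported within each cell."""
    cells = defaultdict(list)
    for lat, lon, intensity in reports:
        key = (round(lat / cell_deg), round(lon / cell_deg))  # cell indices
        cells[key].append(intensity)
    return {cell: sum(v) / len(v) for cell, v in cells.items()}

# Two reports from the same neighbourhood plus one distant report:
reports = [(37.81, -122.27, 5.0), (37.83, -122.29, 6.0), (38.55, -121.47, 3.0)]
cim = community_intensity_map(reports)
print(cim[(378, -1223)])  # 5.5 -- the two nearby reports averaged
```

    Aggregating many individually noisy reports per cell is what makes the resulting intensity maps stable enough for rapid impact assessment.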

  4. Earthquake Fingerprints: Representing Earthquake Waveforms for Similarity-Based Detection

    NASA Astrophysics Data System (ADS)

    Bergen, K.; Beroza, G. C.

    2016-12-01

    New earthquake detection methods, such as Fingerprint and Similarity Thresholding (FAST), use fast approximate similarity search to identify similar waveforms in long-duration data without templates (Yoon et al. 2015). These methods have two key components: fingerprint extraction and an efficient search algorithm. Fingerprint extraction converts waveforms into fingerprints, compact signatures that represent short-duration waveforms for identification and search. Earthquakes are detected using an efficient indexing and search scheme, such as locality-sensitive hashing, that identifies similar waveforms in a fingerprint database. The quality of the search results, and thus the earthquake detection results, is strongly dependent on the fingerprinting scheme. Fingerprint extraction should map similar earthquake waveforms to similar waveform fingerprints to ensure a high detection rate, even under additive noise and small distortions. Additionally, fingerprints corresponding to noise intervals should have mutually dissimilar fingerprints to minimize false detections. In this work, we compare the performance of multiple fingerprint extraction approaches for the earthquake waveform similarity search problem. We apply existing audio fingerprinting (used in content-based audio identification systems) and time series indexing techniques and present modified versions that are specifically adapted for seismic data. We also explore data-driven fingerprinting approaches that can take advantage of labeled or unlabeled waveform data. For each fingerprinting approach we measure its ability to identify similar waveforms in a low signal-to-noise setting, and quantify the trade-off between true and false detection rates in the presence of persistent noise sources. We compare the performance using known event waveforms from eight independent stations in the Northern California Seismic Network.
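
    The fingerprint-and-search idea can be sketched with random-hyperplane hashing, a classic locality-sensitive scheme. This is illustrative only: FAST's actual fingerprints are built from spectral images of the waveform, and its search uses hash-table lookups rather than pairwise comparison.

```python
import numpy as np

def fingerprint(waveform, n_bits=64, seed=0):
    """Toy binary fingerprint: project the waveform onto fixed random
    hyperplanes and keep the sign bits. Similar waveforms flip few
    bits, so fingerprints can be compared (or hashed) cheaply."""
    rng = np.random.default_rng(seed)
    planes = rng.standard_normal((n_bits, len(waveform)))
    return (planes @ waveform > 0).astype(np.uint8)

def similarity(fp_a, fp_b):
    """Fraction of matching bits between two fingerprints."""
    return float(np.mean(fp_a == fp_b))

t = np.linspace(0.0, 1.0, 200)
a = np.sin(2 * np.pi * 5 * t)                                # "event" waveform
b = a + 0.1 * np.random.default_rng(1).standard_normal(200)  # noisy repeat
c = np.random.default_rng(2).standard_normal(200)            # unrelated noise
print(similarity(fingerprint(a), fingerprint(b)))  # high, typically > 0.9
print(similarity(fingerprint(a), fingerprint(c)))  # near 0.5 (chance level)
```

    The gap between the two similarity regimes is what a detection threshold exploits: true repeats stay near 1 even under added noise, while noise windows hover around chance.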

  5. eqMAXEL: A new automatic earthquake location algorithm implementation for Earthworm

    NASA Astrophysics Data System (ADS)

    Lisowski, S.; Friberg, P. A.; Sheen, D. H.

    2017-12-01

    A common problem with automated earthquake location systems for local- to regional-scale seismic networks is false triggering and false locations inside the network caused by earthquakes at larger regional to teleseismic distances. This false-location issue also presents a problem for earthquake early warning systems, where the societal impacts of false alarms can be very expensive. Towards solving this issue, Sheen et al. (2016) implemented a robust maximum-likelihood earthquake location algorithm known as MAXEL. It was shown, with both synthetics and real data for a small number of arrivals, that large regional events were easily identifiable through metrics in the MAXEL algorithm. In the summer of 2017, we collaboratively implemented the MAXEL algorithm as a fully functional Earthworm module and tested it in regions of the USA where false detections and alarming are observed. We show robust improvement in the ability of the Earthworm system to filter out regional and teleseismic events that would have falsely located inside the network using the traditional Earthworm Hypoinverse solution. We also explore using different grid sizes in the implementation of the MAXEL algorithm, which was originally designed with a network the size of South Korea's in mind.

  6. Harnessing the Collective Power of Eyewitnesses for Improved Earthquake Information

    NASA Astrophysics Data System (ADS)

    Bossu, R.; Lefebvre, S.; Mazet-Roux, G.; Steed, R.

    2013-12-01

    The Euro-Med Seismological Centre (EMSC) operates the world's second most visited earthquake information website (www.emsc-csem.org), which attracts 2 million visits a month from about 200 different countries. We collect information about earthquake effects from eyewitnesses, such as online questionnaires and geolocated pictures, to rapidly constrain impact scenarios. Initially, the collection was intended purely to address a scientific issue: the rapid evaluation of an earthquake's impact. However, it rapidly became apparent that understanding eyewitnesses' expectations and motivations in the immediate aftermath of an earthquake was essential to optimise this data collection. Crowdsourcing information on earthquake effects does not apply to a pre-existing community: by definition, eyewitnesses only exist once the earthquake has struck. We developed a strategy on social networks (Facebook, Google+, Twitter...) to interface with spontaneously emerging online communities of eyewitnesses. The basic idea is to create a positive feedback loop: attract eyewitnesses and engage with them by providing expected earthquake information and services, collect their observations, and collate them into improved earthquake information services that attract more witnesses. We will present recent examples to illustrate how important the use of social networks is for engaging with eyewitnesses, especially in regions of low seismic activity where people are unaware of existing Internet resources dealing with earthquakes. A second type of information collated into our information services is derived from real-time analysis of the traffic on our website in the first minutes following an earthquake, an approach named flashsourcing. We show, using the example of the Mineral, Virginia earthquake, that the arrival times of eyewitnesses at our website follow the propagation of the generated seismic waves, and hence that eyewitnesses can be considered as ground motion sensors. Flashsourcing discriminates felt
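
    The flashsourcing observation that website hit times track wave propagation can be sketched as a simple regression of hit time against epicentral distance. The numbers are hypothetical; real eyewitness traffic is far noisier, and the recovered apparent speed is only roughly that of the felt (S/surface) waves:

```python
def apparent_speed(distances_km, hit_times_s):
    """Least-squares line fit of website hit time vs epicentral distance.

    Model: t = t0 + d / v. Returns (v_km_s, t0_s); if eyewitness hits
    trail the felt waves, v comes out near the S/surface-wave speed."""
    n = len(distances_km)
    mx = sum(distances_km) / n
    my = sum(hit_times_s) / n
    sxx = sum((d - mx) ** 2 for d in distances_km)
    sxy = sum((d - mx) * (t - my) for d, t in zip(distances_km, hit_times_s))
    slope = sxy / sxx          # seconds per kilometre
    return 1.0 / slope, my - slope * mx
```

    A fitted speed of a few km/s is consistent with eyewitnesses reacting to the passing seismic waves rather than to news reports.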

  7. Principles of Biomimetic Vascular Network Design Applied to a Tissue-Engineered Liver Scaffold

    PubMed Central

    Hoganson, David M.; Pryor, Howard I.; Spool, Ira D.; Burns, Owen H.; Gilmore, J. Randall

    2010-01-01

    Branched vascular networks are a central component of scaffold architecture for solid organ tissue engineering. In this work, seven biomimetic principles were established as the major guiding technical design considerations of a branched vascular network for a tissue-engineered scaffold. These biomimetic design principles were applied to a branched radial architecture to develop a liver-specific vascular network. Iterative design changes and computational fluid dynamic analysis were used to optimize the network before mold manufacturing. The vascular network mold was created using a new mold technique that achieves a 1:1 aspect ratio for all channels. In vitro blood flow testing confirmed the physiologic hemodynamics of the network as predicted by computational fluid dynamic analysis. These results indicate that this biomimetic liver vascular network design will provide a foundation for developing complex vascular networks for solid organ tissue engineering that achieve physiologic blood flow. PMID:20001254
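
    The abstract does not enumerate its seven principles, but one branching rule widely used in biomimetic vascular design is Murray's law, which this sketch assumes:

```python
def murray_child_radius(parent_radius, n_children):
    """Murray's law for a branch into n equal daughter vessels:
    r_parent**3 == n_children * r_child**3, the branching rule that
    minimizes pumping work for laminar flow."""
    return parent_radius * (1.0 / n_children) ** (1.0 / 3.0)

def generation_radii(root_radius, generations, n_children=2):
    """Channel radius at each generation of a regularly branching tree."""
    radii = [root_radius]
    for _ in range(generations):
        radii.append(murray_child_radius(radii[-1], n_children))
    return radii
```

    For bifurcations, each generation's radius shrinks by a factor of 2**(-1/3), about 0.794, which keeps wall shear stress roughly uniform across the tree.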

  9. Reflections on Communicating Science during the Canterbury Earthquake Sequence of 2010-2011, New Zealand

    NASA Astrophysics Data System (ADS)

    Wein, A. M.; Berryman, K. R.; Jolly, G. E.; Brackley, H. L.; Gledhill, K. R.

    2015-12-01

    The 2010-2011 Canterbury Earthquake Sequence began with the 4th September 2010 Darfield earthquake (Mw 7.1). Perhaps because there were no deaths, the mood of the city and the government was that high standards of earthquake engineering in New Zealand protected us, and there was a confident attitude to response and recovery. The demand for science and engineering information was of interest but not seen as crucial to policy, business or the public. The 22nd February 2011 Christchurch earthquake (Mw 6.2) changed all that; there was a significant death toll and many injuries. There was widespread collapse of older unreinforced buildings, the collapse of two relatively modern multi-storey buildings, and major disruption to infrastructure. The contrast in the interest and relevance of the science could not have been greater compared to 5 months previously. Magnitude 5+ aftershocks over a 20-month period resulted in confusion, stress, an inability to define a recovery trajectory, major concerns about whether insurers and reinsurers would continue to provide cover, very high levels of media interest from New Zealand and around the world, and high levels of political risk. As the aftershocks continued there was widespread speculation as to what the future held. During the sequence, the science and engineering sector sought to coordinate and offer timely and integrated advice. However, other than GeoNet, the national geophysical monitoring network, there were few resources devoted to communication, with the result that it was almost always reactive. With hindsight we have identified the need to resource information gathering and synthesis, execute strategic assessments of stakeholder needs, undertake proactive communication, and develop specific information packages for the diversity of users. Overall this means substantially increased resources. Planning is now underway for the science sector to adopt the New Zealand standardised CIMS (Coordinated Incident Management System) structure for

  10. Spatial Distribution of earthquakes off the coast of Fukushima Two Years after the M9 Earthquake: the Southern Area of the 2011 Tohoku Earthquake Rupture Zone

    NASA Astrophysics Data System (ADS)

    Yamada, T.; Nakahigashi, K.; Shinohara, M.; Mochizuki, K.; Shiobara, H.

    2014-12-01

    Huge earthquakes cause vast stress-field changes around their rupture zones, and many aftershocks and other related geophysical phenomena, such as geodetic movements, have been observed. It is important to characterize the spatiotemporal distribution of seismicity during the relaxation process in order to understand the giant earthquake cycle. In this study, we focus on the southern rupture area of the 2011 Tohoku earthquake (M9.0). The seismicity rate there remains high compared with that before the 2011 earthquake. Many studies using ocean bottom seismometers (OBSs) have been conducted since soon after the 2011 Tohoku earthquake in order to observe aftershock activity precisely. Here we present one such study off the coast of Fukushima, in the southern part of the rupture area of the 2011 Tohoku earthquake. We deployed 4 broadband-type OBSs (BBOBSs) and 12 short-period-type OBSs (SOBSs) in August 2012. Another 4 BBOBSs equipped with absolute pressure gauges, together with 20 SOBSs, were added in November 2012. We recovered 36 OBSs, including 8 BBOBSs, in November 2013. We selected 1,000 events in the vicinity of the OBS network based on a hypocenter catalog published by the Japan Meteorological Agency, and extracted the data after correcting for each instrument's internal clock. P- and S-wave arrival times, P-wave polarities, and maximum amplitudes were picked manually on a computer display. We assumed a one-dimensional velocity structure based on the result from an active-source experiment across our network, and applied time corrections at every station to remove ambiguity in the assumed structure. We then adopted a maximum-likelihood estimation technique and calculated the hypocenters. The results show intense activity near the Japan Trench, with a quiet seismic zone between the trench and a landward zone of high activity.
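
    The station time corrections mentioned above commonly amount to subtracting each station's mean travel-time residual accumulated over many preliminary locations; the sketch below assumes that simple scheme (the abstract does not specify the exact correction method):

```python
from collections import defaultdict

def station_corrections(residuals):
    """Mean travel-time residual per station, accumulated over many
    preliminarily located events; residuals are (station, seconds) pairs.
    Subtracting these means absorbs near-station structure that a 1-D
    velocity model cannot represent."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for sta, r in residuals:
        sums[sta] += r
        counts[sta] += 1
    return {sta: sums[sta] / counts[sta] for sta in sums}

def apply_corrections(picks, corrections):
    """Corrected picks for one event; stations without a correction pass through."""
    return {sta: t - corrections.get(sta, 0.0) for sta, t in picks.items()}
```

    Relocating with the corrected picks, and iterating, progressively removes the bias that the assumed 1-D structure introduces at each site.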

  11. Traffic engineering and regenerator placement in GMPLS networks with restoration

    NASA Astrophysics Data System (ADS)

    Yetginer, Emre; Karasan, Ezhan

    2002-07-01

    In this paper we study regenerator placement and traffic engineering of restorable paths in Generalized Multiprotocol Label Switching (GMPLS) networks. Regenerators are necessary in optical networks due to transmission impairments. We study a network architecture with regenerators at selected nodes and propose two heuristic algorithms for the regenerator placement problem. The performance of these algorithms, in terms of the required number of regenerators and computational complexity, is evaluated. In this network architecture with sparse regeneration, offline computation of working and restoration paths is studied, with bandwidth reservation and path rerouting as the restoration scheme. We study two approaches for selecting working and restoration paths from a set of candidate paths and formulate each method as an Integer Linear Programming (ILP) problem. A traffic uncertainty model is developed in order to compare these methods based on their robustness with respect to changing traffic patterns. The traffic engineering methods are compared based on the number of additional demands, arising from traffic uncertainty, that can be carried. The regenerator placement algorithms are also evaluated from a traffic engineering point of view.
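
    The paper's heuristics place regenerators network-wide; as a minimal illustration of the constraint they must satisfy, the sketch below greedily places regenerators along a single lightpath under an assumed fixed optical reach:

```python
def place_regenerators(hop_lengths_km, reach_km):
    """Greedy regenerator placement along a single lightpath.

    hop_lengths_km[i] is the fiber length between nodes i and i+1.
    A regenerator is placed at a node whenever the accumulated
    transparent distance would exceed the optical reach. Returns the
    indices of interior nodes that regenerate."""
    regens, acc = [], 0.0
    for i, hop in enumerate(hop_lengths_km):
        if hop > reach_km:
            raise ValueError("single hop exceeds optical reach")
        if acc + hop > reach_km:
            regens.append(i)   # regenerate at node i before traversing this hop
            acc = 0.0
        acc += hop
    return regens
```

    Network-wide placement is harder because paths share nodes, so a regenerator at a well-chosen hub can serve many lightpaths at once; that is what the paper's heuristics optimize.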

  12. Ionospheric earthquake effects detection based on Total Electron Content (TEC) GPS Correlation

    NASA Astrophysics Data System (ADS)

    Sunardi, Bambang; Muslim, Buldan; Eka Sakya, Andi; Rohadi, Supriyanto; Sulastri; Murjaya, Jaya

    2018-03-01

    Advances in science and technology have shown that ground-based GPS receivers are able to detect ionospheric Total Electron Content (TEC) disturbances caused by various natural phenomena such as earthquakes. A study of the March 11, 2011, M 9.0 Tohoku (Japan) earthquake showed TEC fluctuations observed across the GPS observation network spread around the disaster area. This paper discusses the detection of ionospheric earthquake effects using TEC GPS data. The case studies were the Kebumen earthquake (January 25, 2014, M 6.2), the Sumba earthquake (February 12, 2016, M 6.2), and the Halmahera earthquake (February 17, 2016, M 6.1). TEC-GIM (Global Ionosphere Map) correlation methods over 31 days were used to monitor TEC anomalies in the ionosphere. To rule out geomagnetic disturbances due to solar activity, we also compared the results with the Dst index over the same time window. The results showed an anomalous ratio of the correlation-coefficient deviation to its standard deviation upon the occurrence of the Kebumen and Sumba earthquakes, but no similar anomaly for the Halmahera earthquake. Continuous monitoring of TEC GPS data is needed to detect earthquake effects in the ionosphere. This study shows promise for strengthening earthquake-effect early warning systems using TEC GPS data. Developing methods for continuous TEC observation based on the GPS observation network that already exists in Indonesia is needed to support such early warning systems.
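
    The correlation check can be sketched as follows; the window length, the 2-sigma threshold, and the per-day data shape are illustrative assumptions, not the authors' exact procedure:

```python
import math

def pearson(a, b):
    """Pearson correlation of two equal-length series."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = math.sqrt(sum((x - ma) ** 2 for x in a))
    vb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (va * vb)

def tec_anomaly_days(station_tec, gim_tec, k=2.0):
    """Days whose station-vs-GIM TEC correlation deviates from the
    window mean by more than k standard deviations.

    station_tec, gim_tec: one list of (e.g. hourly) TEC values per day
    over a roughly 31-day window."""
    r = [pearson(s, g) for s, g in zip(station_tec, gim_tec)]
    mr = sum(r) / len(r)
    sd = math.sqrt(sum((x - mr) ** 2 for x in r) / len(r))
    return [i for i, x in enumerate(r) if sd > 0 and abs(x - mr) > k * sd]
```

    Any flagged day would still need to be checked against the Dst index before being attributed to an earthquake rather than to geomagnetic activity.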

  13. Response of power systems to the San Fernando Valley earthquake of 9 February 1971. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schiff, A.J.; Yao, J.T.P.

    1972-01-01

    The impact of the San Fernando Valley earthquake on electric power systems is discussed. Particular attention is focused on the following three areas: (1) the effects of an earthquake on the power network in the Western States, (2) the failure of subsystems and components of the power system, and (3) the loss of power to hospitals. The report includes sections on the description and functions of major components of a power network, existing procedures to protect the network, safety devices within the system which influence the network, a summary of the effects of the San Fernando Valley earthquake on the Western States Power Network, and present efforts to reduce the network's vulnerability to faults. Also included in the report are a review of design procedures and practices prior to the San Fernando Valley earthquake and descriptions of types of damage to electrical equipment, dynamic analysis of equipment failures, equipment surviving the San Fernando Valley earthquake, and new seismic design specifications. In addition, some observations and insights gained during the study that are not directly related to power systems are discussed.

  14. Earthquake Protection Measures for People with Disabilities

    NASA Astrophysics Data System (ADS)

    Gountromichou, C.; Kourou, A.; Kerpelis, P.

    2009-04-01

    The problem of seismic safety for people with disabilities not only exists but is also urgent and of primary importance. Working towards disability equality, the Earthquake Planning and Protection Organization of Greece (E.P.P.O.) has developed an educational scheme for people with disabilities to guide them in developing skills to protect themselves and in taking the appropriate safety measures before, during and after an earthquake. The framework of this initiative includes a number of actions that have already been undertaken, including the following: a. Recently, the main guidelines have been published to help people who have physical, cognitive, visual, or auditory disabilities to cope with a destructive earthquake. For people with disabilities it is of great importance to be prepared for the disaster, with several measures that must be taken starting today. In the pre-earthquake period, it is important that these people, in addition to other measures, do the following: - Create a Personal Support Network. The Personal Support Network should be a group of at least three trusted people who can assist the disabled person in preparing for a disastrous event and in recovering after it. - Complete a Personal Assessment. The environment may change after a destructive earthquake. People with disabilities are encouraged to make a list of their personal needs and of the resources for meeting them in a disaster environment. b. Lectures and training seminars on earthquake protection are given for students, teachers and educators in special schools for disabled people, mainly to inform and familiarize them with earthquakes and safety measures. c. Many earthquake drills have already taken place, for each disability, in order to share good practices and lessons learned, to further disaster reduction, and to identify gaps and challenges.
The final aim of this action is all people with disabilities to be well informed and motivated towards a culture of earthquake

  15. Source properties of earthquakes near the Salton Sea triggered by the 16 October 1999 M 7.1 Hector Mine, California, earthquake

    USGS Publications Warehouse

    Hough, S.E.; Kanamori, H.

    2002-01-01

    We analyze the source properties of a sequence of triggered earthquakes that occurred near the Salton Sea in southern California in the immediate aftermath of the M 7.1 Hector Mine earthquake of 16 October 1999. The sequence produced a number of early events that were not initially located by the regional network, including two moderate earthquakes: the first within 30 sec of the P-wave arrival and a second approximately 10 minutes after the mainshock. We use available amplitude and waveform data from these events to estimate magnitudes to be approximately 4.7 and 4.4, respectively, and to obtain crude estimates of their locations. The sequence of small events following the initial M 4.7 earthquake is clustered and suggestive of a local aftershock sequence. Using both broadband TriNet data and analog data from the Southern California Seismic Network (SCSN), we also investigate the spectral characteristics of the M 4.4 event and other triggered earthquakes using empirical Green's function (EGF) analysis. We find that the source spectra of the events are consistent with expectations for tectonic (brittle shear failure) earthquakes, and infer stress drop values of 0.1 to 6 MPa for six M 2.1 to M 4.4 events. The estimated stress drop values are within the range observed for tectonic earthquakes elsewhere. They are relatively low compared to typically observed stress drop values, which is consistent with expectations for faulting in an extensional, high heat flow regime. The results therefore suggest that, at least in this case, triggered earthquakes are associated with a brittle shear failure mechanism. This further suggests that triggered earthquakes may tend to occur in geothermal-volcanic regions because shear failure occurs at, and can be triggered by, relatively low stresses in extensional regimes.
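
    Stress drops like those quoted above are conventionally derived from EGF-based corner frequencies via the Brune (1970) source model; the abstract does not state the exact relations used, so the constants below (including the assumed shear-wave speed and corner frequency) are illustrative:

```python
import math

def moment_from_mw(mw):
    """Hanks & Kanamori moment magnitude: Mw = (2/3) * (log10(M0) - 9.1),
    with M0 in newton-metres."""
    return 10.0 ** (1.5 * mw + 9.1)

def brune_stress_drop_mpa(m0_nm, fc_hz, beta_m_s=3500.0):
    """Brune (1970) stress drop in MPa from seismic moment (N*m) and
    corner frequency (Hz): source radius r = 2.34*beta/(2*pi*fc),
    stress drop = 7*M0/(16*r**3)."""
    r = 2.34 * beta_m_s / (2.0 * math.pi * fc_hz)
    return 7.0 * m0_nm / (16.0 * r ** 3) / 1.0e6
```

    For a fixed moment, a higher corner frequency implies a smaller source radius and hence a higher stress drop, which is why low stress drops go with the relatively low corner frequencies expected in a high heat flow regime.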

  16. Detection of Repeating Earthquakes within the Cascadia Subduction Zone Using 2013-2014 Cascadia Initiative Amphibious Network Data

    NASA Astrophysics Data System (ADS)

    Kenefic, L.; Morton, E.; Bilek, S.

    2017-12-01

    It is well known that subduction zones create the largest earthquakes in the world, like the magnitude 9.5 Chile earthquake in 1960 or the more recent magnitude 9.1 Japan earthquake in 2011, both of which are among the top five largest earthquakes ever recorded. However, off the coast of the Pacific Northwest region of the U.S., the Cascadia subduction zone (CSZ) remains relatively quiet, and modern seismic instruments have not recorded earthquakes of this size in the CSZ. The last great earthquake, a magnitude 8.7-9.2, occurred in 1700 and is constrained by written reports of the resultant tsunami in Japan and by dating of a drowned forest in the U.S. Previous studies have suggested the margin is most likely segmented along-strike. However, variations in frictional conditions in the CSZ fault zone are not well known. Geodetic modeling indicates that the locked seismogenic zone is likely completely offshore, which may be too far from land seismometers to adequately detect related seismicity. Ocean bottom seismometers, as part of the Cascadia Initiative Amphibious Network, were installed directly above the inferred seismogenic zone, and we use them to better detect small interplate seismicity. Using the subspace detection method, this study seeks to find new seismogenic zone earthquakes. The subspace detection method uses multiple previously known event templates concurrently to scan through continuous seismic data. Template events that make up the subspace are chosen from events in existing catalogs that likely occurred along the plate interface. Corresponding waveforms are windowed on the nearby Cascadia Initiative ocean bottom seismometers and coastal land seismometers for scanning. Detections found by the scan are similar to the template waveforms according to a predefined threshold. Detections are then visually examined to determine whether an event is present. The presence of repeating event clusters can indicate persistent seismic patches, likely corresponding to
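
    Subspace detection generalizes single-template matched filtering to a basis spanning several templates; its basic building block, sliding normalized cross-correlation against continuous data, can be sketched as follows (the 0.8 threshold is an illustrative choice):

```python
import math

def normalized_xcorr(template, data):
    """Sliding normalized cross-correlation of a template against
    continuous data; values near 1 mark repeating-event candidates."""
    n = len(template)
    mt = sum(template) / n
    t0 = [x - mt for x in template]
    nt = math.sqrt(sum(x * x for x in t0))
    out = []
    for i in range(len(data) - n + 1):
        w = data[i:i + n]
        mw = sum(w) / n
        w0 = [x - mw for x in w]
        nw = math.sqrt(sum(x * x for x in w0))
        out.append(sum(a * b for a, b in zip(t0, w0)) / (nt * nw)
                   if nw > 0 and nt > 0 else 0.0)
    return out

def detect(template, data, threshold=0.8):
    """Sample indices where the correlation exceeds the threshold."""
    return [i for i, c in enumerate(normalized_xcorr(template, data))
            if c >= threshold]
```

    In a full subspace detector the single template is replaced by an orthonormal basis (e.g. from an SVD of aligned template waveforms), and the detection statistic is the fraction of window energy captured by projection onto that basis.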

  17. Network-wide BGP route prediction for traffic engineering

    NASA Astrophysics Data System (ADS)

    Feamster, Nick; Rexford, Jennifer

    2002-07-01

    The Internet consists of about 13,000 Autonomous Systems (AS's) that exchange routing information using the Border Gateway Protocol (BGP). The operators of each AS must have control over the flow of traffic through their network and between neighboring AS's. However, BGP is a complicated, policy-based protocol that does not include any direct support for traffic engineering. In previous work, we have demonstrated that network operators can adapt the flow of traffic in an efficient and predictable fashion through careful adjustments to the BGP policies running on their edge routers. Nevertheless, many details of the BGP protocol and decision process make predicting the effects of these policy changes difficult. In this paper, we describe a tool that predicts traffic flow at network exit points based on the network topology, the import policy associated with each BGP session, and the routing advertisements received from neighboring AS's. We present a linear-time algorithm that computes a network-wide view of the best BGP routes for each destination prefix given a static snapshot of the network state, without simulating the complex details of BGP message passing. We describe how to construct this snapshot using the BGP routing tables and router configuration files available from operational routers. We verify the accuracy of our algorithm by applying our tool to routing and configuration data from AT&T's commercial IP network. Our route prediction techniques help support the operation of large IP backbone networks, where interdomain routing is an important aspect of traffic engineering.
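
    The per-prefix best-route computation such a tool performs reduces, at each router, to the BGP decision process. A simplified sketch covering only four of the decision steps (origin type, eBGP-vs-iBGP preference, and IGP metric to the next hop are omitted):

```python
def best_route(routes):
    """Best route under a simplified BGP decision process: highest
    local-pref, then shortest AS path, then lowest MED, with lowest
    next-hop router id as the final tie-break.

    Each route is a dict with keys: local_pref, as_path (list of AS
    numbers), med, router_id."""
    return min(routes, key=lambda r: (-r["local_pref"],
                                      len(r["as_path"]),
                                      r["med"],
                                      r["router_id"]))
```

    Because every step is a deterministic comparison over route attributes, the best route for each prefix can be computed from a static snapshot of advertisements without replaying BGP message exchanges, which is what makes a linear-time prediction algorithm possible.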

  18. Satellite Geodetic Constraints On Earthquake Processes: Implications of the 1999 Turkish Earthquakes for Fault Mechanics and Seismic Hazards on the San Andreas Fault

    NASA Technical Reports Server (NTRS)

    Reilinger, Robert

    2005-01-01

    Our principal activities during the initial phase of this project include: 1) continued monitoring of postseismic deformation for the 1999 Izmit and Duzce, Turkey earthquakes from repeated GPS survey measurements, and expansion of the Marmara Continuous GPS Network (MAGNET); 2) establishing three North Anatolian fault crossing profiles (10 sites/profile) at locations that experienced major surface-fault earthquakes at different times in the past, to examine strain accumulation as a function of time in the earthquake cycle (2004); 3) repeat observations of selected sites in the fault-crossing profiles (2005); 4) repeat surveys of the Marmara GPS network to continue to monitor postseismic deformation; 5) refining block models for the Marmara Sea seismic gap area to better understand earthquake hazards in the Greater Istanbul area; and 6) continuing development of models for afterslip and distributed viscoelastic deformation for the earthquake cycle. We are keeping close contact with MIT colleagues (Brad Hager and Eric Hetland) who are developing models for S. California and for the earthquake cycle in general (Hetland, 2006). In addition, our Turkish partners at the Marmara Research Center have undertaken repeat micro-gravity measurements at the MAGNET sites and have provided us estimates of gravity change during the period 2003-2005.

  19. Earthquake Source Parameters Inferred from T-Wave Observations

    NASA Astrophysics Data System (ADS)

    Perrot, J.; Dziak, R.; Lau, T. A.; Matsumoto, H.; Goslin, J.

    2004-12-01

    The seismicity of the North Atlantic Ocean has been recorded by two networks of autonomous hydrophones moored within the SOFAR channel on the flanks of the Mid-Atlantic Ridge (MAR). In February 1999, a consortium of U.S. investigators (NSF and NOAA) deployed a 6-element hydrophone array for long-term monitoring of MAR seismicity between 15°N and 35°N, south of the Azores. In May 2002, an international collaboration of French, Portuguese, and U.S. researchers deployed a 6-element hydrophone array north of the Azores Plateau, from 40°N to 50°N. The northern network (referred to as SIRENA) was recovered in September 2003. The low attenuation of the SOFAR channel for earthquake T-wave propagation reduces the detection threshold from a magnitude completeness level (Mc) of ~4.7 for MAR events recorded by land-based seismic networks to Mc = 3.0 using the hydrophone arrays. Detailed focal depth and mechanism information, however, remain elusive due to the complexities of seismo-acoustic propagation paths. Nonetheless, recent analyses (Dziak, 2001; Park and Odom, 2001) indicate that fault parameter information is contained within the T-wave signal packet. We investigate this relationship further by comparing an earthquake's T-wave duration and acoustic energy to seismic magnitude (NEIC) and radiation pattern (for events M>5) from the Harvard moment-tensor catalog. First results show that earthquake energy is well represented by the acoustic energy of the T-waves; however, T-wave codas are significantly influenced by acoustic propagation effects and do not allow a direct determination of the seismic magnitude of the earthquakes. Second, there appears to be a correlation between T-wave acoustic energy, azimuth from the earthquake source to the hydrophone, and the radiation pattern of the earthquake's SH waves. These preliminary results indicate that there is a relationship between the T-wave observations and earthquake source parameters, allowing for additional insights into T
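
    T-wave acoustic energy and duration metrics of the kind compared against magnitude can be computed as simple window integrals over the hydrophone record; the 5% duration threshold below is an illustrative choice, not the authors' definition:

```python
def acoustic_energy(samples, dt_s):
    """Time-integrated squared amplitude over a T-wave window,
    proportional to the acoustic energy received at the hydrophone."""
    return sum(x * x for x in samples) * dt_s

def twave_duration(samples, dt_s, frac=0.05):
    """Duration between the first and last samples whose squared
    amplitude exceeds frac times the peak squared amplitude."""
    p = [x * x for x in samples]
    threshold = frac * max(p)
    idx = [i for i, v in enumerate(p) if v >= threshold]
    return (idx[-1] - idx[0]) * dt_s
```

    Regressing such energy estimates against catalog magnitudes is the kind of comparison the abstract describes; the coda-driven duration measure is the quantity the propagation effects contaminate.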

  20. Seismic Velocity Measurements at Expanded Seismic Network Sites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woolery, Edward W; Wang, Zhenming

    2005-01-01

    Structures at the Paducah Gaseous Diffusion Plant (PGDP), as well as at other locations in the northern Jackson Purchase of western Kentucky, may be subjected to large far-field earthquake ground motions from the New Madrid seismic zone, as well as those from small and moderate-sized local events. The ground motion to which a particular structure is exposed in such an event will be a consequence of the earthquake magnitude, the structure's proximity to the event, and the dynamic and geometrical characteristics of the thick soils upon which such structures are, of necessity, constructed. This investigation evaluated the latter. Downhole and surface (i.e., refraction and reflection) seismic velocity data were collected at the Kentucky Seismic and Strong-Motion Network expansion sites in the vicinity of the Paducah Gaseous Diffusion Plant (PGDP) to define the dynamic properties of the deep sediment overburden that can produce modifying effects on earthquake waves. These effects are manifested as modifications of the earthquake waves' amplitude, frequency, and duration. Each of these three ground motion manifestations is also fundamental to the assessment of secondary earthquake engineering hazards such as liquefaction.
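
    Velocity profiles like these are commonly reduced to a travel-time-averaged velocity such as Vs30 for engineering site characterization; a sketch with illustrative (not site-specific) layer values:

```python
def travel_time_average_vs(thicknesses_m, velocities_m_s, depth_m=30.0):
    """Travel-time-averaged shear-wave velocity to a given depth
    (Vs30 when depth_m is 30): depth / sum(h_i / v_i) over the layers
    down to that depth."""
    t, z = 0.0, 0.0
    for h, v in zip(thicknesses_m, velocities_m_s):
        use = min(h, depth_m - z)   # clip the last layer at the target depth
        t += use / v
        z += use
        if z >= depth_m:
            break
    if z < depth_m:
        raise ValueError("profile shallower than requested depth")
    return depth_m / t
```

    Low travel-time-averaged velocities over thick sediments are exactly the condition that amplifies ground motion and raises liquefaction susceptibility.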

  1. A parallel implementation of the network identification by multiple regression (NIR) algorithm to reverse-engineer regulatory gene networks.

    PubMed

    Gregoretti, Francesco; Belcastro, Vincenzo; di Bernardo, Diego; Oliva, Gennaro

    2010-04-21

    The reverse engineering of gene regulatory networks using gene expression profile data has become crucial for gaining novel biological knowledge. Large amounts of data that need to be analyzed are currently being produced due to advances in microarray technologies. Using current reverse-engineering algorithms to analyze large data sets can be very computationally intensive. These emerging computational requirements can be met using parallel computing techniques. It has been shown that the Network Identification by multiple Regression (NIR) algorithm performs better than other ready-to-use reverse-engineering software. However, it cannot be used with large networks of thousands of nodes, as is the case for biological networks, due to its high time and space complexity. In this work we overcome this limitation by designing and developing a parallel version of the NIR algorithm. The new implementation of the algorithm reaches very good accuracy even for large gene networks, improving our understanding of gene regulatory networks, which is crucial for a wide range of biomedical applications.
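
    The reason NIR parallelizes well is that each row of the network matrix is an independent least-squares problem. A schematic of that per-gene regression (not the authors' code; the normal-equations solver is adequate only for the small systems solved per gene):

```python
def solve_normal_equations(X, y):
    """Ordinary least squares by forming X^T X w = X^T y and solving
    with Gaussian elimination (fine for small per-gene systems)."""
    m, n = len(X), len(X[0])
    A = [[sum(X[k][i] * X[k][j] for k in range(m)) for j in range(n)]
         for i in range(n)]
    b = [sum(X[k][i] * y[k] for k in range(m)) for i in range(n)]
    for col in range(n):                      # elimination with partial pivoting
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    w = [0.0] * n
    for i in range(n - 1, -1, -1):
        w[i] = (b[i] - sum(A[i][j] * w[j] for j in range(i + 1, n))) / A[i][i]
    return w

def network_rows(X, Y):
    """One regression per gene; this loop is the unit of work a parallel
    implementation distributes across processors, since rows share no state."""
    return [solve_normal_equations(X, y) for y in Y]
```

    Because the rows share only the read-only design matrix, distributing them across workers needs no synchronization, which is why the speedup scales well with the number of genes.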

  2. Reflections from the interface between seismological research and earthquake risk reduction

    NASA Astrophysics Data System (ADS)

    Sargeant, S.

    2012-04-01

    Scientific understanding of earthquakes and their attendant hazards is vital for the development of effective earthquake risk reduction strategies. Within the global disaster reduction policy framework (the Hyogo Framework for Action, overseen by the UN International Strategy for Disaster Reduction), the anticipated role of science and scientists is clear with respect to risk assessment, loss estimation, space-based observation, early warning and forecasting. The importance of information sharing and cooperation, cross-disciplinary networks, and developing technical and institutional capacity for effective disaster management is also highlighted. In practice, the degree to which seismological information is successfully delivered to and applied by individuals, groups or organisations working to manage or reduce the risk from earthquakes is variable. The challenge for scientists is to provide fit-for-purpose information that can be integrated simply into decision-making and risk reduction activities at all levels of governance and at different geographic scales, often by a non-technical audience (i.e. people without any seismological or earthquake engineering training). The interface between seismological research and earthquake risk reduction (defined here in terms of both the relationship between the science and its application, and between the scientist and other risk stakeholders) is complex. This complexity is a function of a range of issues relating to communication, multidisciplinary working, politics, organisational practices, inter-organisational collaboration, working practices, sectoral cultures, individual and organisational values, worldviews and expectations. These factors can present significant obstacles to scientific information being incorporated into the decision-making process. The purpose of this paper is to present some personal reflections on the nature of the interface between the worlds of seismological research and risk reduction, and the

  3. Earthquake alarm; operating the seismograph station at the University of California, Berkeley.

    USGS Publications Warehouse

    Stump, B.

    1980-01-01

    At the University of California seismographic stations, the task of locating and determining magnitudes for both local and distant earthquakes is a continuous one. Teleseisms must be located rapidly so that events that occur in the Pacific can be identified and the Pacific Tsunami Warning System alerted. For great earthquakes anywhere, there is a responsibility to notify public agencies such as the California Office of Emergency Services, the Federal Disaster Assistance Administration, the Earthquake Engineering Research Institute, the California Seismic Safety Commission, and the American Red Cross. In the case of damaging local earthquakes, it is also necessary to alert the California Department of Water Resources, California Division of Mines and Geology, U.S. Army Corps of Engineers, Federal Bureau of Reclamation, and the Bay Area Rapid Transit. These days, any earthquakes that are felt in northern California cause immediate inquiries from the news media and an interested public. The series of earthquakes that jolted the Livermore area from January 24 to 26, 1980, is a good case in point.

  4. Leveraging geodetic data to reduce losses from earthquakes

    USGS Publications Warehouse

    Murray, Jessica R.; Roeloffs, Evelyn A.; Brooks, Benjamin A.; Langbein, John O.; Leith, William S.; Minson, Sarah E.; Svarc, Jerry L.; Thatcher, Wayne R.

    2018-04-23

    Seismic hazard assessments that are based on a variety of data and the best available science, coupled with rapid synthesis of real-time information from continuous monitoring networks to guide post-earthquake response, form a solid foundation for effective earthquake loss reduction. With this in mind, the Earthquake Hazards Program (EHP) of the U.S. Geological Survey (USGS) Natural Hazards Mission Area (NHMA) engages in a variety of undertakings, both established and emergent, in order to provide high quality products that enable stakeholders to take action in advance of and in response to earthquakes. Examples include the National Seismic Hazard Model (NSHM), development of tools for improved situational awareness such as earthquake early warning (EEW) and operational earthquake forecasting (OEF), research about induced seismicity, and new efforts to advance comprehensive subduction zone science and monitoring. Geodetic observations provide unique and complementary information directly relevant to advancing many aspects of these efforts (fig. 1). EHP scientists have long leveraged geodetic data for a range of influential studies, and they continue to develop innovative observation and analysis methods that push the boundaries of the field of geodesy as applied to natural hazards research. Given the ongoing, rapid improvement in availability, variety, and precision of geodetic measurements, considering ways to fully utilize this observational resource for earthquake loss reduction is timely and essential. 
This report presents strategies, and the underlying scientific rationale, by which the EHP could achieve the following outcomes: The EHP is an authoritative source for the interpretation of geodetic data and its use for earthquake loss reduction throughout the United States and its territories. The USGS consistently provides timely, high-quality geodetic data to stakeholders. Significant earthquakes are better characterized by incorporating geodetic data into USGS

  5. Reconnaissance engineering geology of Sitka and vicinity, Alaska, with emphasis on evaluation of earthquake and other geologic hazards

    USGS Publications Warehouse

    Yehle, Lynn A.

    1974-01-01

    A program to study the engineering geology of most of the larger Alaska coastal communities and to evaluate their earthquake and other geologic hazards was started following the 1964 Alaska earthquake; this report about Sitka and vicinity is a product of that program. Field-study methods were of a reconnaissance nature, and thus the interpretations in the report are subject to revision as further information becomes available. This report can provide broad geologic guidelines for planners and engineers during preparation of land-use plans. The use of this information should lead to minimizing future loss of life and property due to geologic hazards, especially during very large earthquakes. The landscape of Sitka and the surrounding area is characterized by numerous islands and a narrow strip of gently rolling ground adjacent to rugged mountains; steep valleys and some fiords cut sharply into the mountains. A few valley floors are wide and flat and grade into moderate-sized deltas. Glaciers throughout southeastern Alaska and elsewhere became vastly enlarged during the Pleistocene Epoch. The Sitka area presumably was covered by ice several times; glaciers deeply eroded some valleys and removed fractured bedrock along some faults. The last major deglaciation occurred sometime before 10,000 years ago. Crustal rebound believed to be related to glacial melting caused land emergence at Sitka of at least 35 feet (10.7 m) relative to present sea level. Bedrock at Sitka and vicinity is composed mostly of bedded, hard, dense graywacke and some argillite. Beds strike predominantly northwest and are vertical or steeply dipping. Locally, bedded rocks are cut by dikes of fine-grained igneous rock. Most bedrock is of Jurassic and Cretaceous age. Eight types of surficial deposits of Quaternary age were recognized. Below altitudes of 35 feet (10.7 m), the dominant deposits are those of modern and elevated shores and deltas; at higher altitudes, widespread muskeg overlies a mantle of

  6. Correlations between ground motion and building damage. Engineering intensity scale applied to the San Fernando earthquake of February 1971

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hafen, D.; Kintzer, F.C.

    1977-11-01

    The correlation between ground motion and building damage was investigated for the San Fernando earthquake of 1971. A series of iso-intensity maps was compiled to summarize the ground motion in terms of the Blume Engineering Intensity Scale (EIS). This involved the analysis of ground motion records from 62 stations in the Los Angeles area. Damage information for low-rise buildings was obtained in the form of records of loans granted by the Small Business Administration to repair earthquake damage. High-rise damage evaluations were based on direct inquiry and building inspection. Damage factors (ratio of damage repair cost to building value) were calculated and summarized on contour maps. A statistical study was then undertaken to determine relationships between ground motion and damage factor. Several parameters for ground motion were considered and evaluated by means of correlation coefficients.

  7. MyShake - A smartphone app to detect earthquake

    NASA Astrophysics Data System (ADS)

    Kong, Q.; Allen, R. M.; Schreier, L.; Kwon, Y. W.

    2015-12-01

    We designed an Android app that harnesses the accelerometers in personal smartphones to record earthquake-shaking data for research, hazard information, and warnings. The app distinguishes earthquake shaking from daily human activity based on the different patterns behind the movements. It can also be triggered by the traditional earthquake early warning (EEW) system to record for a certain amount of time to collect earthquake data. When the app is triggered by earthquake-like movement, it sends the trigger information, containing the time and location of the trigger, back to our server; at the same time, it stores the waveform data on the local phone first and uploads it to our server later. Trigger information from multiple phones is processed in real time on the server to find coherent signals that confirm earthquakes. The app therefore provides the basis for a smartphone seismic network that can detect earthquakes and even provide warnings. A planned public roll-out of MyShake could collect millions of seismic recordings for large earthquakes in many regions around the world.
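
    The shaking-detection idea described above can be illustrated with a generic short-term-average/long-term-average (STA/LTA) trigger on an acceleration trace. This is a standard, simplified detection scheme for sketching the concept, not MyShake's actual classifier (which uses waveform features to separate earthquakes from human activity); all parameter values below are illustrative.

    ```python
    import numpy as np

    def sta_lta_trigger(accel, fs, sta_win=1.0, lta_win=10.0, threshold=4.0):
        """Return sample indices where the STA/LTA energy ratio exceeds threshold."""
        n_sta = int(sta_win * fs)
        n_lta = int(lta_win * fs)
        energy = accel.astype(float) ** 2
        csum = np.concatenate(([0.0], np.cumsum(energy)))
        triggers = []
        for i in range(n_lta, len(accel) - n_sta):
            sta = (csum[i + n_sta] - csum[i]) / n_sta       # short window ahead of i
            lta = (csum[i] - csum[i - n_lta]) / n_lta       # long window behind i
            if lta > 0 and sta / lta > threshold:
                triggers.append(i)
        return triggers

    # Quiet sensor noise followed by a burst of strong shaking:
    fs = 25  # Hz, a plausible smartphone accelerometer rate
    rng = np.random.default_rng(0)
    trace = np.concatenate([0.01 * rng.standard_normal(30 * fs),
                            0.5 * rng.standard_normal(5 * fs)])
    print(len(sta_lta_trigger(trace, fs)) > 0)  # the burst is detected
    ```

    A real deployment would add feature-based screening so that phone handling (a single sharp spike) is not confused with sustained earthquake shaking.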

  8. Potential for a large earthquake near Los Angeles inferred from the 2014 La Habra earthquake.

    PubMed

    Donnellan, Andrea; Grant Ludwig, Lisa; Parker, Jay W; Rundle, John B; Wang, Jun; Pierce, Marlon; Blewitt, Geoffrey; Hensley, Scott

    2015-09-01

    Tectonic motion across the Los Angeles region is distributed across an intricate network of strike-slip and thrust faults that will be released in destructive earthquakes similar to or larger than the 1933 M 6.4 Long Beach and 1994 M 6.7 Northridge events. Here we show that Los Angeles regional thrust, strike-slip, and oblique faults are connected and move concurrently with measurable surface deformation, even in moderate magnitude earthquakes, as part of a fault system that accommodates north-south shortening and westerly tectonic escape of northern Los Angeles. The 28 March 2014 M 5.1 La Habra earthquake occurred on a northeast striking, northwest dipping left-lateral oblique thrust fault northeast of Los Angeles. We present crustal deformation observations spanning the earthquake showing that concurrent deformation occurred on several structures in the shallow crust. The seismic moment of the earthquake is 82% of the total geodetic moment released. Slip within the unconsolidated upper sedimentary layer may reflect shallow release of accumulated strain on still-locked deeper structures. A future M 6.1-6.3 earthquake would account for the accumulated strain. Such an event could occur on any one or several of these faults, which may not have been identified by geologic surface mapping.

  9. Potential for a large earthquake near Los Angeles inferred from the 2014 La Habra earthquake

    PubMed Central

    Grant Ludwig, Lisa; Parker, Jay W.; Rundle, John B.; Wang, Jun; Pierce, Marlon; Blewitt, Geoffrey; Hensley, Scott

    2015-01-01

    Tectonic motion across the Los Angeles region is distributed across an intricate network of strike‐slip and thrust faults that will be released in destructive earthquakes similar to or larger than the 1933 M6.4 Long Beach and 1994 M6.7 Northridge events. Here we show that Los Angeles regional thrust, strike‐slip, and oblique faults are connected and move concurrently with measurable surface deformation, even in moderate magnitude earthquakes, as part of a fault system that accommodates north‐south shortening and westerly tectonic escape of northern Los Angeles. The 28 March 2014 M5.1 La Habra earthquake occurred on a northeast striking, northwest dipping left‐lateral oblique thrust fault northeast of Los Angeles. We present crustal deformation observations spanning the earthquake showing that concurrent deformation occurred on several structures in the shallow crust. The seismic moment of the earthquake is 82% of the total geodetic moment released. Slip within the unconsolidated upper sedimentary layer may reflect shallow release of accumulated strain on still‐locked deeper structures. A future M6.1–6.3 earthquake would account for the accumulated strain. Such an event could occur on any one or several of these faults, which may not have been identified by geologic surface mapping. PMID:27981074

  10. Prompt Assessment of Global Earthquakes for Response (PAGER): A System for Rapidly Determining the Impact of Earthquakes Worldwide

    USGS Publications Warehouse

    Earle, Paul S.; Wald, David J.; Jaiswal, Kishor S.; Allen, Trevor I.; Hearne, Michael G.; Marano, Kristin D.; Hotovec, Alicia J.; Fee, Jeremy

    2009-01-01

    Within minutes of a significant earthquake anywhere on the globe, the U.S. Geological Survey (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system assesses its potential societal impact. PAGER automatically estimates the number of people exposed to severe ground shaking and the shaking intensity at affected cities. Accompanying maps of the epicentral region show the population distribution and estimated ground-shaking intensity. A regionally specific comment describes the inferred vulnerability of the regional building inventory and, when available, lists recent nearby earthquakes and their effects. PAGER's results are posted on the USGS Earthquake Program Web site (http://earthquake.usgs.gov/), consolidated in a concise one-page report, and sent in near real-time to emergency responders, government agencies, and the media. Both rapid and accurate results are obtained through manual and automatic updates of PAGER's content in the hours following significant earthquakes. These updates incorporate the most recent estimates of earthquake location, magnitude, faulting geometry, and first-hand accounts of shaking. PAGER relies on a rich set of earthquake analysis and assessment tools operated by the USGS and contributing Advanced National Seismic System (ANSS) regional networks. A focused research effort is underway to extend PAGER's near real-time capabilities beyond population exposure to quantitative estimates of fatalities, injuries, and displaced population.
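
    The population-exposure step described above can be sketched as a grid overlay: sum the population falling in each shaking-intensity bin. The grids, bin edges, and numbers below are invented for illustration and are not PAGER's actual data or implementation.

    ```python
    import numpy as np

    def exposure_by_intensity(population, intensity, bins=(4, 5, 6, 7, 8, 9)):
        """Sum population whose modeled intensity falls in each MMI bin [b, b+1)."""
        exposure = {}
        for b in bins:
            mask = (intensity >= b) & (intensity < b + 1)
            exposure[b] = float(population[mask].sum())
        return exposure

    # Hypothetical 2x2 co-registered grids: population counts and modeled MMI.
    population = np.array([[1000, 5000], [200, 80000]])
    intensity = np.array([[5.2, 6.8], [4.1, 7.5]])
    print(exposure_by_intensity(population, intensity))
    # {4: 200.0, 5: 1000.0, 6: 5000.0, 7: 80000.0, 8: 0.0, 9: 0.0}
    ```

    In practice the intensity grid would come from a ShakeMap-style ground-motion model and the population grid from a gridded census product, both at much finer resolution.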

  11. Dynamic strains for earthquake source characterization

    USGS Publications Warehouse

    Barbour, Andrew J.; Crowell, Brendan W

    2017-01-01

    Strainmeters measure elastodynamic deformation associated with earthquakes over a broad frequency band, with detection characteristics that complement traditional instrumentation, but they are commonly used to study slow transient deformation along active faults and at subduction zones, for example. Here, we analyze dynamic strains at Plate Boundary Observatory (PBO) borehole strainmeters (BSM) associated with 146 local and regional earthquakes from 2004–2014, with magnitudes from M 4.5 to 7.2. We find that peak values in seismic strain can be predicted from a general regression against distance and magnitude, with improvements in accuracy gained by accounting for biases associated with site–station effects and source–path effects, the latter exhibiting the strongest influence on the regression coefficients. To account for the influence of these biases in a general way, we include crustal‐type classifications from the CRUST1.0 global velocity model, which demonstrates that high‐frequency strain data from the PBO BSM network carry information on crustal structure and fault mechanics: earthquakes nucleating offshore on the Blanco fracture zone, for example, generate consistently lower dynamic strains than earthquakes around the Sierra Nevada microplate and in the Salton trough. Finally, we test our dynamic strain prediction equations on the 2011 M 9 Tohoku‐Oki earthquake, specifically continuous strain records derived from triangulation of 137 high‐rate Global Navigation Satellite System Earth Observation Network stations in Japan. Moment magnitudes inferred from these data and the strain model are in agreement when Global Positioning System subnetworks are unaffected by spatial aliasing.
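
    A minimal sketch of the kind of regression described above: predicting the logarithm of peak dynamic strain from magnitude and distance by ordinary least squares. The functional form, coefficients, and synthetic "observations" below are assumptions for illustration, not the paper's published values.

    ```python
    import numpy as np

    # Synthetic catalog: magnitudes and hypocentral distances (km).
    rng = np.random.default_rng(1)
    M = rng.uniform(4.5, 7.2, 100)
    R = rng.uniform(20, 500, 100)

    # Assumed ground-truth model: log10(peak strain) = a + b*M + c*log10(R),
    # with observational scatter added on top.
    true = -9.0 + 1.0 * M - 1.5 * np.log10(R)
    obs = true + 0.2 * rng.standard_normal(100)

    # Design matrix: intercept, magnitude, log10(distance).
    A = np.column_stack([np.ones_like(M), M, np.log10(R)])
    coef, *_ = np.linalg.lstsq(A, obs, rcond=None)
    print(np.round(coef, 1))  # recovers roughly [-9.0, 1.0, -1.5]
    ```

    The site and path biases discussed in the abstract would enter this framework as additional terms (e.g. per-station and per-crustal-type offsets) in the design matrix.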

  12. Napa Earthquake impact on water systems

    NASA Astrophysics Data System (ADS)

    Wang, J.

    2014-12-01

    The magnitude 6.0 South Napa earthquake occurred in Napa, California, on August 24, 2014, at 3 a.m. local time. It was the largest earthquake in the San Francisco Bay Area since the 1989 Loma Prieta earthquake, and economic losses topped $1 billion. Winemakers cleaned up and estimated the damage to tourism; around 15,000 cases of cabernet poured into the garden at the Hess Collection. Earthquakes can raise water-pollution risks and could cause a water crisis. California has suffered water shortages in recent years, so understanding how to prevent groundwater and surface-water pollution from earthquakes would be helpful. This research gives a clear view of the drinking-water system in California and pollution of river systems, as well as an estimate of earthquake impacts on water supply. The Sacramento-San Joaquin River Delta (close to Napa) is the center of the state's water-distribution system, delivering fresh water to more than 25 million residents and 3 million acres of farmland. Delta water conveyed through a network of levees is crucial to Southern California. The drought has significantly curtailed water exports, and saltwater intrusion has reduced freshwater outflows. Strong shaking from a nearby earthquake can liquefy saturated, loose, sandy soils and could potentially damage major Delta levee systems near Napa. The Napa earthquake is a wake-up call for Southern California: it could potentially damage the freshwater supply system.

  13. Nitsche Extended Finite Element Methods for Earthquake Simulation

    NASA Astrophysics Data System (ADS)

    Coon, Ethan T.

    Modeling earthquakes and geologically short-time-scale events on fault networks is a difficult problem with important implications for human safety and design. These problems demonstrate a rich physical behavior, in which distributed loading localizes both spatially and temporally into earthquakes on fault systems. This localization is governed by two aspects: friction and fault geometry. Computationally, these problems provide a stern challenge for modelers --- static and dynamic equations must be solved on domains with discontinuities on complex fault systems, and frictional boundary conditions must be applied on these discontinuities. The most difficult aspect of modeling physics on complicated domains is the mesh. Most numerical methods involve meshing the geometry; nodes are placed on the discontinuities, and edges are chosen to coincide with faults. The resulting mesh is highly unstructured, making the derivation of finite difference discretizations difficult. Therefore, most models use the finite element method. Standard finite element methods place requirements on the mesh for the sake of stability, accuracy, and efficiency. The formation of a mesh which both conforms to fault geometry and satisfies these requirements is an open problem, especially for three dimensional, physically realistic fault geometries. In addition, if the fault system evolves over the course of a dynamic simulation (i.e. in the case of growing cracks or breaking new faults), the geometry must be re-meshed at each time step. This can be expensive computationally. The fault-conforming approach is undesirable when complicated meshes are required, and impossible to implement when the geometry is evolving. Therefore, meshless and hybrid finite element methods that handle discontinuities without placing them on element boundaries are a desirable and natural way to discretize these problems.
Several such methods are being actively developed for use in engineering mechanics involving crack

  14. Earthquake Hazard in the Heart of the Homeland

    USGS Publications Warehouse

    Gomberg, Joan; Schweig, Eugene

    2007-01-01

    Evidence that earthquakes threaten the Mississippi, Ohio, and Wabash River valleys of the Central United States abounds. In fact, several of the largest historical earthquakes to strike the continental United States occurred in the winter of 1811-1812 along the New Madrid seismic zone, which stretches from just west of Memphis, Tenn., into southern Illinois. Several times in the past century, moderate earthquakes have been widely felt in the Wabash Valley seismic zone along the southern border of Illinois and Indiana. Throughout the region, between 150 and 200 earthquakes are recorded annually by a network of monitoring instruments, although most are too small to be felt by people. Geologic evidence for prehistoric earthquakes throughout the region has been mounting since the late 1970s. But how significant is the threat? How likely are large earthquakes and, more importantly, what is the chance that the shaking they cause will be damaging?

  15. The Global Earthquake Model and Disaster Risk Reduction

    NASA Astrophysics Data System (ADS)

    Smolka, A. J.

    2015-12-01

    Advanced, reliable and transparent tools and data to assess earthquake risk are inaccessible to most, especially in less developed regions of the world, while few, if any, globally accepted standards currently allow a meaningful comparison of risk between places. The Global Earthquake Model (GEM) is a collaborative effort that aims to provide models, datasets and state-of-the-art tools for transparent assessment of earthquake hazard and risk. As part of this goal, GEM and its global network of collaborators have developed the OpenQuake engine (an open-source software for hazard and risk calculations), the OpenQuake platform (a web-based portal making GEM's resources and datasets freely available to all potential users), and a suite of tools to support modelers and other experts in the development of hazard, exposure and vulnerability models. These resources are being used extensively across the world in hazard and risk assessment, from individual practitioners to local and national institutions, and in regional projects to inform disaster risk reduction. Practical examples of how GEM is bridging the gap between science and disaster risk reduction are: - Several countries including Switzerland, Turkey, Italy, Ecuador, Papua New Guinea and Taiwan (with more to follow) are computing national seismic hazard using the OpenQuake engine. In some cases these results are used for the definition of actions in building codes. - Technical support, tools and data for the development of hazard, exposure, vulnerability and risk models for regional projects in South America and Sub-Saharan Africa. - Going beyond physical risk, GEM's scorecard approach evaluates local resilience by bringing together neighborhood/community leaders and the risk reduction community as a basis for designing risk reduction programs at various levels of geography. Actual case studies are Lalitpur in the Kathmandu Valley in Nepal and Quito, Ecuador. In agreement with GEM's collaborative approach, all

  16. Community Seismic Network (CSN)

    NASA Astrophysics Data System (ADS)

    Clayton, R. W.; Heaton, T. H.; Kohler, M. D.; Cheng, M.; Guy, R.; Chandy, M.; Krause, A.; Bunn, J.; Olson, M.; Faulkner, M.; Liu, A.; Strand, L.

    2012-12-01

    We report on developments in sensor connectivity, architecture, and data fusion algorithms executed in Cloud computing systems in the Community Seismic Network (CSN), a network of low-cost sensors housed in homes and offices by volunteers in the Pasadena, CA area. The network has over 200 sensors continuously reporting anomalies in local acceleration through the Internet to a Cloud computing service (the Google App Engine) that continually fuses sensor data to rapidly detect shaking from earthquakes. The Cloud computing system consists of data centers geographically distributed across the continent and is likely to be resilient even during earthquakes and other local disasters. The region of Southern California is partitioned in a multi-grid style into sets of telescoping cells called geocells. Data streams from sensors within a geocell are fused to detect anomalous shaking across the geocell. Temporal and spatial patterns across geocells are used to detect anomalies across regions. The challenge is to detect earthquakes rapidly with an extremely low false positive rate. We report on two data fusion algorithms, one that tessellates the surface so as to fuse data from a large region around Pasadena and the other, which uses a standard tessellation of equal-sized cells. Since September 2011, the network has successfully detected earthquakes of magnitude 2.5 or higher within 40 km of Pasadena. In addition to the standard USB device, which connects to the host's computer, we have developed a stand-alone sensor that directly connects to the Internet via Ethernet or Wi-Fi. This bypasses security concerns that some companies have with the USB-connected devices, and allows for 24/7 monitoring at sites that would otherwise shut down their computers after working hours. In buildings we use the sensors to model the behavior of the structures during weak events in order to understand how they will perform during strong events.
Visualization models of instrumented buildings ranging
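
    The geocell-fusion idea described above can be sketched as binning per-sensor picks into grid cells and flagging a cell when enough distinct sensors trigger within a short time window. The cell size, window length, and sensor threshold below are invented for illustration and are not CSN's actual parameters.

    ```python
    from collections import defaultdict

    def fuse_picks(picks, cell_deg=0.1, window_s=5.0, min_sensors=3):
        """picks: list of (sensor_id, lat, lon, t). Return flagged (cell, t0) pairs."""
        by_cell = defaultdict(list)
        for sensor, lat, lon, t in picks:
            cell = (int(lat // cell_deg), int(lon // cell_deg))  # geocell index
            by_cell[cell].append((t, sensor))
        flagged = []
        for cell, events in by_cell.items():
            events.sort()
            for t0, _ in events:
                # Distinct sensors triggering within [t0, t0 + window_s]:
                sensors = {s for t, s in events if t0 <= t <= t0 + window_s}
                if len(sensors) >= min_sensors:
                    flagged.append((cell, t0))
                    break
        return flagged

    # Three nearby picks (hypothetical Pasadena-area sensors) plus one outlier:
    picks = [("a", 34.15, -118.13, 0.0), ("b", 34.16, -118.14, 1.2),
             ("c", 34.14, -118.12, 2.0), ("d", 40.00, -120.00, 0.5)]
    print(fuse_picks(picks))  # one cell near Pasadena is flagged
    ```

    Requiring multiple distinct sensors per cell is what keeps the false positive rate low: an isolated pick from a single bumped sensor never flags a cell on its own.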

  17. Proceedings of the Regional Seminar on Earthquake Engineering (13th) Held in Istanbul, Turkey on 14-24 September 1987.

    DTIC Science & Technology

    1987-09-01

    Geological Survey, MS977, Menlo Park, CA 94025, USA. TURKISH NATIONAL COMMITTEE FOR EARTHQUAKE ENGINEERING, THIRTEENTH REGIONAL SEMINAR ON EARTHQUAKE...this case the conditional probability P(E/F1) will also depend in general on t. A simple example of a case of this type was developed by the present...These studies took into consideration all the available data concerning the dynamic characteristics of different types of buildings. A first attempt was

  18. Hybrid Neural-Network: Genetic Algorithm Technique for Aircraft Engine Performance Diagnostics Developed and Demonstrated

    NASA Technical Reports Server (NTRS)

    Kobayashi, Takahisa; Simon, Donald L.

    2002-01-01

    As part of the NASA Aviation Safety Program, a unique model-based diagnostics method that employs neural networks and genetic algorithms for aircraft engine performance diagnostics has been developed and demonstrated at the NASA Glenn Research Center against a nonlinear gas turbine engine model. Neural networks are applied to estimate the internal health condition of the engine, and genetic algorithms are used for sensor fault detection, isolation, and quantification. This hybrid architecture combines the excellent nonlinear estimation capabilities of neural networks with the capability to rank the likelihood of various faults given a specific sensor suite signature. The method requires a significantly smaller data training set than a neural network approach alone does, and it performs the combined engine health monitoring objectives of performance diagnostics and sensor fault detection and isolation in the presence of nominal and degraded engine health conditions.

  19. Earthquake Hazard and the Environmental Seismic Intensity (ESI) Scale

    NASA Astrophysics Data System (ADS)

    Serva, Leonello; Vittori, Eutizio; Comerci, Valerio; Esposito, Eliana; Guerrieri, Luca; Michetti, Alessandro Maria; Mohammadioun, Bagher; Mohammadioun, Georgianna C.; Porfido, Sabina; Tatevossian, Ruben E.

    2016-05-01

    The main objective of this paper was to introduce the Environmental Seismic Intensity scale (ESI), a new scale developed and tested by an interdisciplinary group of scientists (geologists, geophysicists and seismologists) in the frame of the International Union for Quaternary Research (INQUA) activities, to the widest community of earth scientists and engineers dealing with seismic hazard assessment. This scale defines earthquake intensity by taking into consideration the occurrence, size and areal distribution of earthquake environmental effects (EEE), including surface faulting, tectonic uplift and subsidence, landslides, rock falls, liquefaction, ground collapse and tsunami waves. Indeed, EEEs can significantly improve the evaluation of seismic intensity, which still remains a critical parameter for a realistic seismic hazard assessment, allowing comparison of historical and modern earthquakes. Moreover, as shown by recent moderate to large earthquakes, geological effects often cause severe damage; therefore, their consideration in the earthquake risk scenario is crucial for all stakeholders, especially urban planners, geotechnical and structural engineers, hazard analysts, civil protection agencies and insurance companies. The paper describes the background and construction principles of the scale and presents some case studies in different continents and tectonic settings to illustrate its relevant benefits. ESI is normally used together with traditional intensity scales, which, unfortunately, tend to saturate in the highest degrees. In such cases and in unpopulated areas, ESI offers a unique way of assessing a reliable earthquake intensity. Finally, and importantly, the ESI scale also provides a very convenient guideline for the survey of EEEs in earthquake-stricken areas, ensuring they are catalogued in a complete and homogeneous manner.

  20. Living with earthquakes - development and usage of earthquake-resistant construction methods in European and Asian Antiquity

    NASA Astrophysics Data System (ADS)

    Kázmér, Miklós; Major, Balázs; Hariyadi, Agus; Pramumijoyo, Subagyo; Ditto Haryana, Yohanes

    2010-05-01

    Earthquakes are among the most horrible events of nature due to unexpected occurrence, for which no spiritual means are available for protection. The only way of preserving life and property is applying earthquake-resistant construction methods. Ancient Greek architects of public buildings applied steel clamps embedded in lead casing to hold together columns and masonry walls during frequent earthquakes in the Aegean region. Elastic steel provided strength, while plastic lead casing absorbed minor shifts of blocks without fracturing rigid stone. Romans invented concrete and built all sizes of buildings as a single, inflexible unit. Masonry surrounding and decorating the concrete core of the wall did not bear load. Concrete resisted minor shaking, yielding only to forces higher than fracture limits. Roman building traditions survived the Dark Ages, and 12th century Crusader castles erected in earthquake-prone Syria survive until today in reasonably good condition. Concrete and steel clamping persisted side-by-side in the Roman Empire. Concrete was used for cheap construction as compared to building of masonry. Applying lead-encased steel increased costs, and was avoided whenever possible. Columns of the various forums in Italian Pompeii mostly lack steel fittings despite being situated in a well-known earthquake-prone area. Whether frequent recurrence of earthquakes in the Naples region was known to inhabitants of Pompeii might be a matter of debate. Seemingly the shock of the AD 62 earthquake was not enough to apply well-known protective engineering methods throughout the reconstruction of the city before the AD 79 volcanic catastrophe. An independent engineering tradition developed on the island of Java (Indonesia). The mortar-less construction technique of 8-9th century Hindu masonry shrines around Yogyakarta would allow scattering of blocks during earthquakes. To prevent dilapidation, an intricate mortise-and-tenon system was carved into adjacent faces of blocks. Only the

  1. The key role of eyewitnesses in rapid earthquake impact assessment

    NASA Astrophysics Data System (ADS)

    Bossu, Rémy; Steed, Robert; Mazet-Roux, Gilles; Roussel, Frédéric; Etivant, Caroline

    2014-05-01

    Uncertainties in rapid earthquake impact models are intrinsically large even when excluding potential indirect losses (fires, landslides, tsunami…). The reason is that they are based on several factors which are themselves difficult to constrain, such as the geographical distribution of shaking intensity, building type inventory and vulnerability functions. The difficulties can be illustrated by two boundary cases. For moderate (around M6) earthquakes, the size of the potential damage zone and the epicentral location uncertainty are of comparable dimension, about 10-15 km. When such an earthquake strikes close to an urban area, as in 1999 in Athens (M5.9), earthquake location uncertainties alone can lead to dramatically different impact scenarios. Furthermore, for moderate magnitudes, the overall impact is often controlled by individual accidents, as in 2002 in Molise, Italy (M5.7), in Bingol, Turkey (M6.4) in 2003, or in Christchurch, New Zealand (M6.3), where respectively 23 out of 30, 84 out of 176 and 115 out of 185 of the casualties perished in a single building failure. Contrastingly, for major earthquakes (M>7), the point source approximation is not valid anymore, and impact assessment requires knowing exactly where the seismic rupture took place, whether it was unilateral, bilateral etc., and this information is not readily available directly after the earthquake's occurrence. In-situ observations of actual impact provided by eyewitnesses can dramatically reduce impact model uncertainties. We will present the overall strategy developed at the EMSC, which comprises crowdsourcing and flashsourcing techniques, the development of citizen-operated seismic networks, and the use of social networks to engage with eyewitnesses within minutes of an earthquake occurrence. For instance, testimonies are collected through online questionnaires available in 32 languages and automatically processed in maps of effects. Geo-located pictures are collected and then

  2. Earthquake Catalogue of the Caucasus

    NASA Astrophysics Data System (ADS)

    Godoladze, T.; Gok, R.; Tvaradze, N.; Tumanova, N.; Gunia, I.; Onur, T.

    2016-12-01

    The Caucasus has a documented historical catalog stretching back to the beginning of the Christian era. Most of the largest historical earthquakes prior to the 19th century are assumed to have occurred on active faults of the Greater Caucasus. Important earthquakes include the Samtskhe earthquake of 1283 (Ms˜7.0, Io=9); the Lechkhumi-Svaneti earthquake of 1350 (Ms˜7.0, Io=9); and the Alaverdi earthquake of 1742 (Ms˜6.8, Io=9). Two significant historical earthquakes that may have occurred within the Javakheti plateau in the Lesser Caucasus are the Tmogvi earthquake of 1088 (Ms˜6.5, Io=9) and the Akhalkalaki earthquake of 1899 (Ms˜6.3, Io=8-9). Large earthquakes that occurred in the Caucasus within the period of instrumental observation are: Gori, 1920; Tabatskuri, 1940; Chkhalta, 1963; the Racha earthquake of 1991 (Ms=7.0), the largest event ever recorded in the region; the Barisakho earthquake of 1992 (M=6.5); and the Spitak earthquake of 1988 (Ms=6.9, 100 km south of Tbilisi), which killed over 50,000 people in Armenia. Recently, permanent broadband stations have been deployed across the region as part of the various national networks (Georgia (˜25 stations), Azerbaijan (˜35 stations), Armenia (˜14 stations)). The data from the last 10 years of observation provide an opportunity to perform modern, fundamental scientific investigations. In order to improve seismic data quality, a catalog of all instrumentally recorded earthquakes has been compiled by the IES (Institute of Earth Sciences/NSMC, Ilia State University) in the framework of the regional joint project (Armenia, Azerbaijan, Georgia, Turkey, USA) "Probabilistic Seismic Hazard Assessment (PSHA) in the Caucasus". The catalogue consists of more than 80,000 events. First arrivals of each earthquake of Mw>=4.0 have been carefully examined. To reduce calculation errors, we corrected arrivals from the seismic records. We improved locations of the events and recalculated moment magnitudes in order to obtain unified magnitude

  3. Protocol vulnerability detection based on network traffic analysis and binary reverse engineering.

    PubMed

    Wen, Shameng; Meng, Qingkun; Feng, Chao; Tang, Chaojing

    2017-01-01

    Network protocol vulnerability detection plays an important role in many domains, including protocol security analysis, application security, and network intrusion detection. In this study, by analyzing the general fuzzing method of network protocols, we propose a novel approach that combines network traffic analysis with the binary reverse engineering method. For network traffic analysis, the block-based protocol description language is introduced to construct test scripts, while the binary reverse engineering method employs the genetic algorithm with a fitness function designed to focus on code coverage. This combination leads to a substantial improvement in fuzz testing for network protocols. We build a prototype system and use it to test several real-world network protocol implementations. The experimental results show that the proposed approach detects vulnerabilities more efficiently and effectively than general fuzzing methods such as SPIKE.
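The coverage-guided genetic search described above can be sketched as follows. This is a toy illustration, not the authors' implementation: the byte-level "protocol" check, the fitness scoring, and all names are assumptions standing in for real binary instrumentation.

```python
import random

random.seed(0)

# Toy stand-in for instrumented code coverage: scores how far a candidate
# input penetrates a hypothetical parser (real fuzzers measure this via
# binary instrumentation, as in the paper's reverse-engineering step).
def coverage(data: bytes) -> int:
    score = sum(1 for i, c in enumerate(b"HDR") if i < len(data) and data[i] == c)
    if score == 3 and len(data) > 8:
        score += 1  # reached the hypothetical body-parsing block
    return score

def mutate(data: bytes) -> bytes:
    b = bytearray(data)
    b[random.randrange(len(b))] = random.randrange(256)
    return bytes(b)

def crossover(a: bytes, b: bytes) -> bytes:
    cut = random.randrange(1, min(len(a), len(b)))
    return a[:cut] + b[cut:]

def evolve(population, generations=200):
    """Genetic search with code coverage as the fitness function."""
    for _ in range(generations):
        population.sort(key=coverage, reverse=True)
        survivors = population[: len(population) // 2]  # elitist selection
        children = [mutate(crossover(random.choice(survivors),
                                     random.choice(survivors)))
                    for _ in survivors]
        population = survivors + children
    return max(population, key=coverage)

seeds = [bytes(random.randrange(256) for _ in range(12)) for _ in range(20)]
best = evolve(seeds)
```

Because the best half of each generation survives unchanged, the top coverage score never decreases, which is the property that lets coverage feedback steer the fuzzer toward deeper protocol states.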

  4. 75 FR 65385 - Agency Information Collection Activities: Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-22

    ... Earthquake Engineering Simulation (NEES). SUMMARY: In compliance with the requirement of section 3506(c)(2)(A... of the Network for Earthquake Engineering Simulation. Type of Information Collection Request: New... inform decision making regarding the future of NSF support for earthquake engineering research...

  5. Far-field tsunami of 2017 Mw 8.1 Tehuantepec, Mexico earthquake recorded by Chilean tide gauge network: Implications for tsunami warning systems

    NASA Astrophysics Data System (ADS)

    González-Carrasco, J. F.; Benavente, R. F.; Zelaya, C.; Núñez, C.; Gonzalez, G.

    2017-12-01

    The 2017 Mw 8.1 Tehuantepec earthquake generated a moderate tsunami, which was registered by the near-field tide gauge network, activating a tsunami threat state for Mexico issued by the Pacific Tsunami Warning Center (PTWC). In the case of Chile, the forecast of tsunami waves indicated amplitudes of less than 0.3 meters above tide level, advising an informative state of threat, without activation of evacuation procedures. Nevertheless, during sea level monitoring of the network we detected wave amplitudes (> 0.3 m) indicating a possible change of threat state. Finally, the Chilean National Tsunami Warning System (NTWS) maintained the informative level of threat based on mathematical filtering analysis of the sea level records. After the 2010 Mw 8.8 Maule earthquake, the NTWS has increased its observational capabilities to improve early response. The most important operational efforts have focused on strengthening the tide gauge network for the national area of responsibility. Furthermore, technological initiatives such as the Integrated Tsunami Prediction and Warning System (SIPAT) have segmented the area of responsibility into blocks to focus early warning and evacuation procedures on the most affected coastal areas, while maintaining an informative state for areas distant from a near-field earthquake. In the case of far-field events, the NTWS follows the recommendations proposed by the PTWC, including comprehensive monitoring of sea level records, such as tide gauges and DART (Deep-ocean Assessment and Reporting of Tsunamis) buoys, to evaluate the state of tsunami threat in the area of responsibility. The main objective of this work is to analyze the first-order physical processes involved in the far-field propagation and coastal impact of the tsunami, including implications for decision-making by the NTWS. To explore our main question, we construct a finite-fault model of the 2017 Mw 8.1 Tehuantepec earthquake. We employ the rupture model to simulate a transoceanic tsunami modeled by Neowave2D. We generate synthetic time series at

  6. Prospects for earthquake prediction and control

    USGS Publications Warehouse

    Healy, J.H.; Lee, W.H.K.; Pakiser, L.C.; Raleigh, C.B.; Wood, M.D.

    1972-01-01

    The San Andreas fault is viewed, according to the concepts of seafloor spreading and plate tectonics, as a transform fault that separates the Pacific and North American plates and along which relative movements of 2 to 6 cm/year have been taking place. The resulting strain can be released by creep, by earthquakes of moderate size, or (as near San Francisco and Los Angeles) by great earthquakes. Microearthquakes, as mapped by a dense seismograph network in central California, generally coincide with zones of the San Andreas fault system that are creeping. Microearthquakes are few and scattered in zones where elastic energy is being stored. Changes in the rate of strain, as recorded by tiltmeter arrays, have been observed before several earthquakes of about magnitude 4. Changes in fluid pressure may control timing of seismic activity and make it possible to control natural earthquakes by controlling variations in fluid pressure in fault zones. An experiment in earthquake control is underway at the Rangely oil field in Colorado, where the rates of fluid injection and withdrawal in experimental wells are being controlled. © 1972.

  7. Earthquake Swarm in Armutlu Peninsula, Eastern Marmara Region, Turkey

    NASA Astrophysics Data System (ADS)

    Yavuz, Evrim; Çaka, Deniz; Tunç, Berna; Serkan Irmak, T.; Woith, Heiko; Cesca, Simone; Lühr, Birger-Gottfried; Barış, Şerif

    2015-04-01

    The most active fault system of Turkey is the North Anatolian Fault Zone, which caused two large earthquakes in 1999. These two earthquakes affected the eastern Marmara region destructively. The unbroken part of the North Anatolian Fault Zone crosses the north of the Armutlu Peninsula in an east-west direction. This branch is also located quite close to Istanbul, a megacity with a high population and major economic and social importance. A new cluster of microseismic activity occurred in the direct vicinity southeast of the Yalova Termal area. Activity started on August 2, 2014 with a series of micro events; on August 3, 2014 a local magnitude 4.1 event occurred, and more than 1000 events followed until August 31, 2014. Thus we tentatively call this a swarm-like activity. Therefore, investigation of the micro-earthquake activity of the Armutlu Peninsula has become important for understanding the relationship between the occurrence of micro-earthquakes and the tectonic structure of the region. For these reasons, the Armutlu Network (ARNET), installed at the end of 2005, currently equipped with 27 active seismic stations, and operated by the Kocaeli University Earth and Space Sciences Research Center (ESSRC) and the Helmholtz-Zentrum Potsdam Deutsches GeoForschungsZentrum (GFZ), is a very dense network able to record even micro-earthquakes in this region. In the 30-day period of August 2 to 31, 2014, the Kandilli Observatory and Earthquake Research Institute (KOERI) announced 120 local earthquakes with magnitudes ranging between 0.7 and 4.1, but ARNET provided more than 1000 earthquakes for analysis in the same time period. In this study, earthquakes of the swarm area and neighboring regions determined by ARNET were investigated. The focal mechanism of the August 3, 2014 22:22:42 (GMT) earthquake with local magnitude (Ml) 4.0 was obtained by moment tensor solution. The solution indicates normal faulting with a dextral component. The obtained focal mechanism solution is

  8. Real-time earthquake source imaging: An offline test for the 2011 Tohoku earthquake

    NASA Astrophysics Data System (ADS)

    Zhang, Yong; Wang, Rongjiang; Zschau, Jochen; Parolai, Stefano; Dahm, Torsten

    2014-05-01

    In recent decades, great efforts have been expended in real-time seismology aiming at earthquake and tsunami early warning. One of the most important issues is the real-time assessment of earthquake rupture processes using near-field seismogeodetic networks. Currently, earthquake early warning systems are mostly based on rapid estimates of P-wave magnitude, which generally contain large uncertainties and suffer from the known saturation problem. In the case of the 2011 Mw 9.0 Tohoku earthquake, JMA (Japan Meteorological Agency) released the first warning of the event with M7.2 after 25 s. The following updates of the magnitude even decreased to M6.3-6.6. Finally, the magnitude estimate stabilized at M8.1 after about two minutes. This consequently led to underestimated tsunami heights. Using the newly developed Iterative Deconvolution and Stacking (IDS) method for automatic source imaging, we demonstrate an offline test of the real-time analysis of the strong-motion and GPS seismograms of the 2011 Tohoku earthquake. The results show that it would theoretically have been possible to image the complex rupture process of the 2011 Tohoku earthquake automatically soon after, or even during, the rupture process. In general, what had happened on the fault could be robustly imaged with a time delay of about 30 s by using either the strong-motion (KiK-net) or the GPS (GEONET) real-time data. This implies that the new real-time source imaging technique is helpful to reduce false and missing warnings, and therefore should play an important role in future tsunami early warning and earthquake rapid response systems.

  9. High-resolution earthquake relocation in the Fort Worth and Permian Basins using regional seismic stations

    NASA Astrophysics Data System (ADS)

    Ogwari, P.; DeShon, H. R.; Hornbach, M.

    2017-12-01

    Post-2008 earthquake rate increases in the central United States have been associated with large-scale subsurface disposal of waste fluids from oil and gas operations. Various earthquake sequences in the Fort Worth and Permian basins began in the absence of seismic stations at local distances to record and accurately locate hypocenters. Most typically, the initial earthquakes have been located using regional seismic network stations (>100 km epicentral distance) and global 1D velocity models, which usually results in large location uncertainty, especially in depth, does not resolve magnitude <2.5 events, and does not constrain the geometry of the activated fault(s). Here, we present a method to better resolve earthquake occurrence and location using matched filters and regional relative location when local data become available. We use the local-distance data for high-resolution earthquake location, identifying earthquake templates and accurate source-station raypath velocities for the Pg and Lg phases at regional stations. A matched-filter analysis is then applied to seismograms recorded at US network stations and at adopted TA stations that recorded the earthquakes before and during the local network deployment period. Positive detections are declared based on manual review of the associated P and S arrivals on local stations. We apply hierarchical clustering to distinguish earthquakes that are spatially clustered from those that are spatially separated. Finally, we conduct relative earthquake and earthquake-cluster location using regional-station differential times. Initial analysis applied to the 2008-2009 DFW airport sequence in north Texas results in time-continuous imaging of epicenters extending into 2014. Seventeen earthquakes in the USGS earthquake catalog scattered across a 10 km2 area near DFW airport are relocated onto a single fault using these approaches. These techniques will also be applied toward imaging recent earthquakes in the
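The matched-filter step described above can be sketched with a known event's waveform as a template, flagging windows of a continuous trace whose normalized cross-correlation exceeds a threshold. The synthetic data and all parameter values below are illustrative assumptions, not the authors' processing parameters:

```python
import numpy as np

def matched_filter(trace, template, threshold=0.7):
    """Slide the template along the trace; return sample offsets where the
    normalized cross-correlation coefficient exceeds the threshold."""
    n = len(template)
    t = (template - template.mean()) / template.std()
    hits = []
    for i in range(len(trace) - n + 1):
        w = trace[i:i + n]
        if w.std() == 0:
            continue
        cc = np.dot(t, (w - w.mean()) / w.std()) / n
        if cc > threshold:
            hits.append(i)
    return hits

rng = np.random.default_rng(42)

# Synthetic tapered wavelet standing in for a template event, buried twice
# in noise -- the second time at half amplitude, mimicking a smaller
# repeating earthquake on the same fault patch.
wavelet = np.sin(2 * np.pi * 5 * np.linspace(0, 1, 100)) * np.hanning(100)
trace = 0.1 * rng.standard_normal(2000)
trace[300:400] += wavelet
trace[1500:1600] += 0.5 * wavelet

detections = matched_filter(trace, wavelet)
```

Because the correlation coefficient is amplitude-normalized, the half-amplitude repeat is detected as readily as the original, which is what lets matched filters recover events below the threshold of standard catalogs.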

  10. How social networks influence female students' choices to major in engineering

    NASA Astrophysics Data System (ADS)

    Weinland, Kathryn Ann

    Scope and Method of Study: This study examined how social influence plays a part in female students' choice of college major, specifically engineering instead of science, technology, or math. Social influence may show itself through peers, family members, and teachers, and may encompass resources under the umbrella of social capital. The purpose of this study was to examine how female students' social networks, through the lens of social capital, influence their choice of whether or not to study engineering. The variables of peer influence, parental influence, teacher/counselor influence, perception of engineering, and academic background were addressed in a 52-question Likert-scale survey. This survey was modified from an instrument previously used by Reyer (2007) at Bradley University. Data collection was completed using the Dillman (2009) tailored design model. Responses were grouped into four main scales of the dependent variables: social influence, encouragement, perceptions of engineering, and career motivation. A factor analysis was completed on the four factors as a whole; individual questions were not analyzed. Findings and Conclusions: This study addressed the differences in social network support for female freshmen majoring in engineering versus female freshmen majoring in science, technology, or math. Social network support, when working together from all angles of peers, parents, and teachers/counselors, transforms itself into a new force that is more powerful than the summation of the individual parts. Math and science preparation also contributed to female freshmen choosing to major in engineering instead of science, technology, or math. The STEM pipeline is still weak, and ways to reinforce it should be examined. Social network support is crucial for female freshmen who are majoring in science, technology, engineering, and math.

  11. Engineering-geological model of the landslide of Güevejar (S Spain) reactivated by historical earthquakes

    NASA Astrophysics Data System (ADS)

    Delgado, José; García-Tortosa, Francisco J.; Garrido, Jesús; Giner, José; Lenti, Luca; López-Casado, Carlos; Martino, Salvatore; Peláez, José A.; Sanz de Galdeano, Carlos; Soler, Juan L.

    2015-04-01

    Landslides are a common ground effect induced by earthquakes of moderate to large magnitude. Most of them correspond to first-time instabilities induced by the seismic event; the reactivation of pre-existing landslides is less frequent in practice. The landslide of Güevejar (Granada province, S Spain) represents a case study of a landslide that was reactivated at least twice by far-field earthquakes: the Mw 8.7, 1755 Lisbon earthquake (estimated epicentral distance of 680 km) and the Mw 6.5, 1884 Andalucia event (estimated epicentral distance of 45 km), but not by near-field events of moderate magnitude (Mw < 6.0 and epicentral distances lower than 25 km). To study the seismic response of this landslide, work has been conducted to elaborate an engineering-geological model. For this purpose, the field work done included the elaboration of a detailed geological map (1:1000) of the landslide and surrounding areas, drilling of deep boreholes (80 m deep), down-hole measurement of both P- and S-wave velocities in the boreholes drilled, piezometric control of the water table, MASW and ReMi profiles for determining the underlying structure of the sites tested (soil profile stratigraphy and the corresponding S-wave velocity of each soil level), and undisturbed sampling of the materials affected by the landslide. These samples were then tested in the laboratory according to standard procedures for determination of both the static properties (among them soil density, soil classification, and shear strength) and the dynamic properties (degradation curves for shear modulus and damping ratio with shear strain) of the landslide-involved materials. The proposed model corresponds to a complex landslide that combines a roto-translational mechanism with an earth flow at its toe, characterized by a deep (> 50 m) sliding surface. The engineering-geological model constitutes the first step in ongoing research devoted to understanding how the landslide could be reactivated during far-field events. The

  12. Operability engineering in the Deep Space Network

    NASA Technical Reports Server (NTRS)

    Wilkinson, Belinda

    1993-01-01

    Many operability problems exist at the three Deep Space Communications Complexes (DSCCs) of the Deep Space Network (DSN). Four years ago, the position of DSN Operability Engineer was created to provide the opportunity for someone to take a system-level approach to solving these problems. Since that time, a process has been developed for operations personnel and development engineers, and for enforcing user interface standards in software designed for the DSCCs. Plans are for the participation of operations personnel in the product life cycle to expand in the future.

  13. GPS Monitoring of Surface Change During and Following the Fortuitous Occurrence of the Mw = 7.3 Landers Earthquake in our Network

    NASA Technical Reports Server (NTRS)

    Miller, M. Meghan

    1998-01-01

    Accomplishments: (1) Continued GPS monitoring of surface change during and following the fortuitous occurrence of the Mw = 7.3 Landers earthquake in our network, in order to characterize earthquake dynamics and accelerated activity of related faults as far as hundreds of kilometers along strike. (2) Integrated the geodetic constraints into consistent kinematic descriptions of the deformation field that can in turn be used to characterize the processes that drive geodynamics, including seismic cycle dynamics. In 1991, we installed and occupied a high-precision GPS geodetic network to measure transform-related deformation that is partitioned from the Pacific - North America plate boundary northeastward through the Mojave Desert, via the Eastern California shear zone, to the Walker Lane. The onset of the Mw = 7.3 June 28, 1992 Landers, California earthquake sequence within this network posed unique opportunities for continued monitoring of regional surface deformation related to the culmination of a major seismic cycle, characterization of the dynamic behavior of continental lithosphere during the seismic sequence, and post-seismic transient deformation. During the last year, we have reprocessed all three previous epochs for which JPL fiducial-free point positioning products were available and are queued for the remaining needed products, completed two field campaigns monitoring approx. 20 sites (October 1995 and September 1996), begun modeling by development of a finite element mesh based on network station locations, and developed manuscripts dealing with both the Landers-related transient deformation at the latitude of Lone Pine and the velocity field of the whole experiment. We are currently deploying a 1997 observation campaign (June 1997). We use GPS geodetic studies to characterize deformation in the Mojave Desert region and related structural domains to the north, and geophysical modeling of lithospheric behavior. The modeling is constrained by our

  14. Along-strike Variations in the Himalayas Illuminated by the Aftershock Sequence of the 2015 Mw 7.8 Gorkha Earthquake Using the NAMASTE Local Seismic Network

    NASA Astrophysics Data System (ADS)

    Mendoza, M.; Ghosh, A.; Karplus, M. S.; Nabelek, J.; Sapkota, S. N.; Adhikari, L. B.; Klemperer, S. L.; Velasco, A. A.

    2016-12-01

    As a result of the 2015 Mw 7.8 Gorkha earthquake, more than 8,000 people were killed by a combination of infrastructure failure and triggered landslides. This earthquake produced 4 m of peak co-seismic slip as the fault ruptured 130 km east under densely populated cities such as Kathmandu. To understand earthquake dynamics in this part of the Himalayas and help mitigate similar future calamities, it is imperative to study earthquake activity in detail and improve our understanding of the source and structural complexities. In response to the Gorkha event, multiple institutions developed and deployed a 10-month dense seismic network called NAMASTE. It blanketed a 27,650 km2 area, mainly covering the rupture area of the Gorkha earthquake, in order to capture the dynamic sequence of aftershock behavior. The network consisted of a mix of 45 broadband, short-period, and strong-motion sensors, with an average spacing of 20 km. From the first 6 months of data, starting approximately 1.5 months after the mainshock, we develop a robust catalog containing over 3,000 precise earthquake locations, with local magnitudes ranging between 0.3 and 4.9. The catalog has a magnitude of completeness of 1.5 and an overall low b-value of 0.78. Using the HypoDD algorithm, we relocate earthquake hypocenters with high precision and thus illustrate the fault geometry down to depths of 25 km, where we infer the location of the gently dipping Main Frontal Thrust (MFT). Above the MFT, the aftershocks illuminate complex structure produced by relatively steeply dipping faults. Interestingly, we observe a sharp along-strike change in the seismicity pattern: the eastern part of the aftershock area is significantly more active than the western part. The change in seismicity may reflect structural and/or frictional lateral heterogeneity in this part of the Himalayan fault system. Such along-strike variations play an important role in rupture complexities and
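The magnitude of completeness and b-value quoted above are standard catalog statistics. A sketch of the maximum-likelihood b-value estimator of Aki (1965), checked against a synthetic Gutenberg-Richter catalog (the seed, sample size, and parameter values are illustrative, not the NAMASTE data):

```python
import math
import random

def b_value(mags, mc, dm=0.0):
    """Aki (1965) maximum-likelihood b-value for events at or above the
    completeness magnitude mc; dm is the catalog's magnitude bin width
    (0 for continuous magnitudes)."""
    m = [x for x in mags if x >= mc]
    mean_m = sum(m) / len(m)
    return math.log10(math.e) / (mean_m - (mc - dm / 2.0))

# Synthetic catalog obeying Gutenberg-Richter with a true b-value of 1.0:
# magnitudes above mc are exponentially distributed with rate b * ln(10).
random.seed(1)
true_b, mc = 1.0, 1.5
mags = [mc + random.expovariate(true_b * math.log(10)) for _ in range(20000)]

b_est = b_value(mags, mc)
```

A b-value near 0.78, as reported for the Gorkha aftershocks, implies relatively more large events than the commonly observed b ≈ 1.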

  15. Differential energy radiation from two earthquakes in Japan with identical Mw: The Kyushu 1996 and Tottori 2000 earthquakes

    USGS Publications Warehouse

    Choy, G.L.; Boatwright, J.

    2009-01-01

    We examine two closely located earthquakes in Japan that had identical moment magnitudes Mw but significantly different energy magnitudes Me. We use teleseismic data from the Global Seismograph Network and strong-motion data from the National Research Institute for Earth Science and Disaster Prevention's K-Net to analyze the 19 October 1996 Kyushu earthquake (Mw 6.7, Me 6.6) and the 6 October 2000 Tottori earthquake (Mw 6.7, Me 7.4). To obtain regional estimates of radiated energy ES, we apply a spectral technique to regional (<200 km) waveforms that are dominated by S and Lg waves. For the thrust-fault Kyushu earthquake, we estimate an average regional attenuation Q(f) = 230f^0.65. For the strike-slip Tottori earthquake, the average regional attenuation is Q(f) = 180f^0.6. These attenuation functions are similar to those derived from studies of both California and Japan earthquakes. The regional estimate of ES for the Kyushu earthquake, 3.8 × 10^14 J, is significantly smaller than that for the Tottori earthquake, ES = 1.3 × 10^15 J. These estimates correspond well with the teleseismic estimates of 3.9 × 10^14 J and 1.8 × 10^15 J, respectively. The apparent stress (τa = μES/M0, with μ equal to rigidity) for the Kyushu earthquake is 4 times smaller than the apparent stress for the Tottori earthquake. In terms of the fault maturity model, the significantly greater release of energy by the strike-slip Tottori earthquake can be related to strong deformation in an immature intraplate setting. The relatively lower energy release of the thrust-fault Kyushu earthquake can be related to rupture on mature faults in a subduction environment. The consistency between teleseismic and regional estimates of ES is particularly significant, as teleseismic data for computing ES are routinely available for all large earthquakes, whereas often there are no near-field data.
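The apparent-stress comparison can be reproduced from the abstract's numbers. The rigidity value and the Hanks-Kanamori moment-magnitude relation below are standard assumptions, not values taken from the paper:

```python
MU = 3.0e10  # assumed crustal rigidity, Pa (typical value; not from the paper)

def moment_from_mw(mw: float) -> float:
    """Seismic moment M0 in N*m from Mw (Hanks & Kanamori, 1979)."""
    return 10 ** (1.5 * mw + 9.1)

def apparent_stress(es: float, m0: float, mu: float = MU) -> float:
    """Apparent stress tau_a = mu * ES / M0, in Pa."""
    return mu * es / m0

m0 = moment_from_mw(6.7)                    # both events had Mw 6.7
tau_kyushu = apparent_stress(3.8e14, m0)    # regional ES estimate, Kyushu
tau_tottori = apparent_stress(1.3e15, m0)   # regional ES estimate, Tottori
```

With the regional ES values the Tottori/Kyushu ratio is about 3.4; with the teleseismic estimates (3.9 × 10^14 J vs 1.8 × 10^15 J) it is about 4.6, bracketing the factor of 4 quoted above.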

  16. Earthquake Early Warning and Public Policy: Opportunities and Challenges

    NASA Astrophysics Data System (ADS)

    Goltz, J. D.; Bourque, L.; Tierney, K.; Riopelle, D.; Shoaf, K.; Seligson, H.; Flores, P.

    2003-12-01

    Development of an earthquake early warning capability and pilot project were objectives of TriNet, a 5-year (1997-2001) FEMA-funded project to develop a state-of-the-art digital seismic network in southern California. In parallel with research to assemble a protocol for rapid analysis of earthquake data and transmission of a signal by TriNet scientists and engineers, the public policy, communication and educational issues inherent in implementation of an earthquake early warning system were addressed by TriNet's outreach component. These studies included: 1) a survey that identified potential users of an earthquake early warning system and how an earthquake early warning might be used in responding to an event, 2) a review of warning systems and communication issues associated with other natural hazards and how lessons learned might be applied to an alerting system for earthquakes, 3) an analysis of organization, management and public policy issues that must be addressed if a broad-based warning system is to be developed and 4) a plan to provide earthquake early warnings to a small number of organizations in southern California as an experimental prototype. These studies provided needed insights into the social and cultural environment in which this new technology will be introduced, an environment with opportunities to enhance our response capabilities but also an environment with significant barriers to overcome to achieve a system that can be sustained and supported. In this presentation we will address the main public policy issues that were subjects of analysis in these studies. They include a discussion of the possible division of functions among organizations likely to be the principal partners in the management of an earthquake early warning system. Drawing on lessons learned from warning systems for other hazards, we will review the potential impacts of false alarms and missed events on warning system credibility, the acceptability of fully automated

  17. Earthquake impact scale

    USGS Publications Warehouse

    Wald, D.J.; Jaiswal, K.S.; Marano, K.D.; Bausch, D.

    2011-01-01

    also be both specific (although allowably uncertain) and actionable. In this analysis, an attempt is made at both simple and intuitive color-coded alerting criteria; yet the necessary uncertainty measures by which one can gauge the likelihood for the alert to be over- or underestimated are preserved. The essence of the proposed impact scale and alerting is that actionable loss information is now available in the immediate aftermath of significant earthquakes worldwide on the basis of quantifiable loss estimates. Utilizing EIS, PAGER's rapid loss estimates can adequately recommend alert levels and suggest appropriate response protocols, despite the uncertainties; demanding or awaiting observations or loss estimates with a high level of accuracy may increase the losses. © 2011 American Society of Civil Engineers.

  18. Estimating Casualties for Large Earthquakes Worldwide Using an Empirical Approach

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David J.; Hearne, Mike

    2009-01-01

    We developed an empirical country- and region-specific earthquake vulnerability model to be used as a candidate for post-earthquake fatality estimation by the U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER) system. The earthquake fatality rate is based on past fatal earthquakes (earthquakes causing one or more deaths) in individual countries where at least four fatal earthquakes occurred during the catalog period (since 1973). Because only a few dozen countries have experienced four or more fatal earthquakes since 1973, we propose a new global regionalization scheme based on idealization of countries that are expected to have similar susceptibility to future earthquake losses given the existing building stock, its vulnerability, and other socioeconomic characteristics. The fatality estimates obtained using an empirical country- or region-specific model will be used along with other selected engineering risk-based loss models for generation of automated earthquake alerts. These alerts could potentially benefit the rapid-earthquake-response agencies and governments for better response to reduce earthquake fatalities. Fatality estimates are also useful to stimulate earthquake preparedness planning and disaster mitigation. The proposed model has several advantages as compared with other candidate methods, and the country- or region-specific fatality rates can be readily updated when new data become available.

  19. Southern California Earthquake Center (SCEC) Communication, Education and Outreach Program

    NASA Astrophysics Data System (ADS)

    Benthien, M. L.

    2003-12-01

    The SCEC Communication, Education, and Outreach Program (CEO) offers student research experiences, web-based education tools, classroom curricula, museum displays, public information brochures, online newsletters, and technical workshops and publications. This year, much progress has been made on the development of the Electronic Encyclopedia of Earthquakes (E3), a collaborative project with CUREE and IRIS. The E3 development system is now fully operational, and 165 entries are in the pipeline. When complete, information and resources for over 500 Earth science and engineering topics will be included, with connections to curricular materials useful for teaching Earth science, engineering, physics, and mathematics. To coordinate activities for the 10-year anniversary of the Northridge Earthquake in 2004 (and beyond), the "Earthquake Country Alliance" is being organized by SCEC CEO to present common messages, to share or promote existing resources, and to develop new activities and products jointly (such as a new version of Putting Down Roots in Earthquake Country). The group includes earthquake science and engineering researchers and practicing professionals, preparedness experts, response and recovery officials, news media representatives, and education specialists. A web portal, http://www.earthquakecountry.info, is being developed, with links to web pages and descriptions of other resources and services that the Alliance members provide. Another ongoing strength of SCEC is the Summer Intern program, which now has a year-round counterpart with students working on IT projects at USC. Since Fall 2002, over 32 students have participated in the program, including 7 students working with scientists throughout SCEC, 17 students involved in the USC "Earthquake Information Technology" intern program, and 7 students involved in CEO projects.
These and other activities of the SCEC CEO program will be presented, along with lessons learned during program design and

  20. A Hybrid Neural Network-Genetic Algorithm Technique for Aircraft Engine Performance Diagnostics

    NASA Technical Reports Server (NTRS)

    Kobayashi, Takahisa; Simon, Donald L.

    2001-01-01

    In this paper, a model-based diagnostic method, which utilizes Neural Networks and Genetic Algorithms, is investigated. Neural networks are applied to estimate the engine internal health, and Genetic Algorithms are applied for sensor bias detection and estimation. This hybrid approach takes advantage of the nonlinear estimation capability provided by neural networks while improving the robustness to measurement uncertainty through the application of Genetic Algorithms. The hybrid diagnostic technique also has the ability to rank multiple potential solutions for a given set of anomalous sensor measurements in order to reduce false alarms and missed detections. The performance of the hybrid diagnostic technique is evaluated through some case studies derived from a turbofan engine simulation. The results show this approach is promising for reliable diagnostics of aircraft engines.
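A toy sketch of the bias-estimation half of such a hybrid: a genetic algorithm searches for the sensor-bias vector that best reconciles measured readings with a model's predictions. Here the "model" is a hand-written linear stand-in for the neural-network health estimator, and all values and names are illustrative assumptions, not the paper's engine simulation:

```python
import random

random.seed(3)

# Hand-written linear stand-in for the neural-network engine model that
# maps a health parameter to three sensor readings (illustrative only).
def model(health: float) -> list[float]:
    return [2.0 * health, 0.5 * health + 1.0, -1.0 * health]

TRUE_HEALTH = 0.8
TRUE_BIAS = [0.0, 0.3, 0.0]  # sensor 2 carries an undetected bias
measured = [m + b for m, b in zip(model(TRUE_HEALTH), TRUE_BIAS)]

def fitness(bias: list[float]) -> float:
    """Negative squared residual after removing the candidate bias."""
    corrected = [m - b for m, b in zip(measured, bias)]
    return -sum((c - p) ** 2 for c, p in zip(corrected, model(TRUE_HEALTH)))

def ga(pop_size: int = 40, gens: int = 200) -> list[float]:
    """Elitist genetic search over 3-component bias vectors."""
    pop = [[random.uniform(-1, 1) for _ in range(3)] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness, reverse=True)
        elite = pop[: pop_size // 2]  # keep the best half unchanged
        pop = elite + [[g + random.gauss(0, 0.05) for g in random.choice(elite)]
                       for _ in elite]
    return max(pop, key=fitness)

estimated_bias = ga()
```

Because the GA only needs fitness values, not gradients, it tolerates the measurement noise and model nonlinearity that motivate the hybrid design in the paper.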

  1. Collaborative-Large scale Engineering Assessment Networks for Environmental Research: The Overview

    NASA Astrophysics Data System (ADS)

    Moo-Young, H.

    2004-05-01

    A networked infrastructure for engineering solutions and policy alternatives is necessary to assess, manage, and protect complex, anthropogenically stressed environmental resources effectively. Reductionist and discrete disciplinary methodologies are no longer adequate to evaluate and model complex environmental systems and anthropogenic stresses. While the reductionist approach provides important information regarding individual mechanisms, it cannot provide complete information about how multiple processes are related. Therefore, it is not possible to make accurate predictions about system responses to engineering interventions and the effectiveness of policy options. For example, experts cannot agree on best management strategies for contaminated sediments in riverine and estuarine systems. This is due, in part, to the fact that existing models do not accurately capture integrated system dynamics. In addition, infrastructure is not available for investigators to exchange and archive data, to collaborate on new investigative methods, and to synthesize these results to develop engineering solutions and policy alternatives. Our vision for the future is to create a network comprising field facilities and a collaboration of engineers, scientists, policy makers, and community groups. This will allow integration across disciplines, across different temporal and spatial scales, surface and subsurface geographies, and air sheds and watersheds. Benefits include fast response to changes in system health, real-time decision making, and continuous data collection that can be used to anticipate future problems and to develop sound engineering solutions and management decisions. CLEANER encompasses four general aspects: 1) a Network of environmental field facilities instrumented for the acquisition and analysis of environmental data; 2) a Virtual Repository of Data and information technology for engineering modeling, analysis, and visualization of data, i.e., an environmental

  2. Scientific, Engineering, and Financial Factors of the 1989 Human-Triggered Newcastle Earthquake in Australia

    NASA Astrophysics Data System (ADS)

    Klose, C. D.

    2006-12-01

    This presentation emphasizes the dualism of natural resources exploitation and economic growth versus geomechanical pollution and risks of human-triggered earthquakes. Large-scale geoengineering activities, e.g., mining, reservoir impoundment, oil/gas production, water exploitation or fluid injection, alter pre-existing lithostatic stress states in the earth's crust and are anticipated to trigger earthquakes. Such processes of in-situ stress alteration are termed geomechanical pollution. Moreover, since the 19th century, more than 200 earthquakes with a seismic moment magnitude of 4.5 or larger have been documented worldwide, and the number of human-triggered earthquakes has increased rapidly. An example of a human-triggered earthquake is the 1989 Newcastle event in Australia, which was a result of almost 200 years of coal mining and water over-exploitation. This earthquake, an Mw=5.6 event, caused more than 3.5 billion U.S. dollars in damage (1989 value) and was responsible for Australia's first and, to date, only earthquake fatalities. It is therefore thought that the Newcastle region tends to develop unsustainably when economic growth due to mining is compared with the financial losses from triggered earthquakes. A hazard assessment, based on a geomechanical crust model, shows that only four deep coal mines were responsible for triggering this severe earthquake. A small-scale economic risk assessment identifies that the financial loss due to earthquake damage has reduced the mining profits that have been re-invested in the Newcastle region for over two centuries beginning in 1801. Furthermore, a large-scale economic risk assessment reveals that the financial loss is equivalent to 26% of the Australian Gross Domestic Product (GDP) growth in 1988/89. These costs account for 13% of the total costs of all natural disasters (e.g., flooding, drought, wild fires) and 94% of the costs of all

  3. Parameter estimation in spiking neural networks: a reverse-engineering approach.

    PubMed

    Rostro-Gonzalez, H; Cessac, B; Vieville, T

    2012-04-01

    This paper presents a reverse engineering approach for parameter estimation in spiking neural networks (SNNs). We consider the deterministic evolution of a time-discretized network of spiking neurons with delayed synaptic transmission, modeled as a neural network of the generalized integrate-and-fire type. Our approach aims at bypassing the fact that parameter estimation in SNNs becomes a non-deterministic polynomial-time (NP)-hard problem when delays are considered. Here, the problem is reformulated as a linear programming (LP) problem so that it can be solved in polynomial time. Moreover, the LP formulation makes explicit that the reverse engineering of a neural network can be performed from the observation of the spike times. Furthermore, we point out how the LP adjustment mechanism is local to each neuron and has the same structure as a 'Hebbian' rule. Finally, we present a generalization of this approach to the design of input-output (I/O) transformations as a practical method to 'program' a spiking network, i.e. find a set of parameters allowing us to exactly reproduce the network output, given an input. Numerical verifications and illustrations are provided.

  4. Reverse engineering and analysis of large genome-scale gene networks

    PubMed Central

    Aluru, Maneesha; Zola, Jaroslaw; Nettleton, Dan; Aluru, Srinivas

    2013-01-01

    Reverse engineering the whole-genome networks of complex multicellular organisms remains a challenge. While simpler models easily scale to large numbers of genes and gene expression datasets, more accurate models are compute intensive, limiting their scale of applicability. To enable fast and accurate reconstruction of large networks, we developed Tool for Inferring Network of Genes (TINGe), a parallel mutual information (MI)-based program. The novel features of our approach include: (i) a B-spline-based formulation for linear-time computation of MI, (ii) a novel algorithm for direct permutation testing and (iii) development of parallel algorithms to reduce run-time and facilitate construction of large networks. We assess the quality of our method by comparison with ARACNe (Algorithm for the Reconstruction of Accurate Cellular Networks) and GeneNet and demonstrate its unique capability by reverse engineering the whole-genome network of Arabidopsis thaliana from 3137 Affymetrix ATH1 GeneChips in just 9 min on a 1024-core cluster. We further report on the development of a new software Gene Network Analyzer (GeNA) for extracting context-specific subnetworks from a given set of seed genes. Using TINGe and GeNA, we performed analysis of 241 Arabidopsis AraCyc 8.0 pathways, and the results are made available through the web. PMID:23042249
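
    As a hedged sketch of MI-based network inference (TINGe itself uses B-spline MI estimation and permutation testing; plain histogram binning and a fixed threshold stand in for both here, and the three "genes" are synthetic):

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Histogram estimate of MI (in nats) between two expression profiles."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

rng = np.random.default_rng(1)
n_samples = 500
g1 = rng.normal(size=n_samples)
g2 = g1 + 0.3 * rng.normal(size=n_samples)   # co-expressed with g1
g3 = rng.normal(size=n_samples)              # unrelated gene
expr = np.stack([g1, g2, g3])

# Thresholded pairwise-MI adjacency; in TINGe the threshold comes from
# permutation testing rather than a fixed constant.
n_genes = expr.shape[0]
mi = np.zeros((n_genes, n_genes))
for i in range(n_genes):
    for j in range(i + 1, n_genes):
        mi[i, j] = mi[j, i] = mutual_information(expr[i], expr[j])
adj = mi > 0.2
```

The co-expressed pair ends up connected while the unrelated gene stays isolated; parallelism in TINGe comes from distributing exactly this pairwise computation across cores.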

  5. Connecting slow earthquakes to huge earthquakes.

    PubMed

    Obara, Kazushige; Kato, Aitaro

    2016-07-15

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes. Copyright © 2016, American Association for the Advancement of Science.

  6. Fiber-Optic Network Observations of Earthquake Wavefields

    NASA Astrophysics Data System (ADS)

    Lindsey, Nathaniel J.; Martin, Eileen R.; Dreger, Douglas S.; Freifeld, Barry; Cole, Stephen; James, Stephanie R.; Biondi, Biondo L.; Ajo-Franklin, Jonathan B.

    2017-12-01

    Our understanding of subsurface processes suffers from a profound observation bias: seismometers are sparse and clustered on continents. A new seismic recording approach, distributed acoustic sensing (DAS), transforms telecommunication fiber-optic cables into sensor arrays enabling meter-scale recording over tens of kilometers of linear fiber length. We analyze cataloged earthquake observations from three DAS arrays with different horizontal geometries to demonstrate some possibilities using this technology. In Fairbanks, Alaska, we find that stacking ground motion records along 20 m of fiber yields a waveform that shows a high degree of correlation in amplitude and phase with a colocated inertial seismometer record at 0.8-1.6 Hz. Using an L-shaped DAS array in Northern California, we record the nearly vertically incident arrival of an earthquake from The Geysers Geothermal Field and estimate its backazimuth and slowness via beamforming for different phases of the seismic wavefield. Lastly, we install a fiber in existing telecommunications conduits below Stanford University and show that little cable-to-soil coupling is required for teleseismic P and S phase arrival detection.
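
    The gain from stacking ground motion records along neighboring DAS channels can be sketched with synthetic data (channel count, noise level, and wavelet below are illustrative assumptions): coherent signal adds while incoherent noise averages down roughly as the square root of the number of channels.

```python
import numpy as np

rng = np.random.default_rng(3)
fs, n_ch = 100.0, 20                    # sample rate (Hz); channels along ~20 m
t = np.arange(0, 10, 1 / fs)
# Toy earthquake wavelet common to all channels (plane wave, no moveout).
signal = np.sin(2 * np.pi * 1.2 * t) * np.exp(-((t - 5.0) ** 2))
records = signal + rng.normal(0.0, 2.0, size=(n_ch, t.size))

stack = records.mean(axis=0)            # stacking across channels
corr_single = float(np.corrcoef(records[0], signal)[0, 1])
corr_stack = float(np.corrcoef(stack, signal)[0, 1])
```

The stacked trace correlates far better with the underlying wavelet than any single noisy channel, which is the effect exploited when comparing fiber stacks with a colocated inertial seismometer.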

  7. Discrimination of nuclear explosions and earthquakes from teleseismic distances with a local network of short period seismic stations using artificial neural networks

    NASA Astrophysics Data System (ADS)

    Tiira, Timo

    1996-10-01

    Seismic discrimination capability of artificial neural networks (ANNs) was studied using earthquakes and nuclear explosions from teleseismic distances. The events were selected from two areas, which were analyzed separately. First, 23 nuclear explosions from the Semipalatinsk and Lop Nor test sites were compared with 46 earthquakes from adjacent areas. Second, 39 explosions from the Nevada test site were compared with 27 earthquakes from close-by areas. The basic discriminants were complexity, spectral ratio and third moment of frequency. The spectral discriminants were computed in five different ways to obtain all the information embedded in the signals, some of which were relatively weak. The discriminants were computed using data from six short period stations in central and southern Finland. The spectral contents of the signals of both classes varied considerably between the stations. The 66 discriminants were formed into 65 optimum subsets of different sizes by using stepwise linear regression. A type of ANN called the multilayer perceptron (MLP) was applied to each of the subsets. As a comparison, the classification was repeated using linear discriminant analysis (LDA). Since the number of events was small, testing was done with the leave-one-out method. The ANN gave significantly better results than LDA. As a final tool for discrimination, a combination of the ten neural nets with the best performance was used. All events from Central Asia were clearly discriminated and over 90% of the events from the Nevada region were confidently discriminated. The better performance of ANNs was attributed to their ability to form complex decision regions between the groups and to their highly non-linear nature.
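
    The leave-one-out MLP protocol can be sketched as follows, with synthetic stand-ins for the paper's discriminants (the feature values, class separation, and network size below are illustrative assumptions, not the study's data):

```python
import numpy as np
from sklearn.model_selection import LeaveOneOut
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
# Synthetic stand-ins for three discriminants (complexity, spectral ratio,
# third moment of frequency); the class separation here is artificial.
explosions = rng.normal(loc=[3.0, 3.0, 3.0], scale=1.0, size=(15, 3))
earthquakes = rng.normal(loc=[0.0, 0.0, 0.0], scale=1.0, size=(15, 3))
X = np.vstack([explosions, earthquakes])
y = np.array([1] * 15 + [0] * 15)

# Leave-one-out testing: train an MLP on all events but one, classify the
# held-out event, and repeat for every event in the catalog.
correct = 0
for train_idx, test_idx in LeaveOneOut().split(X):
    clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
    clf.fit(X[train_idx], y[train_idx])
    correct += int(clf.predict(X[test_idx])[0] == y[test_idx][0])
accuracy = correct / len(y)
```

Leave-one-out is the natural choice when, as here, there are only a few dozen labeled events: every event serves as test data exactly once while nearly the full catalog is used for training.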

  8. Did you feel it? Community-made earthquake shaking maps

    USGS Publications Warehouse

    Wald, D.J.; Wald, L.A.; Dewey, J.W.; Quitoriano, Vince; Adams, Elisabeth

    2001-01-01

    Since the early 1990s, the magnitude and location of an earthquake have been available within minutes on the Internet. Now, as a result of work by the U.S. Geological Survey (USGS) and with the cooperation of various regional seismic networks, people who experience an earthquake can go online and share information about its effects to help create a map of shaking intensities and damage. Such 'Community Internet Intensity Maps' (CIIMs) contribute greatly to quickly assessing the scope of an earthquake emergency, even in areas lacking seismic instruments.

  9. On the Diurnal Periodicity of Representative Earthquakes in Greece: Comparison of Data from Different Observation Systems

    NASA Astrophysics Data System (ADS)

    Desherevskii, A. V.; Sidorin, A. Ya.

    2017-12-01

    With the initiation of the Hellenic Unified Seismic Network (HUSN) in late 2007, the quality of observations had significantly improved by 2011. For example, the representative magnitude level had decreased considerably and the number of annually recorded events had increased. The new observational system greatly expanded the possibilities for studying regularities in seismicity. In view of this, the authors revisited their studies of the diurnal periodicity of representative earthquakes in Greece that was revealed earlier in the earthquake catalog before 2011. We use 18 samples of earthquakes of different magnitudes taken from the catalog of Greek earthquakes from 2011 to June 2016, derive a series of the number of earthquakes for each of them, and calculate its average diurnal course. To increase the reliability of the results, we compared the data for two regions. With a high degree of statistical significance, we find that no diurnal periodicity can be detected for strongly representative earthquakes. This finding differs from the estimates obtained earlier from an analysis of the catalog of earthquakes in the same area for 1995-2004 and 2005-2010, i.e., before the initiation of the Hellenic Unified Seismic Network. The new results are consistent with the hypothesis of noise discrimination (observational selection), which explains the diurnal variation of earthquakes by the different sensitivity of the seismic network in daytime and nighttime periods.
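
    The "average diurnal course" of such a series amounts to an hour-of-day histogram of event times; a minimal sketch with a synthetic catalog (Poissonian in time, so with no built-in diurnal modulation, and with assumed rate and duration):

```python
import numpy as np

rng = np.random.default_rng(7)
# Synthetic catalog: ~50,000 event epoch times (s), roughly one per hour
# over several years, with no preferred time of day.
t = np.cumsum(rng.exponential(3600.0, size=50000))
hour = (t // 3600).astype(int) % 24          # hour of day of each event
counts = np.bincount(hour, minlength=24)     # average diurnal course

# Chi-square statistic against the flat profile expected when occurrence
# is independent of the time of day (23 degrees of freedom).
expected = counts.sum() / 24.0
chi2 = float(((counts - expected) ** 2 / expected).sum())
```

For a catalog free of diurnal modulation the statistic stays near its null expectation of 23; a network whose detection threshold differs between day and night would inflate it even though the underlying seismicity is uniform, which is the observational-selection effect discussed above.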

  10. Defeating Earthquakes

    NASA Astrophysics Data System (ADS)

    Stein, R. S.

    2012-12-01

    The 2004 M=9.2 Sumatra earthquake claimed what seemed an unfathomable 228,000 lives, although because of its size, we could at least assure ourselves that it was an extremely rare event. But in the short space of 8 years, the Sumatra quake no longer looks like an anomaly, and it is no longer even the worst disaster of the century: 80,000 deaths in the 2005 M=7.6 Pakistan quake; 88,000 deaths in the 2008 M=7.9 Wenchuan, China quake; 316,000 deaths in the 2010 M=7.0 Haiti quake. In each case, poor design and construction were unable to withstand the ferocity of the shaken earth. And this was compounded by inadequate rescue, medical care, and shelter. How could the toll continue to mount despite the advances in our understanding of quake risk? The world's population is flowing into megacities, and many of these migration magnets lie astride the plate boundaries. Caught between these opposing demographic and seismic forces are 50 cities of at least 3 million people threatened by large earthquakes, the targets of chance. What we know for certain is that no one will take protective measures unless they are convinced they are at risk. Furnishing that knowledge is the animating principle of the Global Earthquake Model, launched in 2009. At the very least, everyone should be able to learn what his or her risk is. At the very least, our community owes the world an estimate of that risk. So, first and foremost, GEM seeks to raise quake risk awareness. We have no illusions that maps or models raise awareness; instead, earthquakes do. But when a quake strikes, people need a credible place to go to answer the question, how vulnerable am I, and what can I do about it? The Global Earthquake Model is being built with GEM's new open source engine, OpenQuake. 
GEM is also assembling the global data sets without which we will never improve our understanding of where, how large, and how frequently earthquakes will strike, what impacts they will have, and how those impacts can be lessened by

  11. The California Post-Earthquake Information Clearinghouse: A Plan to Learn From the Next Large California Earthquake

    NASA Astrophysics Data System (ADS)

    Loyd, R.; Walter, S.; Fenton, J.; Tubbesing, S.; Greene, M.

    2008-12-01

    In the rush to remove debris after a damaging earthquake, perishable data related to a wide range of impacts on the physical, built and social environments can be lost. The California Post-Earthquake Information Clearinghouse is intended to prevent this data loss by supporting the earth scientists, engineers, and social and policy researchers who will conduct fieldwork in the affected areas in the hours and days following the earthquake to study these effects. First called for by Governor Ronald Reagan following the destructive M6.5 San Fernando earthquake in 1971, the concept of the Clearinghouse has since been incorporated into the response plans of the National Earthquake Hazard Reduction Program (USGS Circular 1242). This presentation is intended to acquaint scientists with the purpose, functions, and services of the Clearinghouse. Typically, the Clearinghouse is set up in the vicinity of the earthquake within 24 hours of the mainshock and is maintained for several days to several weeks. It provides a location where field researchers can assemble to share and discuss their observations, plan and coordinate subsequent field work, and communicate significant findings directly to the emergency responders and to the public through press conferences. As the immediate response effort winds down, the Clearinghouse will ensure that collected data are archived and made available through "lessons learned" reports and publications that follow significant earthquakes. Participants in the quarterly meetings of the Clearinghouse include representatives from state and federal agencies, universities, NGOs and other private groups. Overall management of the Clearinghouse is delegated to the agencies represented by the authors above.

  12. Reverse engineering biological networks :applications in immune responses to bio-toxins.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Martino, Anthony A.; Sinclair, Michael B.; Davidson, George S.

    Our aim is to determine the network of events, or the regulatory network, that defines an immune response to a bio-toxin. As a model system, we are studying the T cell regulatory network triggered through tyrosine kinase receptor activation using a combination of pathway stimulation and time-series microarray experiments. Our approach is composed of five steps: (1) microarray experiments and data error analysis, (2) data clustering, (3) data smoothing and discretization, (4) network reverse engineering, and (5) network dynamics analysis and fingerprint identification. The technological outcome of this study is a suite of experimental protocols and computational tools that reverse engineer regulatory networks given gene expression data. The practical biological outcome of this work is an immune response fingerprint in terms of gene expression levels. Inferring regulatory networks from microarray data is a new field of investigation that is no more than five years old. To the best of our knowledge, this work is the first attempt that integrates experiments, error analyses, data clustering, inference, and network analysis to solve a practical problem. Our systematic approach of counting, enumeration, and sampling networks matching experimental data is new to the field of network reverse engineering. The resulting mathematical analyses and computational tools lead to new results on their own and should be useful to others who analyze and infer networks.

  13. Cascade Optimization for Aircraft Engines With Regression and Neural Network Analysis - Approximators

    NASA Technical Reports Server (NTRS)

    Patnaik, Surya N.; Guptill, James D.; Hopkins, Dale A.; Lavelle, Thomas M.

    2000-01-01

    The NASA Engine Performance Program (NEPP) can configure and analyze almost any type of gas turbine engine that can be generated through the interconnection of a set of standard physical components. In addition, the code can optimize engine performance by changing adjustable variables under a set of constraints. However, for engine cycle problems at certain operating points, the NEPP code can encounter difficulties: nonconvergence in the currently implemented Powell's optimization algorithm and deficiencies in the Newton-Raphson solver during engine balancing. A project was undertaken to correct these deficiencies. Nonconvergence was avoided through a cascade optimization strategy, and deficiencies associated with engine balancing were eliminated through neural network and linear regression methods. An approximation-interspersed cascade strategy was used to optimize the engine's operation over its flight envelope. Replacement of Powell's algorithm by the cascade strategy improved the optimization segment of the NEPP code. The performance of the linear regression and neural network methods as alternative engine analyzers was found to be satisfactory. This report considers two examples, a supersonic mixed-flow turbofan engine and a subsonic wave-rotor-topped engine, to illustrate the results, and it discusses insights gained from the improved version of the NEPP code.

  14. Induced earthquake magnitudes are as large as (statistically) expected

    USGS Publications Warehouse

    Van Der Elst, Nicholas; Page, Morgan T.; Weiser, Deborah A.; Goebel, Thomas; Hosseini, S. Mehran

    2016-01-01

    A major question for the hazard posed by injection-induced seismicity is how large induced earthquakes can be. Are their maximum magnitudes determined by injection parameters or by tectonics? Deterministic limits on induced earthquake magnitudes have been proposed based on the size of the reservoir or the volume of fluid injected. However, if induced earthquakes occur on tectonic faults oriented favorably with respect to the tectonic stress field, then they may be limited only by the regional tectonics and connectivity of the fault network. In this study, we show that the largest magnitudes observed at fluid injection sites are consistent with the sampling statistics of the Gutenberg-Richter distribution for tectonic earthquakes, assuming no upper magnitude bound. The data pass three specific tests: (1) the largest observed earthquake at each site scales with the log of the total number of induced earthquakes, (2) the order of occurrence of the largest event is random within the induced sequence, and (3) the injected volume controls the total number of earthquakes rather than the total seismic moment. All three tests point to an injection control on earthquake nucleation but a tectonic control on earthquake magnitude. Given that the largest observed earthquakes are exactly as large as expected from the sampling statistics, we should not conclude that these are the largest earthquakes possible. Instead, the results imply that induced earthquake magnitudes should be treated with the same maximum magnitude bound that is currently used to treat seismic hazard from tectonic earthquakes.
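
    Test (1), that the largest observed event scales with the log of the total number of induced earthquakes, follows directly from sampling an unbounded Gutenberg-Richter distribution; a minimal simulation (the b-value, minimum magnitude, and catalog sizes are assumed for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)
b, m_min = 1.0, 0.0
# Gutenberg-Richter: N(>m) ~ 10**(-b*(m - m_min)), so magnitudes above
# m_min are exponentially distributed with rate b*ln(10).
beta = b * np.log(10.0)

def expected_max(n, trials=2000):
    """Mean of the largest magnitude in a catalog of n GR-distributed events."""
    mags = m_min + rng.exponential(1.0 / beta, size=(trials, n))
    return mags.max(axis=1).mean()

emax = {n: expected_max(n) for n in (10, 100, 1000)}
# Each tenfold increase in catalog size raises the expected maximum
# magnitude by about 1/b, i.e. linearly in log10(N).
```

Under these sampling statistics the largest induced event grows with catalog size alone, with no need for an injection-controlled magnitude cap, which is the null hypothesis the observed site maxima are consistent with.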

  15. Social media networking: YouTube and search engine optimization.

    PubMed

    Jackson, Rem; Schneider, Andrew; Baum, Neil

    2011-01-01

    This is the third part of a three-part article on social media networking. This installment will focus on YouTube and search engine optimization. This article will explore the application of YouTube to the medical practice and how YouTube can help a practice retain its existing patients and attract new patients to the practice. The article will also describe the importance of search engine optimization and how to make your content appear on the first page of the search engines such as Google, Yahoo, and YouTube.

  16. Postseismic Transient after the 2002 Denali Fault Earthquake from VLBI Measurements at Fairbanks

    NASA Technical Reports Server (NTRS)

    MacMillan, Daniel; Cohen, Steven

    2004-01-01

    The VLBI antenna (GILCREEK) at Fairbanks, Alaska, routinely observes twice a week in operational networks and on additional days with other networks on a more irregular basis. The Fairbanks antenna is about 150 km north of the Denali fault and the earthquake epicenter. We examine the transient behavior of the estimated VLBI position during the year following the earthquake to determine how the rate of change of postseismic deformation has changed. This is compared with what is seen in the GPS site position series.

  17. POST Earthquake Debris Management - AN Overview

    NASA Astrophysics Data System (ADS)

    Sarkar, Raju

    Every year natural disasters, such as fires, floods, earthquakes, hurricanes, landslides, tsunami, and tornadoes, challenge various communities of the world. Earthquakes strike with varying degrees of severity and pose both short- and long-term challenges to public service providers. Earthquakes generate shock waves and displace the ground along fault lines. These seismic forces can bring down buildings and bridges in a localized area and damage buildings and other structures in a far wider area. Secondary damage from fires, explosions, and localized flooding from broken water pipes can increase the amount of debris. Earthquake debris includes building materials, personal property, and sediment from landslides. The management of this debris, as well as the waste generated during the reconstruction works, can pose significant challenges to national and local capacities. Debris removal is a major component of every post-earthquake recovery operation. Much of the debris generated by an earthquake is not hazardous. Soil, building material, and green waste, such as trees and shrubs, make up most of the volume of earthquake debris. These wastes not only create significant health problems and a very unpleasant living environment if not disposed of safely and appropriately, but also can subsequently impose economic burdens on the reconstruction phase. In practice, most of the debris may be either disposed of at landfill sites, reused as materials for construction or recycled into useful commodities. Therefore, the debris clearance operation should focus on the geotechnical engineering approach as an important post-earthquake issue to control the quality of the incoming flow of potential soil materials. In this paper, the importance of an emergency management perspective in this geotechnical approach that takes into account the different criteria related to the operation execution is proposed by highlighting the key issues concerning the handling of the construction

  18. Composite Earthquake Catalog of the Yellow Sea for Seismic Hazard Studies

    NASA Astrophysics Data System (ADS)

    Kang, S. Y.; Kim, K. H.; LI, Z.; Hao, T.

    2017-12-01

    The Yellow Sea (a.k.a. the West Sea in Korea) is an epicontinental and semi-closed sea located between Korea and China. Recent earthquakes in the Yellow Sea, including, but not limited to, the Seogyuckryulbi-do (1 April 2014, magnitude 5.1), Heuksan-do (21 April 2013, magnitude 4.9), and Baekryung-do (18 May 2013, magnitude 4.9) earthquakes, and the earthquake swarm in the Boryung offshore region in 2013, remind us of the seismic hazards affecting east Asia. This series of earthquakes in the Yellow Sea raised numerous questions. Unfortunately, both governments have trouble monitoring seismicity in the Yellow Sea because earthquakes occur beyond their seismic networks. For example, the epicenters of the magnitude 5.1 earthquake in the Seogyuckryulbi-do region in 2014 reported by the Korea Meteorological Administration and the China Earthquake Administration differed by approximately 20 km. This illustrates the difficulty of monitoring and locating earthquakes in the region, despite the huge effort made by both governments. A joint effort is required not only to overcome the limits posed by political boundaries and geographical location but also to study seismicity and the underground structures responsible. Although the well-established and developing seismic networks in Korea and China have provided an unprecedented amount and quality of seismic data, the high-quality catalog is limited to the most recent few decades, far shorter than a major earthquake cycle. The earthquake catalog from either country is also biased toward its own territory and cannot provide a complete picture of seismicity in the Yellow Sea. In order to understand seismic hazard and tectonics in the Yellow Sea, a composite earthquake catalog has been developed. We gathered earthquake information covering the last 5,000 years from various sources. There are good reasons to believe that some listings refer to the same earthquake but with different source parameters. 
We established criteria in order to provide consistent

  19. Earthquakes in the Classroom, Las Vegas, NV: The Nevada Educational Seismic Network (NESN)

    NASA Astrophysics Data System (ADS)

    Hopkins, J.; Snelson, C. M.; Zaragoza, S. A.; Smith, K.; Depolo, D.

    2002-12-01

    Geophysics is a term guaranteed to strike fear into the heart of the bravest high school science student. Using math to describe the earth can involve complex equations that can only be deciphered by enigmatic computer programs. But high school science students in the Las Vegas Valley have been given a unique opportunity to gather important research information while learning about geophysics, real-time data collection, and Internet communications in a less threatening environment. Three seismograph stations funded by the Department of Energy and the University of Nevada, Las Vegas have been installed in three different high schools in the Clark County School District. These three stations form a triangle in the Las Vegas Valley basin covering areas where the basin depths change significantly. The geophones are buried outside and a cable connects the sensors and GPS receiver to a digitizer on a local PC. The data is transmitted continuously in real-time via Internet communications protocols to the Seismic Explorer Monitoring Network. There it is available to all schools and to researchers who will analyze the data. These short-period geophones will record small local earthquakes and larger more distant events contributing to real-time seismic network operations in southern Nevada. Students at a school site are able to see live real-time data from other school stations as well as from seismograph stations in southern Nevada, the western US, and the world. Mentored by researchers at the University of Nevada, Reno and University of Nevada, Las Vegas, the teachers and students conduct simple waveform analysis to determine earthquake locations and magnitudes and operate the stations in this cooperative research effort. The goal of this partnership between secondary and university educational systems is to create a successful alliance that will benefit the research community as well as the classroom teacher and his/her students. Researchers will use the data collected

  20. Challenges to communicate risks of human-caused earthquakes

    NASA Astrophysics Data System (ADS)

    Klose, C. D.

    2014-12-01

    Awareness of natural hazards has been trending upward in recent years. In particular, this is true for earthquakes, which increase in frequency and magnitude in regions that normally do not experience seismic activity. In fact, one of the major concerns for many communities and businesses is that humans today seem to cause earthquakes through large-scale shale gas production, the dewatering and flooding of mines, and deep geothermal power production. Accordingly, without opposing any of these technologies, it should be a priority of earth scientists who research natural hazards to communicate earthquake risks. This presentation discusses the challenges that earth scientists face in properly communicating earthquake risks, in light of the fact that human-caused earthquakes are an environmental change affecting only some communities and businesses. Communication channels may range from research papers, books and classroom lectures to outreach events and programs, popular media events or even social media networks.

  1. Earthquakes trigger the loss of groundwater biodiversity

    NASA Astrophysics Data System (ADS)

    Galassi, Diana M. P.; Lombardo, Paola; Fiasca, Barbara; di Cioccio, Alessia; di Lorenzo, Tiziana; Petitta, Marco; di Carlo, Piero

    2014-09-01

    Earthquakes are among the most destructive natural events. The 6 April 2009, 6.3-Mw earthquake in L'Aquila (Italy) markedly altered the karstic Gran Sasso Aquifer (GSA) hydrogeology and geochemistry. The GSA groundwater invertebrate community is mainly comprised of small-bodied, colourless, blind microcrustaceans. We compared abiotic and biotic data from two pre-earthquake and one post-earthquake complete but non-contiguous hydrological years to investigate the effects of the 2009 earthquake on the dominant copepod component of the obligate groundwater fauna. Our results suggest that the massive earthquake-induced aquifer strain biotriggered a flushing of groundwater fauna, with a dramatic decrease in subterranean species abundance. Population turnover rates appeared to have crashed, no longer replenishing the long-standing communities from aquifer fractures, and the aquifer became almost totally deprived of animal life. Groundwater communities are notorious for their low resilience. Therefore, any major disturbance that negatively impacts survival or reproduction may lead to local extinction of species, most of them being the only survivors of phylogenetic lineages extinct at the Earth surface. Given the ecological key role played by the subterranean fauna as decomposers of organic matter and "ecosystem engineers", we urge more detailed, long-term studies on the effect of major disturbances to groundwater ecosystems.

  2. Earthquakes trigger the loss of groundwater biodiversity.

    PubMed

    Galassi, Diana M P; Lombardo, Paola; Fiasca, Barbara; Di Cioccio, Alessia; Di Lorenzo, Tiziana; Petitta, Marco; Di Carlo, Piero

    2014-09-03

    Earthquakes are among the most destructive natural events. The 6 April 2009, 6.3-Mw earthquake in L'Aquila (Italy) markedly altered the karstic Gran Sasso Aquifer (GSA) hydrogeology and geochemistry. The GSA groundwater invertebrate community is mainly comprised of small-bodied, colourless, blind microcrustaceans. We compared abiotic and biotic data from two pre-earthquake and one post-earthquake complete but non-contiguous hydrological years to investigate the effects of the 2009 earthquake on the dominant copepod component of the obligate groundwater fauna. Our results suggest that the massive earthquake-induced aquifer strain biotriggered a flushing of groundwater fauna, with a dramatic decrease in subterranean species abundance. Population turnover rates appeared to have crashed, no longer replenishing the long-standing communities from aquifer fractures, and the aquifer became almost totally deprived of animal life. Groundwater communities are notorious for their low resilience. Therefore, any major disturbance that negatively impacts survival or reproduction may lead to local extinction of species, most of them being the only survivors of phylogenetic lineages extinct at the Earth surface. Given the ecological key role played by the subterranean fauna as decomposers of organic matter and "ecosystem engineers", we urge more detailed, long-term studies on the effect of major disturbances to groundwater ecosystems.

  3. Catalog of earthquake hypocenters at Alaskan volcanoes: January 1, 1994 through December 31, 1999

    USGS Publications Warehouse

    Jolly, Arthur D.; Stihler, Scott D.; Power, John A.; Lahr, John C.; Paskievitch, John; Tytgat, Guy; Estes, Steve; Lockhart, Andrew B.; Moran, Seth C.; McNutt, Stephen R.; Hammond, William R.

    2001-01-01

    The Alaska Volcano Observatory (AVO), a cooperative program of the U.S. Geological Survey, the Geophysical Institute of the University of Alaska - Fairbanks, and the Alaska Division of Geological and Geophysical Surveys, has maintained a seismic monitoring program at potentially active volcanoes in Alaska since 1988 (Power and others, 1993; Jolly and others, 1996). The primary objectives of this program are the seismic surveillance of active, potentially hazardous, Alaskan volcanoes and the investigation of seismic processes associated with active volcanism. Between 1994 and 1999, the AVO seismic monitoring program underwent significant changes, with networks added at new volcanoes during each summer from 1995 through 1999. The existing network at Katmai - Valley of Ten Thousand Smokes (VTTS) was repaired in 1995, and new networks were installed at Makushin (1996), Akutan (1996), Pavlof (1996), Katmai - south (1996), Aniakchak (1997), Shishaldin (1997), Katmai - north (1998), Westdahl (1998), Great Sitkin (1999) and Kanaga (1999). These networks added to AVO's existing seismograph networks in the Cook Inlet area and increased the number of AVO seismograph stations from 46 sites and 57 components in 1994 to 121 sites and 155 components in 1999. The 1995–1999 seismic network expansion increased the number of volcanoes monitored in real-time from 4 to 22, including Mount Spurr, Redoubt Volcano, Iliamna Volcano, Augustine Volcano, Mount Snowy, Mount Griggs, Mount Katmai, Novarupta, Trident Volcano, Mount Mageik, Mount Martin, Aniakchak Crater, Pavlof Volcano, Mount Dutton, Isanotski volcano, Shishaldin Volcano, Fisher Caldera, Westdahl volcano, Akutan volcano, Makushin Volcano, Great Sitkin volcano, and Kanaga Volcano (see Figures 1-15). The network expansion also increased the number of earthquakes located from about 600 per year in 1994 and 1995 to about 3000 per year between 1997 and 1999. Highlights of the catalog period include: 1) a large volcanogenic seismic

  4. The Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology (SCEC/UseIT) Internship Program

    NASA Astrophysics Data System (ADS)

    Perry, S.; Jordan, T.

    2006-12-01

    Our undergraduate research program, SCEC/UseIT, an NSF Research Experience for Undergraduates site, provides software for earthquake researchers and educators, movies for outreach, and ways to strengthen the technical career pipeline. SCEC/UseIT motivates diverse undergraduates towards science and engineering careers through team-based research in the exciting field of earthquake information technology. UseIT provides the cross-training in computer science/information technology (CS/IT) and geoscience needed to make fundamental progress in earthquake system science. Our high and increasing participation of women and minority students is crucial given the nation's precipitous enrollment declines in CS/IT undergraduate degree programs, especially among women. UseIT also casts a "wider, farther" recruitment net that targets scholars interested in creative work but not traditionally attracted to summer science internships. Since 2002, SCEC/UseIT has challenged 79 students in three dozen majors from as many schools with difficult, real-world problems that require collaborative, interdisciplinary solutions. Interns design and engineer open-source software, creating increasingly sophisticated visualization tools (see "SCEC-VDO," session IN11), which are employed by SCEC researchers, in new curricula at the University of Southern California, and by outreach specialists who make animated movies for the public and the media. SCEC-VDO would be a valuable tool for research-oriented professional development programs.

  5. Monitoring of ULF (ultra-low-frequency) Geomagnetic Variations Associated with Earthquakes

    PubMed Central

    Hayakawa, Masashi; Hattori, Katsumi; Ohta, Kenji

    2007-01-01

    ULF (ultra-low-frequency) electromagnetic emission has recently been recognized as one of the most promising candidates for short-term earthquake prediction. This paper first reviews convincing evidence for the presence of ULF emissions before several large earthquakes. We then present our ULF monitoring network in the Tokyo area, describing our ULF magnetic sensors, and finally present some of the latest results on seismogenic electromagnetic emissions for recent large earthquakes obtained with sophisticated signal processing.
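
    As a toy illustration of narrow-band monitoring of this kind (synthetic data and a stand-in method, not the authors' processing chain), the power of a sampled magnetic time series near an ULF frequency can be estimated with a direct discrete Fourier transform:

```python
import math

# Toy sketch (synthetic data, not the authors' pipeline): estimate the power
# near a target frequency of a sampled magnetic time series via a direct DFT
# evaluated at that single frequency.
def band_power(x, fs, f0):
    n = len(x)
    re = sum(v * math.cos(2 * math.pi * f0 * k / fs) for k, v in enumerate(x))
    im = sum(v * math.sin(2 * math.pi * f0 * k / fs) for k, v in enumerate(x))
    return (re * re + im * im) / n

fs = 1.0                           # 1 sample per second
x = [2.0 * math.sin(2 * math.pi * 0.01 * k) + 0.5 * math.sin(2 * math.pi * 0.2 * k)
     for k in range(2000)]         # strong 0.01 Hz (ULF) + weaker 0.2 Hz component
print(band_power(x, fs, 0.01) > band_power(x, fs, 0.2))  # True: ULF band dominates
```

    Here the 0.01 Hz component dominates by construction; real seismogenic-emission studies must additionally separate such signals from geomagnetic pulsations and anthropogenic noise.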

  6. The August 1st, 2014 ( M w 5.3) Moderate Earthquake: Evidence for an Active Thrust Fault in the Bay of Algiers (Algeria)

    NASA Astrophysics Data System (ADS)

    Benfedda, A.; Abbes, K.; Bouziane, D.; Bouhadad, Y.; Slimani, A.; Larbes, S.; Haddouche, D.; Bezzeghoud, M.

    2017-03-01

    On August 1st, 2014, a moderate-sized earthquake struck the capital city of Algiers at 05:11:17.6 (GMT+1). The earthquake caused the death of six people and injured 420, mainly as a result of panic among the population. Following the main shock, we surveyed the aftershock activity using a portable (short-period) seismological network, installed from August 2nd, 2014 to August 21st, 2015. In this work, we first determined the main-shock epicenter using the accelerograms recorded by the Algerian accelerograph network (under the coordination of the National Center of Applied Research in Earthquake Engineering-CGS). We calculated the focal mechanism of the main shock by inverting the accelerograph waveforms in displacement, which yields a reverse fault with a slight right-lateral component of slip and a compression axis striking NNW-SSE. The obtained scalar seismic moment ( M o = 1.25 × 10^17 N m) corresponds to a moment magnitude of M w = 5.3. Second, analysis of the aftershock distribution obtained from the survey suggests an offshore ENE-WSW-trending, NNW-dipping causative active fault in the bay of Algiers, which likely corresponds to a previously unknown offshore segment of the Sahel active fault.
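
    The moment-to-magnitude conversion quoted above can be checked with the standard Hanks-Kanamori relation (a sketch; the constant 9.1 applies for M0 in newton-metres):

```python
import math

# Moment magnitude from scalar seismic moment (Hanks & Kanamori, 1979):
#   Mw = (2/3) * (log10(M0) - 9.1), with M0 in newton-metres.
def moment_magnitude(m0_newton_metres: float) -> float:
    return (2.0 / 3.0) * (math.log10(m0_newton_metres) - 9.1)

# Scalar moment reported for the 2014 Bay of Algiers main shock.
mw = moment_magnitude(1.25e17)
print(round(mw, 1))  # 5.3
```

    Plugging in the reported M0 = 1.25 × 10^17 N m indeed reproduces the stated Mw = 5.3.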

  7. Hydrogels for Engineering of Perfusable Vascular Networks

    PubMed Central

    Liu, Juan; Zheng, Huaiyuan; Poh, Patrina S. P.; Machens, Hans-Günther; Schilling, Arndt F.

    2015-01-01

    Hydrogels are commonly used biomaterials for tissue engineering. With their high water content, good biocompatibility and biodegradability, they resemble the natural extracellular environment and have been widely used as scaffolds for 3D cell culture and studies of cell biology. The possible size of such hydrogel constructs with embedded cells is limited by the cellular demand for oxygen and nutrients. For the fabrication of large and complex tissue constructs, vascular structures become necessary within the hydrogels to supply the encapsulated cells. In this review, we discuss the types of hydrogels that are currently used for the fabrication of constructs with embedded vascular networks, the key properties of hydrogels needed for this purpose, and current techniques to engineer perfusable vascular structures into these hydrogels. We then discuss directions for future research aimed at engineering vascularized tissue for implantation. PMID:26184185

  8. Fault structure in the Nepal Himalaya as illuminated by aftershocks of the 2015 Mw 7.8 Gorkha earthquake recorded by the local NAMASTE network

    NASA Astrophysics Data System (ADS)

    Ghosh, A.; Mendoza, M.; LI, B.; Karplus, M. S.; Nabelek, J.; Sapkota, S. N.; Adhikari, L. B.; Klemperer, S. L.; Velasco, A. A.

    2017-12-01

    The geometry of the Main Himalayan Thrust (MHT), which accommodates the majority of the plate motion between the Indian and Eurasian plates, has long been debated. Different models have been proposed, some of them significantly different from others. Obtaining a well-constrained geometry of the MHT is challenging, mainly because of the lack of high-quality data and the inherent low resolution and non-uniqueness of the models. We used a dense local seismic network - NAMASTE - to record and analyze a prolific aftershock sequence following the 2015 Mw 7.8 Gorkha earthquake, and to determine the geometry of the MHT constrained by precisely located aftershocks. We detected and located more than 15,000 aftershocks of the Gorkha earthquake using Hypoinverse and then relatively relocated them using the HypoDD algorithm. We selected about 7,000 particularly well-constrained earthquakes to analyze the geometry of the megathrust. They illuminate fault structure in this part of the Himalaya in unprecedented detail. The MHT shows two subhorizontal planes connected by a duplex structure, which is characterized by multiple steeply dipping planes. In addition, we used four large-aperture, continental-scale seismic arrays at teleseismic distances to backproject high-frequency seismic radiation, combining all arrays to significantly increase resolution and detectability. We imaged the rupture propagation of the mainshock, revealing complexity near the end of the rupture that may have helped arrest the rupture to the east. Furthermore, we continuously scanned teleseismic data for two weeks starting immediately after the mainshock to detect and locate aftershock activity using only the arrays. The spatial pattern of these aftershocks was similar to that of the existing global catalog built with conventional seismic networks and techniques. However, we detected more than twice as many aftershocks using the array technique compared to the global catalog including many

  9. 76 FR 46359 - Announcing the Nineteenth Public Meeting of the Crash Injury Research and Engineering Network...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-02

    ... Nineteenth Public Meeting of the Crash Injury Research and Engineering Network (CIREN) AGENCY: National... announces the Nineteenth Public Meeting of members of the Crash Injury Research and Engineering Network... of centers, medical and engineering. Medical centers are based at Level I Trauma Centers that admit...

  10. Variable neighborhood search for reverse engineering of gene regulatory networks.

    PubMed

    Nicholson, Charles; Goodwin, Leslie; Clark, Corey

    2017-01-01

    A new search heuristic, Divided Neighborhood Exploration Search, designed to be used with inference algorithms such as Bayesian networks to improve the reverse engineering of gene regulatory networks, is presented. The approach systematically moves through the search space to find topologies representative of gene regulatory networks that are more likely to explain microarray data. Empirical testing demonstrates that the novel method is superior to widely employed greedy search techniques in both the quality of the inferred networks and computational time.
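
    The paper's Divided Neighborhood Exploration Search itself is not reproduced here, but the baseline it improves on, greedy local search over network topologies, can be sketched as follows (the scoring function is a toy stand-in for a real Bayesian-network score such as BIC, and the "true" edge set is hypothetical):

```python
import itertools
import random

# Illustrative sketch only: generic greedy local search over directed-network
# topologies, the kind of baseline that DNES is reported to improve on.
N = 4  # number of genes
EDGES = [(i, j) for i, j in itertools.product(range(N), repeat=2) if i != j]

def toy_score(edge_set):
    # Hypothetical score: reward recovering a fixed "true" edge set,
    # penalize network complexity. A real system would score fit to data.
    true_edges = {(0, 1), (1, 2), (2, 3)}
    return len(edge_set & true_edges) - 0.1 * len(edge_set)

def greedy_search(seed=0):
    rng = random.Random(seed)
    current = set()
    improved = True
    while improved:
        improved = False
        # Neighborhood = all topologies reachable by toggling one edge.
        for e in rng.sample(EDGES, len(EDGES)):
            candidate = current ^ {e}
            if toy_score(candidate) > toy_score(current):
                current, improved = candidate, True
    return current

print(sorted(greedy_search()))  # [(0, 1), (1, 2), (2, 3)]
```

    With this toy score the greedy search recovers the planted edges; the point of heuristics like DNES is to explore the neighborhood more systematically so that realistic, rugged scoring landscapes do not trap the search in poor local optima.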

  11. Road Damage Following Earthquake

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Ground shaking triggered liquefaction in a subsurface layer of water-saturated sand, producing differential lateral and vertical movement in an overlying carapace of unliquefied sand and silt, which moved from right to left towards the Pajaro River. This mode of ground failure, termed lateral spreading, was a principal cause of liquefaction-related damage in the Oct. 17, 1989, Loma Prieta earthquake. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions, such as earthquakes, or when powders are handled in industrial processes. Mechanics of Granular Materials (MGM) experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grain materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. Credit: S.D. Ellen, U.S. Geological Survey

  12. The HayWired Earthquake Scenario—Earthquake Hazards

    USGS Publications Warehouse

    Detweiler, Shane T.; Wein, Anne M.

    2017-04-24

    The HayWired scenario is a hypothetical earthquake sequence that is being used to better understand hazards for the San Francisco Bay region during and after an earthquake of magnitude 7 on the Hayward Fault. The 2014 Working Group on California Earthquake Probabilities calculated that there is a 33-percent likelihood of a large (magnitude 6.7 or greater) earthquake occurring on the Hayward Fault within three decades. A large Hayward Fault earthquake will produce strong ground shaking, permanent displacement of the Earth’s surface, landslides, liquefaction (soils becoming liquid-like during shaking), and subsequent fault slip, known as afterslip, and earthquakes, known as aftershocks. The most recent large earthquake on the Hayward Fault occurred on October 21, 1868, and it ruptured the southern part of the fault. The 1868 magnitude-6.8 earthquake occurred when the San Francisco Bay region had far fewer people, buildings, and infrastructure (roads, communication lines, and utilities) than it does today, yet the strong ground shaking from the earthquake still caused significant building damage and loss of life. The next large Hayward Fault earthquake is anticipated to affect thousands of structures and disrupt the lives of millions of people. Earthquake risk in the San Francisco Bay region has been greatly reduced as a result of previous concerted efforts; for example, tens of billions of dollars of investment in strengthening infrastructure was motivated in large part by the 1989 magnitude 6.9 Loma Prieta earthquake. To build on efforts to reduce earthquake risk in the San Francisco Bay region, the HayWired earthquake scenario comprehensively examines the earthquake hazards to help provide the crucial scientific information that the San Francisco Bay region can use to prepare for the next large earthquake. The HayWired Earthquake Scenario—Earthquake Hazards volume describes the strong ground shaking modeled in the scenario and the hazardous movements of

  13. e-Science on Earthquake Disaster Mitigation by EUAsiaGrid

    NASA Astrophysics Data System (ADS)

    Yen, Eric; Lin, Simon; Chen, Hsin-Yen; Chao, Li; Huang, Bor-Shoh; Liang, Wen-Tzong

    2010-05-01

    Although earthquakes are not predictable at this moment, with the aid of accurate seismic wave propagation analysis we can simulate the potential hazards at all distances from possible fault sources by understanding the source rupture process during large earthquakes. By integrating a strong ground-motion sensor network, an earthquake data center, and seismic wave propagation analysis over the gLite e-Science infrastructure, we can gain much better knowledge of the impact and vulnerability associated with potential earthquake hazards. This application also demonstrates an e-Science way to investigate unknown earth structure. Regional integration of earthquake sensor networks can aid fast event reporting and accurate event data collection, federation of earthquake data centers entails consolidation and sharing of seismology and geology knowledge, and capability building in seismic wave propagation analysis improves the predictability of potential hazard impacts. With the gLite infrastructure and the EUAsiaGrid collaboration framework, earth scientists from Taiwan, Vietnam, the Philippines, and Thailand are working together to alleviate potential seismic threats by making use of Grid technologies and to support seismology research by e-Science. A cross-continental e-infrastructure, based on EGEE and EUAsiaGrid, has been established for seismic wave forward simulation and risk estimation. Both the computing challenge of seismic wave analysis among five European and Asian partners and the data challenge of data-center federation have been exercised and verified. A Seismogram-on-Demand service has also been developed for the automatic generation of seismograms at any sensor point for a specific epicenter. To ease access to all the services based on user workflows and retain maximal flexibility, a Seismology Science Gateway integrating data, computation, workflows, services, and user communities will be implemented based on typical use cases. In the future, extension of the

  14. Update on the Center for Engineering Strong Motion Data

    NASA Astrophysics Data System (ADS)

    Haddadi, H. R.; Shakal, A. F.; Stephens, C. D.; Oppenheimer, D. H.; Huang, M.; Leith, W. S.; Parrish, J. G.; Savage, W. U.

    2010-12-01

    The U.S. Geological Survey (USGS) and the California Geological Survey (CGS) established the Center for Engineering Strong-Motion Data (CESMD, Center) to provide a single access point for earthquake strong-motion records and station metadata from the U.S. and international strong-motion programs. The Center has operational facilities in Sacramento and Menlo Park, California, to receive, process, and disseminate records through the CESMD web site at www.strongmotioncenter.org. The Center currently is in the process of transitioning the COSMOS Virtual Data Center (VDC) to integrate its functions with those of the CESMD for improved efficiency of operations, and to provide all users with a more convenient one-stop portal to both U.S. and important international strong-motion records. The Center is working with COSMOS and international and U.S. data providers to improve the completeness of site and station information, which is needed to most effectively employ the recorded data. The goal of all these and other new developments is to continually improve access by the earthquake engineering community to strong-motion data and metadata world-wide. The CESMD and its Virtual Data Center (VDC) provide tools to map earthquakes and recording stations, to search raw and processed data, to view time histories and spectral plots, to convert data file formats, and to download data and a variety of information. The VDC is now being upgraded to convert the strong-motion data files from different seismic networks into a common standard tagged format in order to facilitate importing earthquake records and station metadata to the CESMD database. An important new feature being developed is the automatic posting of Internet Quick Reports at the CESMD web site. This feature will allow users, and emergency responders in particular, to view strong-motion waveforms and download records within a few minutes after an earthquake occurs. Currently the CESMD and its Virtual Data Center provide

  15. Analysing the Correlation between Social Network Analysis Measures and Performance of Students in Social Network-Based Engineering Education

    ERIC Educational Resources Information Center

    Putnik, Goran; Costa, Eric; Alves, Cátia; Castro, Hélio; Varela, Leonilde; Shah, Vaibhav

    2016-01-01

    Social network-based engineering education (SNEE) is designed and implemented as a model of Education 3.0 paradigm. SNEE represents a new learning methodology, which is based on the concept of social networks and represents an extended model of project-led education. The concept of social networks was applied in the real-life experiment,…

  16. Towards an Earthquake and Tsunami Early Warning in the Caribbean

    NASA Astrophysics Data System (ADS)

    Huerfano Moreno, V. A.; Vanacore, E. A.

    2017-12-01

    The Caribbean region (CR) has a documented history of large damaging earthquakes and tsunamis that have affected coastal areas, including the events of Jamaica in 1692, the Virgin Islands in 1867, Puerto Rico in 1918, the Dominican Republic in 1946, and Haiti in 2010. There is clear evidence that tsunamis have been triggered by large earthquakes that deformed the ocean floor around the Caribbean Plate boundary. The CR is monitored jointly by national, regional, and local seismic, geodetic, and sea-level networks. All monitoring institutions participate in the UNESCO ICG/Caribe EWS, whose purpose is to minimize loss of life and destruction of property and to mitigate catastrophic economic impacts by promoting local research, real-time (RT) sharing of earthquake, geodetic, and sea-level data, improved warning capabilities, and enhanced education and outreach strategies. Currently, more than 100 broadband seismic, 65 sea-level, and 50 high-rate GPS stations are available in real or near-real time. These real-time streams are used by local, regional, and worldwide detection and warning institutions to provide earthquake source parameters in a timely manner. Currently, any Caribbean event with a detected magnitude greater than 4.5 is evaluated by the TWC for tsunamigenic potential, and sea level is measured. The regional cooperation is motivated both by research interests and by geodetic, seismic, and tsunami hazard monitoring and warning. It will allow imaging of the tectonic structure of the Caribbean region at high resolution, which will consequently permit further understanding of the seismic source properties of moderate and large events and the application of this knowledge to civil-protection procedures. To reach its goals, the virtual network has been designed following the highest technical standards: BB sensors, 24-bit A/D converters with 140 dB dynamic range, and real-time telemetry. Here we will discuss the state of the PR

  17. Applicability of source scaling relations for crustal earthquakes to estimation of the ground motions of the 2016 Kumamoto earthquake

    NASA Astrophysics Data System (ADS)

    Irikura, Kojiro; Miyakoshi, Ken; Kamae, Katsuhiro; Yoshida, Kunikazu; Somei, Kazuhiro; Kurahashi, Susumu; Miyake, Hiroe

    2017-01-01

    A two-stage scaling relationship of the source parameters for crustal earthquakes in Japan has previously been constructed, in which source parameters obtained from the results of waveform inversion of strong motion data are combined with parameters estimated based on geological and geomorphological surveys. A three-stage scaling relationship was subsequently developed to extend scaling to crustal earthquakes with magnitudes greater than M w 7.4. The effectiveness of these scaling relationships was then examined based on the results of waveform inversion of 18 recent crustal earthquakes ( M w 5.4-6.9) that occurred in Japan since the 1995 Hyogo-ken Nanbu earthquake. The 2016 Kumamoto earthquake, with M w 7.0, was one of the largest earthquakes to occur since dense and accurate strong motion observation networks, such as K-NET and KiK-net, were deployed after the 1995 Hyogo-ken Nanbu earthquake. We examined the applicability of the scaling relationships of the source parameters of crustal earthquakes in Japan to the 2016 Kumamoto earthquake. The rupture area and asperity area were determined based on slip distributions obtained from waveform inversion of the 2016 Kumamoto earthquake observations. We found that the relationship between the rupture area and the seismic moment for the 2016 Kumamoto earthquake follows the second-stage scaling within one standard deviation ( σ = 0.14). The ratio of the asperity area to the rupture area for the 2016 Kumamoto earthquake is nearly the same as the ratios previously obtained for crustal earthquakes. Furthermore, we simulated the ground motions of this earthquake using a characterized source model consisting of strong motion generation areas (SMGAs) based on the empirical Green's function (EGF) method. The locations and areas of the SMGAs were determined through comparison between the synthetic and observed ground motions. The sizes of the SMGAs nearly coincided with the large-slip asperities. The synthetic
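
    As a rough numerical illustration of this kind of scaling check (the area coefficient below is a commonly cited illustrative value, not the paper's calibration), one can convert Mw 7.0 to scalar moment and evaluate a second-stage rupture-area relation of the form S ∝ M0^(1/2):

```python
import math

def m0_from_mw(mw: float) -> float:
    """Scalar seismic moment in N*m from moment magnitude (Hanks & Kanamori)."""
    return 10 ** (1.5 * mw + 9.1)

def rupture_area_second_stage(m0_nm: float) -> float:
    """Rupture area in km^2 from an assumed second-stage relation
    S = 4.24e-11 * M0^(1/2), with M0 in dyne*cm (1 N*m = 1e7 dyne*cm).
    The coefficient is an illustrative literature value, not this paper's fit."""
    m0_dyne_cm = m0_nm * 1e7
    return 4.24e-11 * math.sqrt(m0_dyne_cm)

m0 = m0_from_mw(7.0)  # 2016 Kumamoto mainshock, Mw 7.0
area = rupture_area_second_stage(m0)
print(f"M0 = {m0:.2e} N m, rupture area ~ {area:.0f} km^2")
```

    A consistency check of this kind, comparing the area predicted by the scaling stage against the area recovered from waveform inversion, is the essence of the test the abstract describes.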

  18. Seismic databases and earthquake catalogue of the Caucasus

    NASA Astrophysics Data System (ADS)

    Godoladze, Tea; Javakhishvili, Zurab; Tvaradze, Nino; Tumanova, Nino; Jorjiashvili, Nato; Gok, Rengen

    2016-04-01

    The Caucasus has a documented historical catalog stretching back to the beginning of the Christian era. Most of the largest historical earthquakes prior to the 19th century are assumed to have occurred on active faults of the Greater Caucasus. Important earthquakes include the Samtskhe earthquake of 1283 (Ms~7.0, Io=9); the Lechkhumi-Svaneti earthquake of 1350 (Ms~7.0, Io=9); and the Alaverdi earthquake of 1742 (Ms~6.8, Io=9). Two significant historical earthquakes that may have occurred within the Javakheti plateau in the Lesser Caucasus are the Tmogvi earthquake of 1088 (Ms~6.5, Io=9) and the Akhalkalaki earthquake of 1899 (Ms~6.3, Io=8-9). Large earthquakes that occurred in the Caucasus within the period of instrumental observation are: Gori 1920; Tabatskuri 1940; Chkhalta 1963; the 1991 Ms=7.0 Racha earthquake, the largest event ever recorded in the region; the 1992 M=6.5 Barisakho earthquake; and the 1988 Ms=6.9 Spitak, Armenia, earthquake (100 km south of Tbilisi), which killed over 50,000 people in Armenia. Recently, permanent broadband stations have been deployed across the region as part of various national networks (Georgia, ~25 stations; Azerbaijan, ~35 stations; Armenia, ~14 stations). The data from the last 10 years of observation provide an opportunity to perform modern, fundamental scientific investigations. A catalog of all instrumentally recorded earthquakes has been compiled by the IES (Institute of Earth Sciences, Ilia State University); it consists of more than 80,000 events. Together with colleagues from Armenia, Azerbaijan, and Turkey, we compiled a database of Caucasus seismic events. We improved event locations and calculated moment magnitudes for events above magnitude 4 in order to obtain a unified magnitude catalog for the region. The results will serve as input for seismic hazard assessment of the region.

  19. The Northern California Earthquake Management System: A Unified System From Realtime Monitoring to Data Distribution

    NASA Astrophysics Data System (ADS)

    Neuhauser, D.; Dietz, L.; Lombard, P.; Klein, F.; Zuzlewski, S.; Kohler, W.; Hellweg, M.; Luetgert, J.; Oppenheimer, D.; Romanowicz, B.

    2006-12-01

    The longstanding cooperation between the USGS Menlo Park and UC Berkeley's Seismological Laboratory for monitoring earthquakes and providing data to the research community is achieving a new level of integration. While station support and data collection for each network (NC, BK, BP) remain the responsibilities of the host institution, picks, codas, and amplitudes will be produced and shared between the data centers continuously. Thus, realtime earthquake processing, from triggering and locating through magnitude and moment tensor calculation and ShakeMap production, will take place independently at both locations, improving the robustness of event reporting in the Northern California Earthquake Management Center. Parametric data will also be exchanged with the Southern California Earthquake Management System to allow statewide earthquake detection and processing for further redundancy within the California Integrated Seismic Network (CISN). The database plays an integral part in this system, providing the coordination for event processing as well as the repository for event, instrument (metadata), and waveform information. The same master database serves realtime processing, data quality control and archival, and the data center that provides waveforms and earthquake data to users in the research community. Continuous waveforms from all BK, BP, and NC stations, event waveform gathers, and event information automatically become available at the Northern California Earthquake Data Center (NCEDC). Currently, the NCEDC collects and makes available over 4 TBytes of data per year from the NCEMC stations and other seismic networks, as well as from GPS and other geophysical instrumentation.

  20. Strong-motion observations of the M 7.8 Gorkha, Nepal, earthquake sequence and development of the N-SHAKE strong-motion network

    USGS Publications Warehouse

    Dixit, Amod; Ringler, Adam; Sumy, Danielle F.; Cochran, Elizabeth S.; Hough, Susan E.; Martin, Stacey; Gibbons, Steven; Luetgert, James H.; Galetzka, John; Shrestha, Surya; Rajaure, Sudhir; McNamara, Daniel E.

    2015-01-01

    We present and describe strong-motion data observations from the 2015 M 7.8 Gorkha, Nepal, earthquake sequence collected using existing and new Quake-Catcher Network (QCN) and U.S. Geological Survey NetQuakes sensors located in the Kathmandu Valley. A comparison of QCN data with waveforms recorded by a conventional strong-motion (NetQuakes) instrument validates the QCN data. We present preliminary analysis of spectral accelerations, and peak ground acceleration and velocity for earthquakes up to M 7.3 from the QCN stations, as well as preliminary analysis of the mainshock recording from the NetQuakes station. We show that mainshock peak accelerations were lower than expected and conclude the Kathmandu Valley experienced a pervasively nonlinear response during the mainshock. Phase picks from the QCN and NetQuakes data are also used to improve aftershock locations. This study confirms the utility of QCN instruments to contribute to ground-motion investigations and aftershock response in regions where conventional instrumentation and open-access seismic data are limited. Initial pilot installations of QCN instruments in 2014 are now being expanded to create the Nepal–Shaking Hazard Assessment for Kathmandu and its Environment (N-SHAKE) network.
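
    The peak-ground-motion quantities mentioned above can be sketched for an evenly sampled acceleration record as follows (the record below is synthetic; real strong-motion processing also involves baseline correction and filtering):

```python
import math

# Illustrative sketch (not the paper's processing chain): peak ground
# acceleration (PGA) and peak ground velocity (PGV) from an evenly
# sampled acceleration record, integrating by the trapezoidal rule.
def pga_pgv(accel, dt):
    pga = max(abs(a) for a in accel)
    vel, v = [0.0], 0.0
    for a0, a1 in zip(accel, accel[1:]):      # acceleration -> velocity
        v += 0.5 * (a0 + a1) * dt
        vel.append(v)
    pgv = max(abs(x) for x in vel)
    return pga, pgv

dt = 0.01                                     # 100 samples per second
accel = [1.5 * math.sin(2 * math.pi * k * dt) * math.exp(-0.3 * k * dt)
         for k in range(1000)]                # synthetic decaying 1 Hz pulse
pga, pgv = pga_pgv(accel, dt)
print(f"PGA = {pga:.2f} m/s^2, PGV = {pgv:.3f} m/s")
```

    The same peak-picking applies to QCN or NetQuakes traces once instrument response and baseline drift have been dealt with.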

  1. Generalized networking engineering: optimal pricing and routing in multiservice networks

    NASA Astrophysics Data System (ADS)

    Mitra, Debasis; Wang, Qiong

    2002-07-01

    One of the functions of network engineering is to allocate resources optimally to forecasted demand. We generalize the mechanism by incorporating price-demand relationships into the problem formulation, and optimizing pricing and routing jointly to maximize total revenue. We consider a network, with fixed topology and link bandwidths, that offers multiple services, such as voice and data, each having characteristic price elasticity of demand, and quality of service and policy requirements on routing. Prices, which depend on service type and origin-destination, determine demands, that are routed, subject to their constraints, so as to maximize revenue. We study the basic properties of the optimal solution and prove that link shadow costs provide the basis for both optimal prices and optimal routing policies. We investigate the impact of input parameters, such as link capacities and price elasticities, on prices, demand growth, and routing policies. Asymptotic analyses, in which network bandwidth is scaled to grow, give results that are noteworthy for their qualitative insights. Several numerical examples illustrate the analyses.
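As a toy illustration of the price-demand coupling described in this abstract (not the paper's multiservice model or its shadow-cost machinery), consider a single link with a constant-elasticity demand curve; the values of `a`, `eps`, and `capacity` below are invented:

```python
# Single-link pricing sketch with constant-elasticity demand d(p) = a * p**(-eps).
# When eps > 1, revenue R(p) = p * d(p) = a * p**(1 - eps) decreases with price,
# so the revenue-maximizing feasible price is the lowest price whose demand
# still fits on the link, i.e. the capacity-clearing price.
a, eps, capacity = 100.0, 1.5, 40.0    # assumed demand scale, elasticity, link bandwidth

# Invert d(p) = capacity to find the capacity-clearing price.
p_star = (a / capacity) ** (1.0 / eps)
d_star = a * p_star ** (-eps)          # demand at that price (== capacity)
revenue = p_star * d_star
```

Any price below `p_star` would attract more demand than the link can carry, and any price above it would earn less revenue, which is the intuition behind pricing off link shadow costs.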

  2. Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology (SCEC/UseIT): Towards the Next Generation of Internship

    NASA Astrophysics Data System (ADS)

    Perry, S.; Benthien, M.; Jordan, T. H.

    2005-12-01

The SCEC/UseIT internship program is training the next generation of earthquake scientists, with methods that can be adapted to other disciplines. UseIT interns work collaboratively, in multi-disciplinary teams, conducting computer science research that is needed by earthquake scientists. Since 2002, the UseIT program has welcomed 64 students, in some two dozen majors, at all class levels, from schools around the nation. Each summer's work is posed as a "Grand Challenge." The students then organize themselves into project teams, decide how to proceed, and pool their diverse talents and backgrounds. They have traditional mentors, who provide advice and encouragement, but they also mentor one another, and this has proved to be a powerful relationship. Most begin with fear that their Grand Challenge is impossible, and end with excitement and pride about what they have accomplished. The 22 UseIT interns in summer 2005 were primarily computer science and engineering majors, with others in geology, mathematics, English, digital media design, physics, history, and cinema. The 2005 Grand Challenge was to "build an earthquake monitoring system" to aid scientists who must visualize rapidly evolving earthquake sequences and convey information to emergency personnel and the public. Most UseIT interns were engaged in software engineering, bringing new datasets and functionality to SCEC-VDO (Virtual Display of Objects), 3D visualization software that was prototyped by interns the previous year using Java3D and an extensible, plug-in architecture based on the Eclipse Integrated Development Environment. Other UseIT interns used SCEC-VDO to make animated movies and experimented with imagery in order to communicate concepts and events in earthquake science. One movie-making project included the creation of an assessment to test the effectiveness of the movie's educational message. Finally, one intern created an interactive, multimedia presentation of the UseIT program.

  3. Performance of Real-time Earthquake Information System in Japan

    NASA Astrophysics Data System (ADS)

    Nakamura, H.; Horiuchi, S.; Wu, C.; Yamamoto, S.

    2008-12-01

    Horiuchi et al. (2005) developed a real-time earthquake information system (REIS) using Hi-net, a densely deployed nationwide seismic network consisting of about 800 stations operated by NIED, Japan. REIS determines hypocenter locations and earthquake magnitudes automatically within a few seconds after P waves arrive at the closest station, and calculates focal mechanisms within about 15 seconds. The hypocenter parameters are transferred immediately in XML format to a computer at the Japan Meteorological Agency (JMA), which started an EEW service for special users in June 2005. JMA also developed an EEW system using 200 stations, and the results of the two systems are merged. Among all first-issued EEW reports from both systems, REIS information accounts for about 80 percent. This study examines the rapidity and credibility of REIS by analyzing the 4050 earthquakes with magnitude larger than 3.0 that have occurred around the Japanese islands since 2005. REIS re-determines hypocenter parameters every second as waveform data are revised; here we discuss only the first reports. On rapidity, our results show that about 44 percent of the first reports are issued within 5 seconds after the P waves arrive at the closest station. Note that this 5-second window includes data-packaging and transmission delays of about 2 seconds. REIS waits until two stations detect P waves for events inside the network, but four stations for events outside it, so as to obtain reliable solutions. For earthquakes with hypocentral distance less than 100 km, 55 percent are warned within 5 seconds and 87 percent within 10 seconds. Most events with long delays are small and triggered by S-wave arrivals. About 80 percent of events differ in epicentral distance by less than 20 km from JMA's manually determined locations. Because of large lateral heterogeneity in seismic velocity, the difference depends
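The timing figures in the abstract above can be put in perspective with a back-of-the-envelope warning-time sketch. The velocities here are typical crustal values and the 5-second delay follows the abstract's detection-plus-telemetry budget; none of these are REIS's actual parameters:

```python
# Early-warning lead time: the warning is useful until the slower, damaging
# S waves arrive, so lead time = S travel time - (P travel time + delay).
vp, vs = 6.0, 3.5          # assumed P and S velocities, km/s
processing_delay = 5.0     # s, detection + telemetry (per the abstract)

def warning_time(dist_km):
    """Seconds between warning issuance and S-wave arrival at dist_km."""
    tp = dist_km / vp
    ts = dist_km / vs
    return ts - (tp + processing_delay)

lead_100km = warning_time(100.0)   # a few seconds of warning at 100 km
```

Note that `warning_time` goes negative at short epicentral distances: sites inside this "blind zone" are shaken by S waves before any warning can reach them, which is why the seconds saved by fast association matter.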

  4. The Colombia Seismological Network

    NASA Astrophysics Data System (ADS)

    Blanco Chia, J. F.; Poveda, E.; Pedraza, P.

    2013-05-01

The latest seismological equipment and data-processing instrumentation installed at the Colombia Seismological Network (RSNC) are described. System configuration, network operation, and data management are discussed, and the data quality and new seismological products are analyzed. The main purpose of the network is to monitor local seismicity, with special emphasis on seismic activity surrounding the Colombian Pacific and Caribbean coasts, for early warning in case a tsunami is produced by an earthquake. The Colombian territory is located at the northwestern corner of South America, where three tectonic plates converge: Nazca, Caribbean, and South American. The earthquakes resulting from the dynamics of these plates are continuously monitored by the network. In 2012, the RSNC registered an average of 67 events per day, of which a mean of 36 earthquakes could be located well. In 2010 the network also registered an average of 67 events per day, but only a mean of 28 could be located daily; the difference is due to the expansion of the network. The network comprises 84 stations equipped with various sensors: broadband (40 s, 120 s) seismometers, accelerometers, and short-period (1 s) sensors. Signals are transmitted continuously in real time to the Central Recording Center in Bogotá via satellite, telemetry, and the Internet; at some other stations the data must be collected in situ. Data are recorded and processed digitally using two systems, EARTHWORM and SEISAN, which can share information between them. The RSNC has also designed and implemented a web system to share its seismological data. This system uses tools such as JavaScript and Oracle and languages such as PHP to let users access the seismicity registered by the network in near-real time, and to download waveforms and technical details. The coverage

  5. NASA/DOD Aerospace Knowledge Diffusion Research Project. Paper 39: The role of computer networks in aerospace engineering

    NASA Technical Reports Server (NTRS)

    Bishop, Ann P.; Pinelli, Thomas E.

    1994-01-01

    This paper presents selected results from an empirical investigation into the use of computer networks in aerospace engineering. Such networks allow aerospace engineers to communicate with people and access remote resources through electronic mail, file transfer, and remote log-in. The study drew its subjects from private sector, government and academic organizations in the U.S. aerospace industry. Data presented here were gathered in a mail survey, conducted in Spring 1993, that was distributed to aerospace engineers performing a wide variety of jobs. Results from the mail survey provide a snapshot of the current use of computer networks in the aerospace industry, suggest factors associated with the use of networks, and identify perceived impacts of networks on aerospace engineering work and communication.

  6. USGS contributions to earthquake and tsunami monitoring in the Caribbean Region

    NASA Astrophysics Data System (ADS)

    McNamara, D.; Caribbean Project Team, U.; Partners, C.

    2007-05-01

    USGS Caribbean Project Team: Lind Gee, Gary Gyure, John Derr, Jack Odum, John McMillan, David Carver, Jim Allen, Susan Rhea, Don Anderson, Harley Benz. Caribbean Partners: Christa von Hillebrandt-Andrade - PRSN, Juan Payero - ISU-UASD, DR, Eduardo Camacho - UPAN, Panama, Lloyd Lynch - SRU, Gonzalo Cruz - UNAH, Honduras, Margaret Wiggins-Grandison - Jamaica, Judy Thomas - CERO Barbados, Sylvan McIntyre - NADMA Grenada, E. Bermingham - STRI. The magnitude-9 Sumatra-Andaman Islands earthquake of December 26, 2004, increased global awareness of the destructive hazard posed by earthquakes and tsunamis. In response to this tragedy, the US government undertook a collaborative project to improve earthquake and tsunami monitoring along a major portion of vulnerable coastal regions in the Caribbean Sea, the Gulf of Mexico, and the Atlantic Ocean. Seismically active areas of the Caribbean Sea region pose a tsunami risk for Caribbean islands, coastal areas along the Gulf of Mexico, and the Atlantic seaboard of North America. Nearly 100 tsunamis have been reported for the Caribbean region in the past 500 years, including 14 tsunamis reported in Puerto Rico and the U.S. Virgin Islands. Partners in this project include the United States Geological Survey (USGS), the Smithsonian Institution, the National Oceanic and Atmospheric Administration (NOAA), and several partner institutions in the Caribbean region. This presentation focuses on the deployment of nine broadband seismic stations, affiliated with the Global Seismograph Network (GSN), to monitor earthquake activity in the Caribbean region. By the end of 2006, five stations were transmitting data to the USGS National Earthquake Information Service (NEIS) and to regional partners through Puerto Rico Seismic Network (PRSN) Earthworm systems. The following stations are currently operating: SDDR - Sabaneta Dam, Dominican Republic; BBGH - Gun Hill, Barbados; GRGR - Grenville, Grenada; BCIP - Barro Colorado, Panama; TGUH - Tegucigalpa

  7. Earthquake hazards in the Alaska transportation corridors

    DOT National Transportation Integrated Search

    1983-03-01

    Based on observations made by modern seismographic networks since 1967, and taking into consideration historical records of large Alaskan earthquakes in the past, it is judged that the hazards faced by transportation corridors in different areas of t...

  8. Delineating Concealed Faults within Cogdell Oil Field via Earthquake Detection

    NASA Astrophysics Data System (ADS)

    Aiken, C.; Walter, J. I.; Brudzinski, M.; Skoumal, R.; Savvaidis, A.; Frohlich, C.; Borgfeldt, T.; Dotray, P.

    2016-12-01

Cogdell oil field, located within the Permian Basin of western Texas, has experienced several earthquakes ranging from magnitude 1.7 to 4.6, most of them recorded since 2006. Using the EarthScope USArray, Gan and Frohlich [2013] relocated some of these events and found a positive correlation between the timing of increased earthquake activity and increased CO2 injection volume. However, focal depths of these earthquakes are unknown because of the ~70 km station spacing of the USArray. Accurate focal depths, as well as new detections, can delineate subsurface faults and establish whether earthquakes are occurring in the shallow sediments or in the deeper basement. To delineate subsurface faults in this region, we first detect earthquakes not currently listed in the USGS catalog by applying continuous waveform template-matching algorithms to multiple seismic data sets. We utilize seismic data spanning 2006 to 2016, including data from the U.S. Geological Survey Global Seismographic Network, the USArray, and the Sweetwater, TX broadband and nodal array located 20-40 km away. The catalog of earthquakes enhanced by template matching reveals events that were well recorded by the large-N Sweetwater array, so we are experimenting with strategies for optimizing template matching using different configurations of many stations. Since earthquake activity in the Cogdell oil field is ongoing (a magnitude 2.6 event occurred on May 29, 2016), a temporary deployment of TexNet seismometers has been planned for the immediate vicinity of Cogdell oil field in August 2016. Results on focal depths and detection of small-magnitude events are pending this small local network deployment.
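The core of the template-matching detection described above is sliding a known event waveform along continuous data and flagging windows with high normalized cross-correlation. A minimal single-channel sketch on synthetic data (real studies use multi-station, multi-channel stacks over instrument-corrected traces):

```python
# Waveform template matching via normalized cross-correlation (NCC).
import numpy as np

def match_template(trace, template, threshold=0.8):
    """Return indices where NCC between template and trace exceeds threshold."""
    n = len(template)
    t = (template - template.mean()) / (template.std() * n)
    cc = np.empty(len(trace) - n + 1)
    for i in range(len(cc)):
        w = trace[i:i + n]
        std = w.std()
        cc[i] = 0.0 if std == 0 else np.dot(t, (w - w.mean()) / std)
    return np.where(cc >= threshold)[0], cc

# Synthetic demo: bury two copies of a wavelet "event" in noise.
rng = np.random.default_rng(0)
wavelet = np.sin(2 * np.pi * np.arange(50) / 10) * np.hanning(50)
trace = 0.1 * rng.standard_normal(2000)
for onset in (400, 1500):
    trace[onset:onset + 50] += wavelet

hits, cc = match_template(trace, wavelet, threshold=0.8)
```

Because NCC is amplitude-normalized, the same template detects repeats much smaller than the original event, which is how such catalogs are enhanced below the network's nominal detection threshold.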

  9. Space shuttle main engine fault detection using neural networks

    NASA Technical Reports Server (NTRS)

    Bishop, Thomas; Greenwood, Dan; Shew, Kenneth; Stevenson, Fareed

    1991-01-01

    A method for on-line Space Shuttle Main Engine (SSME) anomaly detection and fault typing using a feedback neural network is described. The method involves the computation of features representing time-variance of SSME sensor parameters, using historical test case data. The network is trained, using backpropagation, to recognize a set of fault cases. The network is then able to diagnose new fault cases correctly. An essential element of the training technique is the inclusion of randomly generated data along with the real data, in order to span the entire input space of potential non-nominal data.
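The backpropagation training loop the abstract relies on can be sketched in a few lines. This is a generic one-hidden-layer classifier on made-up "nominal vs. fault" feature clusters, purely illustrative of the technique, not the SSME network or its sensor features:

```python
# Minimal backpropagation sketch: a one-hidden-layer network learns to
# separate "nominal" feature vectors (near 0) from "fault" vectors (near 1).
import numpy as np

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 0.1, (50, 3)),     # 50 nominal cases
               rng.normal(1.0, 0.1, (50, 3))])    # 50 fault cases
y = np.array([0] * 50 + [1] * 50, dtype=float).reshape(-1, 1)

W1 = rng.normal(0, 0.5, (3, 5)); b1 = np.zeros(5)
W2 = rng.normal(0, 0.5, (5, 1)); b2 = np.zeros(1)
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for _ in range(500):
    h = sigmoid(X @ W1 + b1)          # forward pass, hidden layer
    p = sigmoid(h @ W2 + b2)          # forward pass, output
    dp = (p - y) / len(X)             # cross-entropy gradient w.r.t. output logit
    dh = (dp @ W2.T) * h * (1 - h)    # backpropagate through the hidden layer
    W2 -= lr * h.T @ dp; b2 -= lr * dp.sum(0)
    W1 -= lr * X.T @ dh; b1 -= lr * dh.sum(0)

accuracy = np.mean((p > 0.5) == (y > 0.5))
```

The abstract's point about mixing randomly generated data with real test cases corresponds here to widening the training set so the network sees inputs spanning the whole non-nominal region, not just historical faults.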

  10. 33 CFR 222.4 - Reporting earthquake effects.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., DEPARTMENT OF DEFENSE ENGINEERING AND DESIGN § 222.4 Reporting earthquake effects. (a) Purpose. This... structural integrity and operational adequacy of major Civil Works structures following the occurrence of...) Applicability. This regulation is applicable to all field operating agencies having Civil Works responsibilities...

  11. 2001 Bhuj, India, earthquake engineering seismoscope recordings and Eastern North America ground-motion attenuation relations

    USGS Publications Warehouse

    Cramer, C.H.; Kumar, A.

    2003-01-01

    Engineering seismoscope data collected at distances less than 300 km for the M 7.7 Bhuj, India, mainshock are compatible with ground-motion attenuation in eastern North America (ENA). The mainshock ground-motion data have been corrected to a common geological site condition using the factors of Joyner and Boore (2000) and a classification scheme of Quaternary or Tertiary sediments or rock. We then compare these data to ENA ground-motion attenuation relations. Despite uncertainties in recording method, geological site corrections, common tectonic setting, and the amount of regional seismic attenuation, the corrected Bhuj dataset agrees with the collective predictions by ENA ground-motion attenuation relations within a factor of 2. This level of agreement is within the dataset uncertainties and the normal variance for recorded earthquake ground motions.

  12. Towards coupled earthquake dynamic rupture and tsunami simulations: The 2011 Tohoku earthquake.

    NASA Astrophysics Data System (ADS)

    Galvez, Percy; van Dinther, Ylona

    2016-04-01

The 2011 Mw 9 Tohoku earthquake was recorded by a vast GPS and seismic network, giving seismologists an unprecedented chance to unveil complex rupture processes in a megathrust event. Seismic stations surrounding the Miyagi region (MYGH013) show two clearly distinct waveforms separated by 40 seconds, suggesting two rupture fronts, possibly due to slip reactivation caused by frictional melting and thermal fluid pressurization effects. We created a 3D dynamic rupture model to reproduce this rupture-reactivation pattern using SPECFEM3D (Galvez et al., 2014), based on a slip-weakening friction law with two sudden sequential stress drops (Galvez et al., 2015). Our model starts like an M 7-8 earthquake that barely breaks the trench; then, after 40 seconds, a second rupture emerges close to the trench, producing additional slip capable of fully breaking the trench and transforming the earthquake into a megathrust event. The synthetic seismograms agree roughly with seismic records along the coast of Japan, and the resulting sea floor displacements are in agreement with 1 Hz GPS displacements (GEONET). The simulated sea floor displacement reaches 8-10 meters of uplift close to the trench, which may explain the devastating tsunami that followed the earthquake. To investigate the impact of such a large uplift, we ran tsunami simulations with the slip-reactivation model, plugging the sea floor displacements into GeoClaw (a finite volume code for tsunami simulation; George and LeVeque, 2006). Our recent results compare well with the water heights at the DART tsunami buoys 21401, 21413, 21418, and 21419, and show the potential of using fully dynamic rupture results for earthquake-tsunami scenarios.
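A rough sense of the open-ocean tsunami propagation that such simulations resolve comes from the shallow-water long-wave speed, c = sqrt(g*h). The depth and buoy distance below are illustrative values only; actual runs like the GeoClaw comparison in the abstract solve the shallow-water equations over real bathymetry:

```python
# Long-wave tsunami speed and a rough travel-time estimate.
import math

g = 9.81                               # gravitational acceleration, m/s^2

def tsunami_speed(depth_m):
    """Shallow-water wave speed c = sqrt(g * h), in m/s."""
    return math.sqrt(g * depth_m)

c = tsunami_speed(4000.0)              # ~200 m/s at typical open-ocean depth
hours_to_buoy = (600e3 / c) / 3600     # time to a hypothetical buoy 600 km away
```

The strong depth dependence (waves slow and steepen over the shelf) is why the near-trench uplift computed by the dynamic rupture model matters so much for coastal impact.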

  13. Earthquakes

    MedlinePlus

    An earthquake is the sudden, rapid shaking of the earth, caused by the breaking and shifting of underground rock. Earthquakes can cause buildings to collapse and cause heavy ...

  14. A Bayesian Approach to Real-Time Earthquake Phase Association

    NASA Astrophysics Data System (ADS)

    Benz, H.; Johnson, C. E.; Earle, P. S.; Patton, J. M.

    2014-12-01

Real-time location of seismic events requires a robust and extremely efficient means of associating and identifying seismic phases with hypothetical sources. An association algorithm converts a series of phase arrival times into a catalog of earthquake hypocenters. The classical approach, based on time-space stacking of the locus of possible hypocenters for each phase arrival using the principle of acoustic reciprocity, has been in use for many years. One of the most significant problems that has emerged with this approach is the extreme variation in seismic station density throughout the global seismic network. To address this problem we have developed a novel Bayesian association algorithm, which treats the association problem as a dynamically evolving complex system of "many-to-many relationships." While the end result must be an array of one-to-many relations (one earthquake, many phases), the situation during the association process is quite different: both the evolving candidate hypocenters and the relationships between phases and all nascent hypocenters are many-to-many (many earthquakes, many phases). The computational framework we use is a responsive NoSQL graph database in which the earthquake-phase associations are represented as intersecting Bayesian learning networks. The approach directly addresses the network-inhomogeneity issue while allowing the inclusion of other kinds of data (e.g., seismic beams, station noise characteristics, priors on the estimated location of the seismic source) by representing the locus of intersecting hypothetical loci for a given datum as joint probability density functions.
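The classical stacking idea that this abstract builds on can be shown in miniature: each pick is back-projected to the origin times consistent with candidate source locations, and a location where all back-projected origin times agree is a hypocenter. This toy 1-D version (constant velocity, surface sources, invented station geometry) is illustrative only, not the Bayesian graph algorithm itself:

```python
# Back-projection phase association on a 1-D line of stations.
import numpy as np

V = 6.0                                           # assumed P velocity, km/s
stations = np.array([0.0, 40.0, 80.0, 120.0])     # station positions, km
true_x, true_t0 = 55.0, 10.0
picks = true_t0 + np.abs(stations - true_x) / V   # observed arrival times, s

grid = np.linspace(0, 120, 241)                   # candidate source positions
# Back-project every pick to an origin time for each candidate position.
origins = picks[None, :] - np.abs(stations[None, :] - grid[:, None]) / V
# At the true location all back-projected origin times coincide.
spread = origins.std(axis=1)
best = grid[np.argmin(spread)]                    # recovered source position
```

The station-density problem the abstract describes shows up here too: with few or one-sided stations, many grid points give nearly equal `spread`, which is the ambiguity the Bayesian many-to-many formulation is designed to handle.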

  15. Hydrogel Bioprinted Microchannel Networks for Vascularization of Tissue Engineering Constructs

    PubMed Central

    Bertassoni, Luiz E.; Cecconi, Martina; Manoharan, Vijayan; Nikkhah, Mehdi; Hjortnaes, Jesper; Cristino, Ana Luiza; Barabaschi, Giada; Demarchi, Danilo; Dokmeci, Mehmet R.; Yang, Yunzhi; Khademhosseini, Ali

    2014-01-01

    Vascularization remains a critical challenge in tissue engineering. The development of vascular networks within densely populated and metabolically functional tissues facilitate transport of nutrients and removal of waste products, thus preserving cellular viability over a long period of time. Despite tremendous progress in fabricating complex tissue constructs in the past few years, approaches for controlled vascularization within hydrogel based engineered tissue constructs have remained limited. Here, we report a three dimensional (3D) micromolding technique utilizing bioprinted agarose template fibers to fabricate microchannel networks with various architectural features within photo cross linkable hydrogel constructs. Using the proposed approach, we were able to successfully embed functional and perfusable microchannels inside methacrylated gelatin (GelMA), star poly (ethylene glycol-co-lactide) acrylate (SPELA), poly (ethylene glycol) dimethacrylate (PEGDMA) and poly (ethylene glycol) diacrylate (PEGDA) hydrogels at different concentrations. In particular, GelMA hydrogels were used as a model to demonstrate the functionality of the fabricated vascular networks in improving mass transport, cellular viability and differentiation within the cell-laden tissue constructs. In addition, successful formation of endothelial monolayers within the fabricated channels was confirmed. Overall, our proposed strategy represents an effective technique for vascularization of hydrogel constructs with useful applications in tissue engineering and organs on a chip. PMID:24860845

  16. Engineering education research: Impacts of an international network of female engineers on the persistence of Liberian undergraduate women studying engineering

    NASA Astrophysics Data System (ADS)

    Rimer, Sara; Reddivari, Sahithya; Cotel, Aline

    2015-11-01

As international efforts to educate and empower women continue to rise, engineering educators are in a unique position to be a part of these efforts by encouraging and supporting women across the world at the university level through STEM education and outreach. For the past two years, the University of Michigan has been part of a grassroots effort to encourage and support the persistence of female engineering students at the University of Liberia. This effort led to the implementation of a leadership camp this past August for Liberian engineering undergraduate women, meant: (i) to empower engineering students with the skills, support, and inspiration necessary to become successful and well-rounded engineering professionals in a global engineering market; and (ii) to strengthen the community of Liberian female engineers by building cross-cultural partnerships among students, resulting in an international network of women engineers. This session will present qualitative research findings on the impact of this grassroots effort on Liberian female students' persistence in engineering, and the future directions of this work.

  17. Housing Damage Following Earthquake

    NASA Technical Reports Server (NTRS)

    1989-01-01

An automobile lies crushed under the third story of this apartment building in the Marina District after the Oct. 17, 1989, Loma Prieta earthquake. The ground levels are no longer visible because of structural failure and sinking due to liquefaction. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions, such as earthquakes, or when powders are handled in industrial processes. Mechanics of Granular Materials (MGM) experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grain materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. Credit: J.K. Nakata, U.S. Geological Survey.

  18. Real-time earthquake monitoring: Early warning and rapid response

    NASA Technical Reports Server (NTRS)

    1991-01-01

    A panel was established to investigate the subject of real-time earthquake monitoring (RTEM) and suggest recommendations on the feasibility of using a real-time earthquake warning system to mitigate earthquake damage in regions of the United States. The findings of the investigation and the related recommendations are described in this report. A brief review of existing real-time seismic systems is presented with particular emphasis given to the current California seismic networks. Specific applications of a real-time monitoring system are discussed along with issues related to system deployment and technical feasibility. In addition, several non-technical considerations are addressed including cost-benefit analysis, public perceptions, safety, and liability.

  19. POST Earthquake Debris Management — AN Overview

    NASA Astrophysics Data System (ADS)

    Sarkar, Raju

Every year natural disasters, such as fires, floods, earthquakes, hurricanes, landslides, tsunamis, and tornadoes, challenge various communities of the world. Earthquakes strike with varying degrees of severity and pose both short- and long-term challenges to public service providers. Earthquakes generate shock waves and displace the ground along fault lines. These seismic forces can bring down buildings and bridges in a localized area and damage buildings and other structures in a far wider area. Secondary damage from fires, explosions, and localized flooding from broken water pipes can increase the amount of debris. Earthquake debris includes building materials, personal property, and sediment from landslides. The management of this debris, as well as the waste generated during reconstruction, can pose significant challenges to national and local capacities. Debris removal is a major component of every post-earthquake recovery operation. Much of the debris generated by an earthquake is not hazardous: soil, building material, and green waste, such as trees and shrubs, make up most of its volume. These wastes not only create significant health problems and a very unpleasant living environment if not disposed of safely and appropriately, but can also impose economic burdens on the reconstruction phase. In practice, most of the debris may be disposed of at landfill sites, reused as construction material, or recycled into useful commodities. Therefore, the debris clearance operation should focus on the geotechnical engineering approach as an important post-earthquake issue, to control the quality of the incoming flow of potential soil materials. In this paper, the importance of an emergency management perspective in this geotechnical approach, taking into account the different criteria related to the operation's execution, is proposed by highlighting the key issues concerning the handling of the construction

  20. 77 FR 46154 - Announcing the Twentieth Public Meeting of the Crash Injury Research and Engineering Network (CIREN)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-02

    ... Twentieth Public Meeting of the Crash Injury Research and Engineering Network (CIREN) AGENCY: National... announces the Twentieth Public Meeting of members of the Crash Injury Research and Engineering Network... of centers, medical and engineering. Medical centers are based at Level I Trauma Centers that admit...

  1. Napa earthquake: An earthquake in a highly connected world

    NASA Astrophysics Data System (ADS)

    Bossu, R.; Steed, R.; Mazet-Roux, G.; Roussel, F.

    2014-12-01

The Napa earthquake recently occurred close to Silicon Valley, which makes it a good candidate for studying what social networks, wearable devices, and website traffic analysis (flashsourcing) can tell us about the way eyewitnesses react to ground shaking. In the first part, we compare the ratios of people publishing tweets and of people visiting the EMSC (European-Mediterranean Seismological Centre) real-time information website in the first minutes following the earthquake with results published by Jawbone, which show that the proportion of people waking up depends (naturally) on the epicentral distance. The key question is whether the proportions of inhabitants tweeting or visiting the EMSC website are similar to the proportion of people waking up in the Jawbone data. If so, this supports the premise that all of these methods provide a reliable image of the relative ratio of people waking up. The second part of the study focuses on the reaction time for both Twitter and EMSC website access. We show, similarly to what was demonstrated for the Mineral, Virginia, earthquake (Bossu et al., 2014), that hit times on the EMSC website follow the propagation of the P waves, and that 2 minutes of website traffic are sufficient to determine the epicentral location of an earthquake on the other side of the Atlantic. We also compare these with the publication times of messages on Twitter. Finally, we check whether the numbers of tweets and of visitors relative to the number of inhabitants are correlated with the local level of shaking. Together these results will tell us whether the reaction of eyewitnesses to ground shaking, as observed through Twitter and the EMSC website, is tool-specific or reflects people's actual reactions.

  2. 8 March 2010 Elazığ-Kovancilar (Turkey) Earthquake: observations on ground motions and building damage

    USGS Publications Warehouse

    Akkar, Sinan; Aldemir, A.; Askan, A.; Bakir, S.; Canbay, E.; Demirel, I.O.; Erberik, M.A.; Gulerce, Z.; Gulkan, Polat; Kalkan, Erol; Prakash, S.; Sandikkaya, M.A.; Sevilgen, V.; Ugurhan, B.; Yenier, E.

    2011-01-01

    An earthquake of MW = 6.1 occurred in the Elazığ region of eastern Turkey on 8 March 2010 at 02:32:34 UTC. The United States Geological Survey (USGS) reported the epicenter of the earthquake as 38.873°N-39.981°E with a focal depth of 12 km. Forty-two people lost their lives and 137 were injured during the event. The earthquake was reported to be on the left-lateral strike-slip east Anatolian fault (EAF), which is one of the two major active fault systems in Turkey. Teams from the Earthquake Engineering Research Center of the Middle East Technical University (EERC-METU) visited the earthquake area in the aftermath of the mainshock. Their reconnaissance observations were combined with interpretations of recorded ground motions for completeness. This article summarizes observations on building and ground damage in the area and provides a discussion of the recorded motions. No significant observations in terms of geotechnical engineering were made.

  3. Earthquakes.

    ERIC Educational Resources Information Center

    Walter, Edward J.

    1977-01-01

    Presents an analysis of the causes of earthquakes. Topics discussed include (1) geological and seismological factors that determine the effect of a particular earthquake on a given structure; (2) description of some large earthquakes such as the San Francisco quake; and (3) prediction of earthquakes. (HM)

  4. Reconnaissance engineering geology of the Metlakatla area, Annette Island, Alaska, with emphasis on evaluation of earthquakes and other geologic hazards

    USGS Publications Warehouse

    Yehle, Lynn A.

    1977-01-01

    A program to study the engineering geology of most larger Alaska coastal communities and to evaluate their earthquake and other geologic hazards was started following the 1964 Alaska earthquake; this report about the Metlakatla area, Annette Island, is a product of that program. Field-study methods were of a reconnaissance nature, and thus the interpretations in the report are tentative. The landscape of the Metlakatla Peninsula, on which the city of Metlakatla is located, is characterized by a muskeg-covered terrane of very low relief. In contrast, most of the rest of Annette Island is composed of mountainous terrane with steep valleys and numerous lakes. During the Pleistocene Epoch the Metlakatla area was presumably covered by ice several times; glaciers smoothed the present Metlakatla Peninsula and deeply eroded valleys on the rest of Annette Island. The last major deglaciation was completed probably before 10,000 years ago. Rebound of the earth's crust, believed to be related to glacial melting, has caused land emergence at Metlakatla of at least 50 ft (15 m) and probably more than 200 ft (61 m) relative to present sea level. Bedrock in the Metlakatla area is composed chiefly of hard metamorphic rocks: greenschist and greenstone with minor hornfels and schist. Strike and dip of beds are generally variable and minor offsets are common. Bedrock is of late Paleozoic to early Mesozoic age. Six types of surficial geologic materials of Quaternary age were recognized: firm diamicton; emerged shore deposits; modern shore and delta deposits; alluvial deposits; very soft muskeg and other organic deposits; and firm to soft artificial fill. A combination map unit is composed of bedrock or diamicton. Geologic structure in southeastern Alaska is complex because, since at least early Paleozoic time, there have been several cycles of tectonic deformation that affected different parts of the region.
Southeastern Alaska is transected by numerous faults and possible faults that attest to major

  5. NASA/DOD Aerospace Knowledge Diffusion Research Project. Report 35: The use of computer networks in aerospace engineering

    NASA Technical Reports Server (NTRS)

    Bishop, Ann P.; Pinelli, Thomas E.

    1995-01-01

    This study used survey research to explore and describe the use of computer networks by aerospace engineers. The study population included 2000 randomly selected U.S. aerospace engineers and scientists who subscribed to Aerospace Engineering. A total of 950 usable questionnaires were received by the cutoff date of July 1994. Study results contribute to existing knowledge about both computer network use and the nature of engineering work and communication. We found that 74 percent of mail survey respondents personally used computer networks. Electronic mail, file transfer, and remote login were the most widely used applications. Networks were used less often than face-to-face interactions in performing work tasks, but about equally with reading and telephone conversations, and more often than mail or fax. Network use was associated with a range of technical, organizational, and personal factors: lack of compatibility across systems, cost, inadequate access and training, and unwillingness to embrace new technologies and modes of work appear to discourage network use. The greatest positive impacts from networking appear to be increases in the amount of accurate and timely information available, better exchange of ideas across organizational boundaries, and enhanced work flexibility, efficiency, and quality. Involvement with classified or proprietary data and type of organizational structure did not distinguish network users from nonusers. The findings can be used by people involved in the design and implementation of networks in engineering communities to inform the development of more effective networking systems, services, and policies.

  6. The HayWired Earthquake Scenario

    USGS Publications Warehouse

    Detweiler, Shane T.; Wein, Anne M.

    2017-04-24

    ForewordThe 1906 Great San Francisco earthquake (magnitude 7.8) and the 1989 Loma Prieta earthquake (magnitude 6.9) each motivated residents of the San Francisco Bay region to build countermeasures to earthquakes into the fabric of the region. Since Loma Prieta, bay-region communities, governments, and utilities have invested tens of billions of dollars in seismic upgrades and retrofits and replacements of older buildings and infrastructure. Innovation and state-of-the-art engineering, informed by science, including novel seismic-hazard assessments, have been applied to the challenge of increasing seismic resilience throughout the bay region. However, as long as people live and work in seismically vulnerable buildings or rely on seismically vulnerable transportation and utilities, more work remains to be done.With that in mind, the U.S. Geological Survey (USGS) and its partners developed the HayWired scenario as a tool to enable further actions that can change the outcome when the next major earthquake strikes. By illuminating the likely impacts to the present-day built environment, well-constructed scenarios can and have spurred officials and citizens to take steps that change the outcomes the scenario describes, whether used to guide more realistic response and recovery exercises or to launch mitigation measures that will reduce future risk.The HayWired scenario is the latest in a series of like-minded efforts to bring a special focus onto the impacts that could occur when the Hayward Fault again ruptures through the east side of the San Francisco Bay region as it last did in 1868. Cities in the east bay along the Richmond, Oakland, and Fremont corridor would be hit hardest by earthquake ground shaking, surface fault rupture, aftershocks, and fault afterslip, but the impacts would reach throughout the bay region and far beyond. The HayWired scenario name reflects our increased reliance on the Internet and telecommunications and also alludes to the

  7. MyShake: Initial Observations from a Global Smartphone Seismic Network

    NASA Astrophysics Data System (ADS)

    Kong, Q.; Allen, R. M.; Schreier, L.

    2016-12-01

    MyShake is a global smartphone seismic network that harnesses the power of crowdsourcing. It has two components: an Android application running on personal smartphones to detect earthquake-like motion, and a network detection algorithm that aggregates results from multiple smartphones to detect earthquakes. The MyShake application was released to the public on Feb 12th, 2016. Within the first 5 months, more than 200 earthquakes were recorded by smartphones all over the world, including events in Chile, Argentina, Mexico, Morocco, Greece, Nepal, New Zealand, Taiwan, Japan, and across North America. In this presentation, we will show the waveforms recorded by the smartphones for different earthquakes, and the evidence for using these data as a supplement to current earthquake early warning systems. We will also show the performance of the MyShake system during some earthquakes in the US. In short, the MyShake smartphone seismic network can be a complement to traditional seismic networks, and at the same time a standalone system in places with few seismic stations, helping to reduce earthquake hazards.
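    The two-tier idea (phone-level triggers aggregated by a network-level detector) can be sketched as below; the thresholds, the clustering rule, and the trigger coordinates are illustrative assumptions, not the published MyShake algorithm.

```python
# Toy network detector: declare an earthquake only when enough phones
# trigger close together in space and time, filtering single-phone
# false alarms (e.g. a dropped phone).
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    R = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def network_detect(triggers, window_s=10.0, radius_km=50.0, min_phones=4):
    """triggers: list of (t_seconds, lat, lon) from individual phones.
    True if any trigger has >= min_phones neighbours (itself included)
    within window_s seconds and radius_km kilometres."""
    for t0, la0, lo0 in triggers:
        near = [1 for t, la, lo in triggers
                if abs(t - t0) <= window_s
                and haversine_km(la0, lo0, la, lo) <= radius_km]
        if len(near) >= min_phones:
            return True
    return False

# Four phones triggering within seconds near Berkeley -> detection;
# a lone trigger far away 50 s later is ignored.
triggers = [(0.0, 37.87, -122.27), (1.2, 37.90, -122.30),
            (2.5, 37.80, -122.25), (3.1, 37.85, -122.40),
            (50.0, 34.05, -118.24)]
print(network_detect(triggers))  # → True
```

    Requiring several spatially clustered triggers is what lets a consumer-sensor network trade individual-sensor quality for redundancy.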

  8. New tools for non-invasive exploration of collagen network in cartilaginous tissue-engineered substitute.

    PubMed

    Henrionnet, Christel; Dumas, Dominique; Hupont, Sébastien; Stoltz, Jean François; Mainard, Didier; Gillet, Pierre; Pinzano, Astrid

    2017-01-01

    In tissue engineering approaches, the quality of the substitute is a key element determining its ability to treat cartilage defects. However, in clinical practice, evaluating the quality of a tissue-engineered cartilage substitute is not possible because the standard procedure, to date histology, is invasive. The aim of this work was to validate an innovative system, based on two-photon laser excitation adapted to an optical macroscope, to evaluate at the macroscopic scale the collagen network in cartilage tissue-engineered substitutes, in comparison with gold-standard histologic techniques and immunohistochemistry for visualizing type II collagen. This system made it possible to differentiate the quality of the collagen network between ITS and TGF-β1 treatments. Multiscale large-field imaging combined with multimodality approaches (SHG-TCSPC) at the macroscopic scale represents an innovative and non-invasive technique for monitoring the quality of the collagen network in cartilage tissue-engineered substitutes before in vivo implantation.

  9. Developmental engineering: a new paradigm for the design and manufacturing of cell-based products. Part II: from genes to networks: tissue engineering from the viewpoint of systems biology and network science.

    PubMed

    Lenas, Petros; Moos, Malcolm; Luyten, Frank P

    2009-12-01

    The field of tissue engineering is moving toward a new concept of "in vitro biomimetics of in vivo tissue development." In Part I of this series, we proposed a theoretical framework integrating the concepts of developmental biology with those of process design to provide the rules for the design of biomimetic processes. We named this methodology "developmental engineering" to emphasize that it is not the tissue but the process of in vitro tissue development that has to be engineered. To formulate the process design rules in a rigorous way that will allow a computational design, we should refer to mathematical methods to model the biological process taking place in vitro. Tissue functions cannot be attributed to individual molecules but rather to complex interactions between the numerous components of a cell and interactions between cells in a tissue that form a network. For tissue engineering to advance to the level of a technologically driven discipline amenable to well-established principles of process engineering, a scientifically rigorous formulation is needed of the general design rules so that the behavior of networks of genes, proteins, or cells that govern the unfolding of developmental processes could be related to the design parameters. Now that sufficient experimental data exist to construct plausible mathematical models of many biological control circuits, explicit hypotheses can be evaluated using computational approaches to facilitate process design. Recent progress in systems biology has shown that the empirical concepts of developmental biology that we used in Part I to extract the rules of biomimetic process design can be expressed in rigorous mathematical terms. This allows the accurate characterization of manufacturing processes in tissue engineering as well as the properties of the artificial tissues themselves. In addition, network science has recently shown that the behavior of biological networks strongly depends on their topology and has

  10. 78 FR 52605 - Announcing the Twenty First Public Meeting of the Crash Injury Research and Engineering Network...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-23

    ... First Public Meeting of the Crash Injury Research and Engineering Network (CIREN) AGENCY: National... announces the Twenty First Public Meeting of members of the Crash Injury Research and Engineering Network... of centers, medical and engineering. Medical centers are based at Level I Trauma Centers that admit...

  11. Seismic Hazard Analysis on a Complex, Interconnected Fault Network

    NASA Astrophysics Data System (ADS)

    Page, M. T.; Field, E. H.; Milner, K. R.

    2017-12-01

    In California, seismic hazard models have evolved from simple, segmented prescriptive models to much more complex representations of multi-fault and multi-segment earthquakes on an interconnected fault network. During the development of the 3rd Uniform California Earthquake Rupture Forecast (UCERF3), the prevalence of multi-fault ruptures in the modeling was controversial. Yet recent earthquakes, such as the Kaikōura earthquake, as well as new research on the potential of multi-fault ruptures (e.g., Nissen et al., 2016; Sahakian et al., 2017), have validated this approach. For large crustal earthquakes, multi-fault ruptures may be the norm rather than the exception. As datasets improve and we can view the rupture process at a finer scale, the interconnected, fractal nature of faults is revealed even by individual earthquakes. What is the proper way to model earthquakes on a fractal fault network? We show multiple lines of evidence that connectivity even in modern models such as UCERF3 may be underestimated, although clustering in UCERF3 mitigates some modeling simplifications. We need a methodology that can be applied equally well where the fault network is well mapped and where it is not: an extendable methodology that allows us to "fill in" gaps in the fault network and in our knowledge.
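    One way to make "connectivity on a fault network" concrete is to treat fault sections as graph nodes, link sections whose endpoints fall within a jump distance, and compute connected components. The sketch below uses invented planar geometry; UCERF3's rupture-plausibility rules include a maximum jump distance on the order of 5 km, among many other criteria not modeled here.

```python
# Connected components of a toy fault network via union-find.
import math
from itertools import combinations

def endpoint_dist_km(a, b):
    """Minimum distance between the endpoints of two sections,
    each given as ((x1, y1), (x2, y2)) in km."""
    return min(math.hypot(p[0] - q[0], p[1] - q[1]) for p in a for q in b)

def connected_components(sections, link_km=5.0):
    parent = list(range(len(sections)))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]   # path halving
            i = parent[i]
        return i
    for i, j in combinations(range(len(sections)), 2):
        if endpoint_dist_km(sections[i], sections[j]) <= link_km:
            parent[find(i)] = find(j)       # union the two components
    groups = {}
    for i in range(len(sections)):
        groups.setdefault(find(i), []).append(i)
    return list(groups.values())

# Sections 0 and 1 nearly touch (2 km step-over); section 2 is 20 km away.
sections = [((0.0, 0.0), (30.0, 0.0)),
            ((32.0, 0.0), (60.0, 0.0)),
            ((60.0, 20.0), (90.0, 20.0))]
comps = connected_components(sections)
print(sorted(sorted(c) for c in comps))  # → [[0, 1], [2]]
```

    Raising `link_km` merges more sections into single components, which is one crude way to see how assumed connectivity controls the size of possible multi-fault ruptures.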

  12. Characterization of the Virginia earthquake effects and source parameters from website traffic analysis

    NASA Astrophysics Data System (ADS)

    Bossu, R.; Lefebvre, S.; Mazet-Roux, G.; Roussel, F.

    2012-12-01

    This paper presents an after-the-fact study of the Virginia earthquake of 2011 August 23 using only the traffic observed on the EMSC website within minutes of its occurrence. Although the EMSC real-time information services remain poorly identified in the US, a traffic surge was observed immediately after the earthquake's occurrence. Such surges, known as flash crowds and commonly observed on our website after felt events within the Euro-Med region, are caused by eyewitnesses looking for information about the shaking they have just felt. EMSC developed an approach named flashsourcing to map the felt area and, in some circumstances, the regions affected by severe damage or network disruption. The felt area is mapped simply by locating the Internet Protocol (IP) addresses of the visitors to the website during these surges, while network disruption is detected through the instantaneous loss, at the time of the earthquake's occurrence, of existing Internet sessions originating from the impacted area. For the Virginia earthquake, which was felt at large distances, the effects of wave propagation are clearly observed. We show that the visits to our website are triggered by the P-wave arrival: the first visitors from a given locality reach our website 90 s after their location was shaken by the P waves. From a processing point of view, eyewitnesses can then be considered as ground motion detectors. By doing so, the epicentral location is determined through a simple dedicated location algorithm within 2 min of the earthquake's occurrence and with 30 km accuracy. The magnitude can be estimated in a similar time frame by using existing empirical relationships between the surface of the felt area and the magnitude. Concerning the effects of the earthquake, we check whether one can discriminate localities affected by strong shaking from web traffic analysis. This is actually the case. 
Localities affected by strong levels of shaking exhibit a higher ratio of visitors to the number
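    The "eyewitnesses as ground motion detectors" location idea can be sketched as a grid search over candidate epicenters, treating each locality's first website hit as a P arrival plus a fixed reaction delay. The 90 s delay, the constant P velocity, the flat-earth distance, and all coordinates below are simplifying assumptions, not the EMSC algorithm.

```python
# Grid-search epicenter location from first-website-hit times.
import math

VP_KM_S = 6.5     # assumed crustal P velocity
DELAY_S = 90.0    # assumed eyewitness reaction delay

def dist_km(p, q):
    """Small-region flat-earth distance between (lat, lon) points."""
    kx = 111.0 * math.cos(math.radians((p[0] + q[0]) / 2))
    return math.hypot(111.0 * (p[0] - q[0]), kx * (p[1] - q[1]))

def locate(hits, grid):
    """hits: list of ((lat, lon), first_hit_time_s). Returns the grid point
    minimising RMS misfit, solving for origin time analytically."""
    best, best_rms = None, float("inf")
    for g in grid:
        # residual = observed - (origin + travel + delay); best origin = mean
        tt = [t - dist_km(g, loc) / VP_KM_S - DELAY_S for loc, t in hits]
        origin = sum(tt) / len(tt)
        rms = (sum((x - origin) ** 2 for x in tt) / len(tt)) ** 0.5
        if rms < best_rms:
            best, best_rms = g, rms
    return best

# Synthetic check: hits generated from an assumed epicenter at (37.9, -77.9).
true_ep = (37.9, -77.9)
towns = [(38.5, -77.0), (37.0, -78.5), (39.0, -78.5), (37.5, -76.5)]
hits = [(t, DELAY_S + dist_km(true_ep, t) / VP_KM_S) for t in towns]
grid = [(37.0 + 0.1 * i, -79.0 + 0.1 * j) for i in range(21) for j in range(31)]
best = locate(hits, grid)
print(round(best[0], 1), round(best[1], 1))  # → 37.9 -77.9
```

    Because the reaction delay is assumed constant, it cancels into the origin time, which is why relative hit times alone can constrain the epicenter.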

  13. Earthquakes.

    ERIC Educational Resources Information Center

    Pakiser, Louis C.

    One of a series of general interest publications on science topics, the booklet provides those interested in earthquakes with an introduction to the subject. Following a section presenting an historical look at the world's major earthquakes, the booklet discusses earthquake-prone geographic areas, the nature and workings of earthquakes, earthquake…

  14. Engineering uses of physics-based ground motion simulations

    USGS Publications Warehouse

    Baker, Jack W.; Luco, Nicolas; Abrahamson, Norman A.; Graves, Robert W.; Maechling, Phillip J.; Olsen, Kim B.

    2014-01-01

    This paper summarizes validation methodologies focused on enabling ground motion simulations to be used with confidence in engineering applications such as seismic hazard analysis and dynamic analysis of structural and geotechnical systems. Numerical simulation of ground motion from large earthquakes, utilizing physics-based models of earthquake rupture and wave propagation, is an area of active research in the earth science community. Refinement and validation of these models require collaboration between earthquake scientists and engineering users, and testing/rating methodologies for simulated ground motions to be used with confidence in engineering applications. This paper provides an introduction to this field and an overview of current research activities being coordinated by the Southern California Earthquake Center (SCEC). These activities relate both to advancing the science and computational infrastructure needed to produce ground motion simulations and to engineering validation procedures. Current research areas and anticipated future achievements are also discussed.

  15. Multi-instrument observations of pre-earthquake transient signatures associated with 2015 M8.3 Chile earthquake

    NASA Astrophysics Data System (ADS)

    Ouzounov, D.; Pulinets, S. A.; Hernandez-Pajares, M.; Garcia-Rigo, A.; De Santis, A.; Pavón, J.; Liu, J. Y. G.; Chen, C. H.; Cheng, K. C.; Hattori, K.; Stepanova, M. V.; Romanova, N.; Hatzopoulos, N.; Kafatos, M.

    2016-12-01

    We are conducting a multiparameter validation study of lithosphere/atmosphere/ionosphere transient phenomena preceding major earthquakes, in particular for the M8.3 event of Sept 16th, 2015 in Chile. Our approach is based on simultaneously monitoring a series of different physical parameters from space: 1/ outgoing long-wavelength radiation (OLR, obtained from NOAA/AVHRR); 2/ ionospheric electron density variations via GPS Total Electron Content (GPS/TEC); and 3/ geomagnetic field and plasma density variations (Swarm); and from the ground: 4/ GPS crustal deformation and 5/ ground-based magnetometers. The time and location of the main shock were prospectively alerted in advance using the Multi Sensor Networking Approach (MSNA-LAIC). We retrospectively analyzed several physical observations characterizing the state of the lithosphere, atmosphere and ionosphere several days before, during and after the M8.3 earthquake in Illapel. Our continuous satellite monitoring of long-wave (LW) data over Chile shows a rapid increase of emitted radiation at the end of August 2015, and an anomaly in the atmosphere was detected at 19 LT on Sept 1st, 2015, over the water near the epicenter. On Sept 2nd, Swarm magnetic measurements showed an anomalous signature over the epicentral region. GPS/TEC analysis revealed an anomaly on Sept 14th; on the same day, degradation of the Equatorial Ionospheric Anomaly (EIA) and disappearance of its crests, as is characteristic of pre-dawn and early morning hours (11 LT), were observed. On Sept 16th, co-seismic ionospheric signatures consistent with circular acoustic-gravity waves and different shock-acoustic waves were also observed. GPS/TEC and deformation studies were computed from 48 GPS stations (2013-2015) of the National Seismological Center of Chile (CSN) GPS network. 
A transient deformation signal was observed a week in advance, correlated with ULF signal fluctuations from the closest ground-based magnetometers

  16. 2010 Chile Earthquake Aftershock Response

    NASA Astrophysics Data System (ADS)

    Barientos, Sergio

    2010-05-01

    1906? Since the number of M>7.0 aftershocks has been low, does the distribution of large-magnitude aftershocks differ from previous events of this size? What is the origin of the extensional-type aftershocks at shallow depths within the upper plate? The international seismological community (France, Germany, U.K., U.S.A.), in collaboration with the Chilean seismological community, responded by deploying a total of 140 portable seismic stations to record aftershocks. Combined with the Chilean permanent seismic network in the area, this results in 180 stations now in operation, recording continuously at 100 samples per second. The seismic equipment is a mix of accelerometers and short-period and broadband seismic sensors deployed along the entire length of the aftershock zone, which will record the aftershock sequence for three to six months. The collected seismic data will be merged and archived to produce an international data set open to the entire seismological community immediately after archiving. Each international group will submit its data as soon as possible in standard (miniSEED) format, with accompanying metadata, to the IRIS DMC, where the data will be merged into a combined data set available to individuals and other data centers. This will be by far the best-recorded aftershock sequence of a large megathrust earthquake. This outstanding international collaboration will provide an open data set for this important earthquake, as well as a model for future aftershock deployments around the world.

  17. Quantifying 10 years of Improvements in Earthquake and Tsunami Monitoring in the Caribbean and Adjacent Regions

    NASA Astrophysics Data System (ADS)

    von Hillebrandt-Andrade, C.; Huerfano Moreno, V. A.; McNamara, D. E.; Saurel, J. M.

    2014-12-01

    The magnitude-9.3 Sumatra-Andaman Islands earthquake of December 26, 2004, increased global awareness of the destructive hazard posed by earthquakes and tsunamis. Post-event assessments of global coastline vulnerability highlighted the Caribbean as a region of high hazard and risk that was poorly monitored. Nearly 100 tsunamis have been reported for the Caribbean and adjacent regions in the past 500 years, and they continue to pose a threat to its nations, to coastal areas along the Gulf of Mexico, and to the Atlantic seaboard of North and South America. Significant efforts to improve monitoring capabilities have been undertaken since then, including an expansion of the United States Geological Survey (USGS) Global Seismographic Network (GSN) (McNamara et al., 2006) and establishment of the United Nations Educational, Scientific and Cultural Organization (UNESCO) Intergovernmental Coordination Group (ICG) for the Tsunami and other Coastal Hazards Warning System for the Caribbean and Adjacent Regions (CARIBE EWS). The minimum performance standards it recommended for initial earthquake locations include: 1) earthquake detection within 1 minute; 2) a minimum magnitude threshold of M4.5; and 3) an initial hypocenter error of <30 km. In this study, we assess current compliance with these performance standards and model improvements in earthquake and tsunami monitoring capabilities in the Caribbean region since the first meeting of the UNESCO ICG-Caribe EWS in 2006. The three measures of network capability modeled in this study are: 1) minimum Mw detection threshold; 2) P-wave detection time of an automatic processing system; and 3) theoretical earthquake location uncertainty. By modeling these three measures of seismic network capability, we can optimize the distribution of ICG-Caribe EWS seismic stations and select an international network contributed from existing real-time broadband national networks in the region. Sea level monitoring improvements both offshore and
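    The first capability measure (minimum detectable magnitude) can be sketched as follows: the threshold at a grid point is set by the N-th closest station under an assumed distance-detection relation. The relation used here (M_min = -0.5 + 2 log10 d) and the station coordinates are placeholders, not the calibrated noise-based values such assessments use.

```python
# Toy minimum-detectable-magnitude map value for one grid point.
import math

def min_detectable_mag(point, stations, n_required=4):
    """Magnitude threshold at `point` (lat, lon), assuming an event must be
    seen by n_required stations and detectability at distance d km follows
    the placeholder relation M_min = -0.5 + 2 * log10(d)."""
    def dist_km(p, q):
        kx = 111.0 * math.cos(math.radians((p[0] + q[0]) / 2))
        return math.hypot(111.0 * (p[0] - q[0]), kx * (p[1] - q[1]))
    dists = sorted(dist_km(point, s) for s in stations)
    d = dists[n_required - 1]            # the N-th closest station controls
    return -0.5 + 2.0 * math.log10(max(d, 1.0))

# A dense network around the point yields a low threshold; a sparse,
# distant network yields a high one.
near = [(18.0 + dx, -66.0 + dy) for dx in (0.2, -0.2) for dy in (0.3, -0.3)]
far = [(25.0, -80.0), (10.0, -60.0), (20.0, -75.0), (15.0, -61.0)]
print(round(min_detectable_mag((18.0, -66.0), near), 1))
print(round(min_detectable_mag((18.0, -66.0), far), 1))
```

    Evaluating this over a grid of points is what produces the threshold maps used to compare candidate station distributions.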

  18. Dehydration-driven stress transfer triggers intermediate-depth earthquakes

    DOE PAGES

    Ferrand, Thomas P.; Hilairet, Nadège; Incel, Sarah; ...

    2017-05-15

    Intermediate-depth earthquakes (30–300 km) have been extensively documented within subducting oceanic slabs, but their mechanics remains enigmatic. Here we decipher the mechanism of these earthquakes by performing deformation experiments on dehydrating serpentinized peridotites (synthetic antigorite-olivine aggregates, minerals representative of subduction zones lithologies) at upper mantle conditions. At a pressure of 1.1 gigapascals, dehydration of deforming samples containing only 5 vol% of antigorite suffices to trigger acoustic emissions, a laboratory-scale analogue of earthquakes. At 3.5 gigapascals, acoustic emissions are recorded from samples with up to 50 vol% of antigorite. Experimentally produced faults, observed post-mortem, are sealed by fluid-bearing micro-pseudotachylytes. Microstructural observations demonstrate that antigorite dehydration triggered dynamic shear failure of the olivine load-bearing network. These laboratory analogues of intermediate-depth earthquakes demonstrate that little dehydration is required to trigger embrittlement. We propose an alternative model to dehydration-embrittlement in which dehydration-driven stress transfer, rather than fluid overpressure, causes embrittlement.

  19. Dehydration-driven stress transfer triggers intermediate-depth earthquakes

    PubMed Central

    Ferrand, Thomas P.; Hilairet, Nadège; Incel, Sarah; Deldicque, Damien; Labrousse, Loïc; Gasc, Julien; Renner, Joerg; Wang, Yanbin; Green II, Harry W.; Schubnel, Alexandre

    2017-01-01

    Intermediate-depth earthquakes (30–300 km) have been extensively documented within subducting oceanic slabs, but their mechanics remains enigmatic. Here we decipher the mechanism of these earthquakes by performing deformation experiments on dehydrating serpentinized peridotites (synthetic antigorite-olivine aggregates, minerals representative of subduction zones lithologies) at upper mantle conditions. At a pressure of 1.1 gigapascals, dehydration of deforming samples containing only 5 vol% of antigorite suffices to trigger acoustic emissions, a laboratory-scale analogue of earthquakes. At 3.5 gigapascals, acoustic emissions are recorded from samples with up to 50 vol% of antigorite. Experimentally produced faults, observed post-mortem, are sealed by fluid-bearing micro-pseudotachylytes. Microstructural observations demonstrate that antigorite dehydration triggered dynamic shear failure of the olivine load-bearing network. These laboratory analogues of intermediate-depth earthquakes demonstrate that little dehydration is required to trigger embrittlement. We propose an alternative model to dehydration-embrittlement in which dehydration-driven stress transfer, rather than fluid overpressure, causes embrittlement. PMID:28504263

  20. Dehydration-driven stress transfer triggers intermediate-depth earthquakes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ferrand, Thomas P.; Hilairet, Nadège; Incel, Sarah

    Intermediate-depth earthquakes (30–300 km) have been extensively documented within subducting oceanic slabs, but their mechanics remains enigmatic. Here we decipher the mechanism of these earthquakes by performing deformation experiments on dehydrating serpentinized peridotites (synthetic antigorite-olivine aggregates, minerals representative of subduction zones lithologies) at upper mantle conditions. At a pressure of 1.1 gigapascals, dehydration of deforming samples containing only 5 vol% of antigorite suffices to trigger acoustic emissions, a laboratory-scale analogue of earthquakes. At 3.5 gigapascals, acoustic emissions are recorded from samples with up to 50 vol% of antigorite. Experimentally produced faults, observed post-mortem, are sealed by fluid-bearing micro-pseudotachylytes. Microstructural observations demonstrate that antigorite dehydration triggered dynamic shear failure of the olivine load-bearing network. These laboratory analogues of intermediate-depth earthquakes demonstrate that little dehydration is required to trigger embrittlement. We propose an alternative model to dehydration-embrittlement in which dehydration-driven stress transfer, rather than fluid overpressure, causes embrittlement.

  1. Dehydration-driven stress transfer triggers intermediate-depth earthquakes

    NASA Astrophysics Data System (ADS)

    Ferrand, Thomas P.; Hilairet, Nadège; Incel, Sarah; Deldicque, Damien; Labrousse, Loïc; Gasc, Julien; Renner, Joerg; Wang, Yanbin; Green, Harry W., II; Schubnel, Alexandre

    2017-05-01

    Intermediate-depth earthquakes (30-300 km) have been extensively documented within subducting oceanic slabs, but their mechanics remains enigmatic. Here we decipher the mechanism of these earthquakes by performing deformation experiments on dehydrating serpentinized peridotites (synthetic antigorite-olivine aggregates, minerals representative of subduction zones lithologies) at upper mantle conditions. At a pressure of 1.1 gigapascals, dehydration of deforming samples containing only 5 vol% of antigorite suffices to trigger acoustic emissions, a laboratory-scale analogue of earthquakes. At 3.5 gigapascals, acoustic emissions are recorded from samples with up to 50 vol% of antigorite. Experimentally produced faults, observed post-mortem, are sealed by fluid-bearing micro-pseudotachylytes. Microstructural observations demonstrate that antigorite dehydration triggered dynamic shear failure of the olivine load-bearing network. These laboratory analogues of intermediate-depth earthquakes demonstrate that little dehydration is required to trigger embrittlement. We propose an alternative model to dehydration-embrittlement in which dehydration-driven stress transfer, rather than fluid overpressure, causes embrittlement.

  2. Geotechnical effects of the 2015 magnitude 7.8 Gorkha, Nepal, earthquake and aftershocks

    USGS Publications Warehouse

    Moss, Robb E. S.; Thompson, Eric M.; Kieffer, D Scott; Tiwari, Binod; Hashash, Youssef M A; Acharya, Indra; Adhikari, Basanta; Asimaki, Domniki; Clahan, Kevin B.; Collins, Brian D.; Dahal, Sachindra; Jibson, Randall W.; Khadka, Diwakar; Macdonald, Amy; Madugo, Chris L M; Mason, H Benjamin; Pehlivan, Menzer; Rayamajhi, Deepak; Uprety, Sital

    2015-01-01

    This article summarizes the geotechnical effects of the 25 April 2015 M 7.8 Gorkha, Nepal, earthquake and aftershocks, as documented by a reconnaissance team that undertook a broad engineering and scientific assessment of the damage and collected perishable data for future analysis. Brief descriptions are provided of ground shaking, surface fault rupture, landsliding, soil failure, and infrastructure performance. The goal of this reconnaissance effort, led by Geotechnical Extreme Events Reconnaissance, is to learn from earthquakes and mitigate hazards in future earthquakes.

  3. Methodologies for the assessment of earthquake-triggered landslides hazard. A comparison of Logistic Regression and Artificial Neural Network models.

    NASA Astrophysics Data System (ADS)

    García-Rodríguez, M. J.; Malpica, J. A.; Benito, B.

    2009-04-01

    In recent years, interest in landslide hazard assessment studies has increased substantially. They are appropriate for evaluation and mitigation plan development in landslide-prone areas. There are several techniques available for landslide hazard research at a regional scale. Generally, they can be classified in two groups: qualitative and quantitative methods. Most of qualitative methods tend to be subjective, since they depend on expert opinions and represent hazard levels in descriptive terms. On the other hand, quantitative methods are objective and they are commonly used due to the correlation between the instability factors and the location of the landslides. Within this group, statistical approaches and new heuristic techniques based on artificial intelligence (artificial neural network (ANN), fuzzy logic, etc.) provide rigorous analysis to assess landslide hazard over large regions. However, they depend on qualitative and quantitative data, scale, types of movements and characteristic factors used. We analysed and compared an approach for assessing earthquake-triggered landslides hazard using logistic regression (LR) and artificial neural networks (ANN) with a back-propagation learning algorithm. One application has been developed in El Salvador, a country of Central America where the earthquake-triggered landslides are usual phenomena. In a first phase, we analysed the susceptibility and hazard associated to the seismic scenario of the 2001 January 13th earthquake. We calibrated the models using data from the landslide inventory for this scenario. These analyses require input variables representing physical parameters to contribute to the initiation of slope instability, for example, slope gradient, elevation, aspect, mean annual precipitation, lithology, land use, and terrain roughness, while the occurrence or non-occurrence of landslides is considered as dependent variable. 
The results of the landslide susceptibility analysis are checked using landslide
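
    As an illustration of the statistical branch described above, the following minimal sketch fits a logistic-regression susceptibility model to a toy landslide inventory. The feature values, cell labels, and training settings are invented for illustration; the study's actual variables and calibration are far richer.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logistic(X, y, lr=0.1, epochs=2000):
    """Fit logistic-regression weights by stochastic gradient descent."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            p = sigmoid(sum(wj * xj for wj, xj in zip(w, xi)) + b)
            err = p - yi
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def susceptibility(w, b, cell):
    """Estimated probability that a terrain cell hosts a triggered landslide."""
    return sigmoid(sum(wj * xj for wj, xj in zip(w, cell)) + b)

# Toy inventory: each cell is [slope gradient, mean annual precipitation],
# both normalised to [0, 1]; label 1 where a landslide was mapped, 0 otherwise.
X = [[0.9, 0.8], [0.8, 0.9], [0.7, 0.7], [0.2, 0.1], [0.1, 0.3], [0.3, 0.2]]
y = [1, 1, 1, 0, 0, 0]

w, b = train_logistic(X, y)
print(susceptibility(w, b, [0.85, 0.85]) > 0.5)  # steep, wet cell: susceptible
print(susceptibility(w, b, [0.15, 0.15]) < 0.5)  # gentle, dry cell: not
```

    An ANN with back-propagation would replace the linear decision function with a hidden layer but would consume the same cell-by-cell input variables.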

  4. The Self-Organising Seismic Early Warning Information Network: Scenarios

    NASA Astrophysics Data System (ADS)

    Kühnlenz, F.; Fischer, J.; Eveslage, I.

    2009-04-01

    to do this job in a shorter time and with less manpower than common seismic stations require. We present here the graphical front-end of SOSEWIN and its use in different scenarios. It belongs to a management infrastructure based on GIS and database technologies, so coupling with existing infrastructures should be simplified. Connecting the domain expert's laptop running the management software to a SOSEWIN can be done via any node in the network (on-site access) or via a gateway node from a remote location over the internet. The scenarios focus on the needs of particular domain experts (seismologists or engineers) and include the planning of a network installation, support during the installation process, and testing of the installation. A further scenario addresses the monitoring of an already installed network, and a final scenario deals with the visualization of the alerting protocol as it detects an earthquake and issues an early warning.

  5. Hybrid wireless sensor network for rescue site monitoring after earthquake

    NASA Astrophysics Data System (ADS)

    Wang, Rui; Wang, Shuo; Tang, Chong; Zhao, Xiaoguang; Hu, Weijian; Tan, Min; Gao, Bowei

    2016-07-01

    This paper addresses the design of a low-cost, low-complexity, and rapidly deployable wireless sensor network (WSN) for rescue site monitoring after earthquakes. The system structure of the hybrid WSN is described. Specifically, the proposed hybrid WSN consists of two kinds of wireless nodes, i.e., the monitor node and the sensor node. The mechanism and the system configuration of the wireless nodes are then detailed. A transmission control protocol (TCP)-based request-response scheme is proposed to allow several monitor nodes to communicate with the monitoring center. User Datagram Protocol (UDP)-based image transmission algorithms with fast recovery have been developed to meet the requirements of in-time delivery of on-site monitor images. In addition, the monitor node contains a ZigBee module that is used to communicate with the sensor nodes, which are designed with small dimensions so that they can monitor the environment by sensing different physical properties in narrow spaces. By building a WSN from these wireless nodes, the monitoring center can display real-time monitor images of the monitoring area and visualize all collected sensor data on geographic information systems. Finally, field experiments were performed at the Training Base of Emergency Seismic Rescue Troops of China, and the experimental results demonstrate the feasibility and effectiveness of the monitoring system.
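
    The TCP-based request-response idea between a monitor node and the monitoring center can be sketched as below. This is a minimal stand-in using one server thread; the message format and the ACK convention are assumptions for illustration, not the paper's actual protocol.

```python
import socket
import threading

def monitoring_center(host="127.0.0.1"):
    """Minimal monitoring-centre server: answers one request with an ACK."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, 0))          # port 0: let the OS pick a free port
    srv.listen(1)

    def serve():
        conn, _ = srv.accept()
        with conn:
            req = conn.recv(1024)             # e.g. b"REQ image-frame 42"
            conn.sendall(b"ACK " + req)       # request-response: echo an ACK
    threading.Thread(target=serve, daemon=True).start()
    return srv, srv.getsockname()[1]

def monitor_node_request(port, payload):
    """Monitor-node side: send one request, wait for the centre's response."""
    with socket.create_connection(("127.0.0.1", port), timeout=5) as c:
        c.sendall(payload)
        return c.recv(1024)

srv, port = monitoring_center()
reply = monitor_node_request(port, b"REQ image-frame 42")
srv.close()
print(reply)
```

    A real deployment would keep the connection open, frame messages, and multiplex several monitor nodes; the UDP image path would run alongside this control channel.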

  6. Options for developing modernized geodetic datum for Nepal following the April 25, 2015 Mw7.8 Gorkha earthquake

    NASA Astrophysics Data System (ADS)

    Pearson, Chris; Manandhar, Niraj; Denys, Paul

    2017-09-01

    Along with the damage to buildings and infrastructure, the April 25, 2015 Mw7.8 Gorkha earthquake caused significant deformation over a large area of eastern Nepal, with displacements of over 2 m recorded in the vicinity of Kathmandu. Nepal currently uses a classical datum developed in 1984 by the Royal (UK) Engineers in collaboration with the Nepal Survey Department. It has served Nepal well; however, the recent earthquakes have provided an impetus for developing a semi-dynamic datum that will be based on ITRF2014 and have the capacity to correct for tectonic deformation. In the scenario we present here, the datum would be based on ITRF2014 with a reference epoch set some time after the end of the current sequence of earthquakes. The deformation model combines a grid of the secular velocity field with models of the Gorkha earthquake and the May 12, 2015 Mw7.3 aftershock. We have developed a preliminary velocity field by collating GPS-derived crustal velocities from four previous studies covering Nepal and adjacent parts of China and India and aligning them to the ITRF. Patches for the co-seismic part of the deformation from the Gorkha earthquake and the May 12, 2015 Mw7.3 aftershock are based on published dislocation models. High-order control would be a CORS network based around the existing Nepal GPS Array. Coordinates for existing lower-order control would be determined by readjusting existing survey measurements, and these would be combined with a series of new control stations spread throughout Nepal.
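
    The core semi-dynamic-datum operation, propagating an observed coordinate to the datum reference epoch using a secular velocity plus a coseismic patch, can be sketched as follows. The coordinates, epochs, velocity, and displacement below are purely illustrative numbers, and the real deformation model interpolates gridded velocities and dislocation-model patches rather than taking them as constants.

```python
def to_reference_epoch(obs_coord, obs_epoch, ref_epoch, velocity,
                       coseismic=(0.0, 0.0)):
    """Propagate an observed (east, north) coordinate in metres to the datum
    reference epoch: secular velocity (m/yr) times elapsed time, plus any
    coseismic displacement patch accumulated between the two epochs."""
    dt = ref_epoch - obs_epoch
    e = obs_coord[0] + velocity[0] * dt + coseismic[0]
    n = obs_coord[1] + velocity[1] * dt + coseismic[1]
    return (e, n)

# Point observed at epoch 2014.0, datum epoch 2020.0, moving 40 mm/yr
# north-east, displaced 1.5 m east by the earthquake (illustrative only).
e, n = to_reference_epoch((1000.0, 2000.0), 2014.0, 2020.0,
                          velocity=(0.04, 0.04), coseismic=(1.5, 0.0))
print(round(e, 2), round(n, 2))
```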

  7. The Lushan earthquake and the giant panda: impacts and conservation.

    PubMed

    Zhang, Zejun; Yuan, Shibin; Qi, Dunwu; Zhang, Mingchun

    2014-06-01

    Earthquakes not only cause great loss of human life and property, but also have profound effects on the Earth's biodiversity. The Lushan earthquake occurred on 20 April 2013, with a magnitude of 7.0 and an intensity of 9.0 degrees; its epicenter lies 17.0 km from the nearest giant panda distribution site recorded in the Third National Survey. Drawing on research into the Wenchuan earthquake (magnitude 8.0), which occurred approximately 5 years earlier, we briefly analyze the impacts of the Lushan earthquake on giant pandas and their habitat. An earthquake may interrupt ongoing behaviors of giant pandas and may also cause injury or death. In addition, an earthquake can damage conservation facilities for pandas and cause further habitat fragmentation and degradation. From a historical point of view, however, the impacts of human activities on giant pandas and their habitat may in fact far outweigh those of natural disasters such as earthquakes. Measures taken to promote habitat restoration and conservation-network reconstruction in earthquake-affected areas should be based on the requirements of giant pandas, not those of humans. © 2013 International Society of Zoological Sciences, Institute of Zoology/Chinese Academy of Sciences and Wiley Publishing Asia Pty Ltd.

  8. Estimating the Maximum Magnitude of Induced Earthquakes With Dynamic Rupture Simulations

    NASA Astrophysics Data System (ADS)

    Gilmour, E.; Daub, E. G.

    2017-12-01

    Seismicity in Oklahoma has been increasing sharply as a result of wastewater injection. The earthquakes, thought to be induced by changes in pore pressure due to fluid injection, nucleate along existing faults. Induced earthquakes currently dominate central and eastern United States seismicity (Keranen et al. 2016). Induced earthquakes have been occurring in the central US for only a short time; too few have therefore been observed in this region to know their maximum magnitude. This lack of knowledge means that large uncertainties exist in the seismic hazard for the central United States. While induced earthquakes follow the Gutenberg-Richter relation (van der Elst et al. 2016), it is unclear whether there are limits to their magnitudes. An estimate of the maximum magnitude of induced earthquakes is crucial for understanding their impact on seismic hazard. Other estimates of the maximum magnitude exist, but they are observational or statistical and cannot account for the possibility of larger events that have not yet been observed. Here, we take a physical approach to studying the maximum magnitude based on dynamic rupture simulations. We run a suite of two-dimensional rupture simulations to determine physically how ruptures propagate. The simulations use the known parameters of principal stress orientation and rupture location. We vary the other, unknown parameters of the rupture simulations to obtain a large number of simulation results reflecting different possible parameter sets, and use these results to train a neural network to emulate the rupture simulations. Then, using a Markov Chain Monte Carlo method to explore different combinations of parameters, the trained neural network is used to create synthetic magnitude-frequency distributions for comparison with the real earthquake catalog. This method allows us to find sets of parameters that are
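
    The Markov Chain Monte Carlo comparison against a catalog can be shown in miniature. The sketch below omits the rupture simulations and the neural-network surrogate entirely; it simply samples a Gutenberg-Richter b-value by Metropolis against a synthetic catalog, and every number in it is invented for illustration.

```python
import math
import random

random.seed(0)

# Synthetic catalog drawn from a Gutenberg-Richter (exponential) law, true b = 1.0
m_min = 2.0
true_beta = 1.0 * math.log(10.0)
catalog = [m_min + random.expovariate(true_beta) for _ in range(2000)]

n = len(catalog)
s = sum(m - m_min for m in catalog)   # sufficient statistic of the exponential law

def loglike(b):
    """Log-likelihood of a b-value given the catalog magnitudes above m_min."""
    beta = b * math.log(10.0)
    return n * math.log(beta) - beta * s

# Metropolis sampling of the b-value posterior (flat prior on b > 0)
b = 0.5
samples = []
for _ in range(5000):
    b_new = b + random.gauss(0.0, 0.05)
    if b_new > 0 and math.log(random.random()) < loglike(b_new) - loglike(b):
        b = b_new                     # accept the proposed move
    samples.append(b)

b_est = sum(samples[1000:]) / len(samples[1000:])   # discard burn-in
print(round(b_est, 2))  # should land near the true b-value of 1.0
```

    In the paper's setting, the likelihood step would instead compare a surrogate-generated synthetic magnitude-frequency distribution against the observed catalog.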

  9. Engine cylinder pressure reconstruction using crank kinematics and recurrently-trained neural networks

    NASA Astrophysics Data System (ADS)

    Bennett, C.; Dunne, J. F.; Trimby, S.; Richardson, D.

    2017-02-01

    A recurrent non-linear autoregressive with exogenous input (NARX) neural network is proposed, and a suitable fully-recurrent training methodology is adapted and tuned, for reconstructing cylinder pressure in multi-cylinder IC engines using measured crank kinematics. This type of indirect sensing is important for cost-effective closed-loop combustion control and for On-Board Diagnostics. The challenge addressed is to accurately predict cylinder pressure traces within the cycle under generalisation conditions, i.e. using data not previously seen by the network during training. This involves direct construction and calibration of a suitable inverse crank dynamic model, which, owing to singular behaviour at top-dead-centre (TDC), has proved difficult via physical model construction, calibration, and inversion. The NARX architecture is specialised and adapted to cylinder pressure reconstruction, using a fully-recurrent training methodology, which is needed because the alternatives are too slow and unreliable for practical network training on production engines. The fully-recurrent Robust Adaptive Gradient Descent (RAGD) algorithm is tuned initially using synthesised crank kinematics and then tested on real engine data to assess the reconstruction capability. Real data were obtained from a 1.125 l, 3-cylinder, in-line, direct-injection spark-ignition (DISI) engine with synchronised measurements of crank kinematics and cylinder pressure across a range of steady-state speed and load conditions. The paper shows that a RAGD-trained NARX network using both crank velocity and crank acceleration as input information provides fast and robust training. 
By using the optimum epoch identified during RAGD training, acceptably accurate cylinder pressures, and especially accurate location-of-peak-pressure, can be reconstructed robustly under generalisation conditions, making it the most practical NARX configuration and recurrent training methodology for use on production engines.
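
    The structure of a NARX predictor (output regressed on its own past values plus an exogenous input) can be shown with a much simpler stand-in: a linear ARX model with one lagged output and one exogenous term, fitted by gradient descent on synthetic data. This replaces the thesis's neural NARX network and RAGD training with the bare autoregressive-with-exogenous-input idea; all signals and coefficients below are invented.

```python
import math

# Synthetic "engine" data: a pressure-like output y driven by an exogenous
# crank-kinematics-like input u through y[t] = 0.8*y[t-1] + 0.5*u[t]
u = [math.sin(0.3 * t) for t in range(200)]
y = [0.0]
for t in range(1, 200):
    y.append(0.8 * y[t - 1] + 0.5 * u[t])

# One-step ARX predictor y_hat[t] = a*y[t-1] + b*u[t], fitted by gradient descent
a, b = 0.0, 0.0
lr = 0.05
for _ in range(500):
    for t in range(1, 200):
        err = (a * y[t - 1] + b * u[t]) - y[t]
        a -= lr * err * y[t - 1]
        b -= lr * err * u[t]

print(round(a, 2), round(b, 2))  # recovers the generating coefficients
```

    A NARX network generalises this by passing the lagged outputs and exogenous inputs through hidden neurons, which is what makes the fully-recurrent training question non-trivial.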

  10. An Adaptive Network-based Fuzzy Inference System for the detection of thermal and TEC anomalies around the time of the Varzeghan, Iran, (Mw = 6.4) earthquake of 11 August 2012

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, M.

    2013-09-01

    Anomaly detection is extremely important for forecasting the date, location, and magnitude of an impending earthquake. In this paper, an Adaptive Network-based Fuzzy Inference System (ANFIS) is proposed to detect thermal and Total Electron Content (TEC) anomalies around the time of the Varzeghan, Iran, (Mw = 6.4) earthquake that struck NW Iran on 11 August 2012. ANFIS is a well-known hybrid neuro-fuzzy network for modeling non-linear complex systems. The thermal and TEC anomalies detected by the proposed method are also compared with those observed by applying classical and intelligent methods, including interquartile, Auto-Regressive Integrated Moving Average (ARIMA), Artificial Neural Network (ANN), and Support Vector Machine (SVM) methods. The dataset, comprising Aqua-MODIS Land Surface Temperature (LST) night-time snapshot images and Global Ionospheric Maps (GIM), spans 62 days. If the difference between the value predicted by the ANFIS method and the observed value exceeds a pre-defined threshold, the observed precursor value, in the absence of non-seismic driving parameters, can be regarded as a precursory anomaly. For the two precursors, LST and TEC, the ANFIS method agrees very well with the other classical and intelligent methods implemented, which indicates that ANFIS is capable of detecting earthquake anomalies. The applied methods detected anomalous occurrences 1 and 2 days before the earthquake. This paper indicates that the detection of the thermal and TEC anomalies derives its credibility from the overall efficiency and potential of the five integrated methods.
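
    The thresholding rule, flagging days on which the observed value departs too far from the predicted one, can be sketched with an interquartile-range threshold on the residuals. The series and the injected anomaly below are synthetic, and the ANFIS prediction step itself is not shown.

```python
def detect_anomalies(observed, predicted, k=1.5):
    """Flag indices where |observed - predicted| exceeds k interquartile ranges,
    a simple stand-in for the pre-defined threshold rule described above."""
    residuals = sorted(o - p for o, p in zip(observed, predicted))
    n = len(residuals)
    q1, q3 = residuals[n // 4], residuals[(3 * n) // 4]
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return [i for i, (o, p) in enumerate(zip(observed, predicted))
            if not lo <= o - p <= hi]

# Smooth predicted TEC-like series with one injected anomalous day (index 10)
predicted = [10.0] * 20
observed = [10.0 + 0.1 * ((-1) ** i) for i in range(20)]
observed[10] += 3.0
print(detect_anomalies(observed, predicted))  # only the injected day is flagged
```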

  11. Detection and Mapping of the September 2017 Mexico Earthquakes Using DAS Fiber-Optic Infrastructure Arrays

    NASA Astrophysics Data System (ADS)

    Karrenbach, M. H.; Cole, S.; Williams, J. J.; Biondi, B. C.; McMurtry, T.; Martin, E. R.; Yuan, S.

    2017-12-01

    Fiber-optic distributed acoustic sensing (DAS) uses conventional telecom fibers for a wide variety of monitoring purposes. Fiber-optic arrays can be located along pipelines for leak detection, along borders and perimeters to detect and locate intruders, or along railways and roadways to monitor traffic and identify and manage incidents. DAS can also be used to monitor oil and gas reservoirs and to detect earthquakes. Because thousands of such arrays are deployed worldwide and acquire data continuously, they can be a valuable source of data for earthquake detection and location, and could potentially provide important information to earthquake early-warning systems. In this presentation, we show that DAS arrays in Mexico and the United States detected the M8.1 and M7.2 Mexico earthquakes in September 2017. At Stanford University, we have deployed a 2.4 km fiber-optic DAS array in a figure-eight pattern, with 600 channels spaced 4 meters apart. Data have been recorded continuously since September 2016, and over 800 earthquakes from across California have been detected and catalogued. Distant teleseismic events have also been recorded, including the two Mexican earthquakes. In Mexico, fiber-optic arrays attached to pipelines also detected these two events. Because of the length of these arrays and their proximity to the event locations, we can not only detect the earthquakes but also estimate their locations, potentially in near real time. We review the data recorded for these two events at Stanford and in Mexico, and we compare the waveforms recorded by the DAS arrays to those recorded by traditional earthquake sensor networks. Using the wide coverage provided by the pipeline arrays, we estimate the event locations. Such fiber-optic DAS networks can potentially play a role in earthquake early-warning systems, allowing actions to be taken to minimize the impact of an earthquake on critical infrastructure components. While many such fiber

  12. Effect of baseline corrections on response spectra for two recordings of the 1999 Chi-Chi, Taiwan, earthquake

    USGS Publications Warehouse

    Boore, David M.

    1999-01-01

    Displacements derived from the accelerogram recordings of the 1999 Chi-Chi, Taiwan, earthquake at stations TCU078 and TCU129 show drifts when only a simple baseline derived from the pre-event portion of the record is removed from the records. The appearance of the velocity and displacement records suggests that changes in the zero-level of the acceleration are responsible for these drifts. The source of the shifts in zero-level is unknown, but possibilities include tilting of the instruments or the response of the instruments to strong shaking. This note illustrates the effect on the velocity, displacement, and response spectra of several schemes for accounting for these baseline shifts. The most important conclusion for earthquake engineering purposes is that the response spectra for periods less than about 20 sec are unaffected by the baseline correction. The results suggest, however, that static displacements estimated from the instruments should be used with caution. Although limited to the analysis of only two recordings, the results may have more general significance, both for the many other recordings of this earthquake and for data that will be obtained in the future from similar high-quality accelerograph networks now being installed or soon to be installed in many parts of the world.
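
    Why a baseline shift matters can be shown in a minimal sketch: integrating an acceleration trace that carries a constant zero-level offset produces a linear drift in velocity, which removing a pre-event baseline eliminates. The offset, sampling interval, and window length are illustrative, and the note's actual correction schemes are more elaborate than this single mean removal.

```python
def remove_preevent_baseline(acc, n_pre):
    """Subtract the mean of the pre-event portion from an acceleration trace."""
    base = sum(acc[:n_pre]) / n_pre
    return [a - base for a in acc]

def integrate(series, dt):
    """Trapezoidal cumulative integration (acceleration -> velocity, etc.)."""
    out = [0.0]
    for i in range(1, len(series)):
        out.append(out[-1] + 0.5 * (series[i - 1] + series[i]) * dt)
    return out

# Synthetic record: a constant zero-level offset of 0.02 m/s^2, no true motion
dt, offset = 0.01, 0.02
acc = [offset] * 1000
vel_raw = integrate(acc, dt)                                 # drifts linearly
vel_fix = integrate(remove_preevent_baseline(acc, 200), dt)  # drift removed
print(round(vel_raw[-1], 3), round(vel_fix[-1], 3))
```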

  13. Large Earthquakes Disrupt Groundwater System by Breaching Aquitards

    NASA Astrophysics Data System (ADS)

    Wang, C. Y.; Manga, M.; Liao, X.; Wang, L. P.

    2016-12-01

    Changes to groundwater systems caused by large earthquakes are widely recognized. Some changes have been attributed to increases in the vertical permeability, but basic questions remain: How do increases in the vertical permeability occur? How frequently do they occur? How fast does the vertical permeability recover after the earthquake? Is there a quantitative measure for detecting the occurrence of aquitard breaching? Here we attempt to answer these questions by examining data accumulated over the past 15 years. Analyses of increased stream discharge and its geochemistry after large earthquakes provide evidence that the excess water originates from groundwater released from high elevations by a large increase in the vertical permeability. Water-level data from a dense network of clustered wells in a sedimentary basin near the epicenter of the 1999 M7.6 Chi-Chi earthquake in western Taiwan show that, while most confined aquifers remained confined after the earthquake, about 10% of the clustered wells show evidence of coseismic breaching of aquitards and a great increase in the vertical permeability. Water levels in wells without evidence of coseismic breaching of aquitards show similar tidal responses before and after the earthquake; wells with evidence of coseismic breaching, on the other hand, show distinctly different tidal responses before and after the earthquake, and their aquifers became hydraulically connected for many months thereafter. Breaching of aquitards by large earthquakes has significant implications for a number of societal issues, such as the safety of water resources, the security of underground waste repositories, and the production of oil and gas. The method demonstrated here may be used to detect the occurrence of aquitard breaching by large earthquakes in other seismically active areas.

  14. Neural Network and Response Surface Methodology for Rocket Engine Component Optimization

    NASA Technical Reports Server (NTRS)

    Vaidyanathan, Rajkumar; Papita, Nilay; Shyy, Wei; Tucker, P. Kevin; Griffin, Lisa W.; Haftka, Raphael; Fitz-Coy, Norman; McConnaughey, Helen (Technical Monitor)

    2000-01-01

    The goal of this work is to compare the performance of response surface methodology (RSM) and two types of neural networks (NN) in aiding the preliminary design of two rocket engine components. For a shear coaxial injector element, a data set of 45 training points and 20 test points was obtained from a semi-empirical model based on three design variables. Data for the supersonic turbine design are based on six design variables, with 76 training points and 18 test points obtained from simplified aerodynamic analysis. Several RS and NN are first constructed using the training data; the test data are then employed to select the best RS or NN. Quadratic and cubic response surfaces, radial basis neural networks (RBNN), and back-propagation neural networks (BPNN) are compared. Two-layered RBNN are generated using two different training algorithms, namely solverbe and solverb. A two-layered BPNN is generated with a Tan-Sigmoid transfer function. Various issues related to the training of the neural networks are addressed, including the number of neurons, error goals, spread constants, and the accuracy of different models in representing the design space. A search for the optimum design is carried out using a standard gradient-based optimization algorithm over the response surfaces represented by the polynomials and the trained neural networks. Usually the cubic polynomial performs better than the quadratic polynomial, but exceptions have been noticed. Among the NN choices, the RBNN designed using solverb yields more consistent performance for both engine components considered. The training of RBNN is easier, as it requires only linear regression; this, coupled with its consistent performance, suggests it can serve as an optimization strategy for engineering design problems.
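
    An exact-interpolation radial basis network of the kind built by MATLAB's solverbe (one Gaussian neuron per training point, with output weights obtained by a linear solve) can be sketched on toy one-dimensional data. The spread constant and the data are invented for illustration.

```python
import math

def gauss_solve(A, b):
    """Solve A x = b by Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def rbf_fit(xs, ys, spread=1.0):
    """One Gaussian neuron per training point; weights from a linear solve."""
    phi = [[math.exp(-((xi - cj) / spread) ** 2) for cj in xs] for xi in xs]
    return gauss_solve(phi, ys)

def rbf_eval(xs, w, x, spread=1.0):
    return sum(wj * math.exp(-((x - cj) / spread) ** 2) for wj, cj in zip(w, xs))

# Toy 1-D "design variable -> performance" data from a quadratic response
xs = [0.0, 0.5, 1.0, 1.5, 2.0]
ys = [x * (2.0 - x) for x in xs]
w = rbf_fit(xs, ys)
print(round(rbf_eval(xs, w, 1.0), 6))  # exact interpolation at a training point
```

    Only a linear solve is needed, which is why RBNN training is easier than back-propagation; a gradient-based optimizer can then search the resulting smooth surface for the optimum design.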

  15. The Red Atrapa Sismos (Quake Catcher Network in Mexico): assessing performance during large and damaging earthquakes.

    USGS Publications Warehouse

    Dominguez, Luis A.; Yildirim, Battalgazi; Husker, Allen L.; Cochran, Elizabeth S.; Christensen, Carl; Cruz-Atienza, Victor M.

    2015-01-01

    Each volunteer computer monitors ground motion and communicates using the Berkeley Open Infrastructure for Network Computing (BOINC; Anderson, 2004). Using a standard short-term average/long-term average (STA/LTA) algorithm (Earle and Shearer, 1994; Cochran, Lawrence, Christensen, and Chung, 2009; Cochran, Lawrence, Christensen, and Jakka, 2009), the volunteer computer and sensor systems detect abrupt changes in the acceleration recordings. Each time a possible trigger is declared, a small package of information containing sensor and ground-motion information is streamed to one of the QCN servers (Chung et al., 2011). Triggers correlated in space and time are then processed by the QCN server to look for potential earthquakes.
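
    The STA/LTA idea can be sketched in a few lines: compare a short-term average of signal energy against a long-term average and declare a trigger when their ratio exceeds a threshold. The window lengths, threshold, and synthetic trace below are illustrative, not QCN's operational settings.

```python
import math

def sta_lta_trigger(acc, n_sta=5, n_lta=50, threshold=4.0):
    """Indices where the short-term/long-term average energy ratio exceeds
    the trigger threshold."""
    energy = [a * a for a in acc]
    triggers = []
    for i in range(n_lta, len(energy)):
        sta = sum(energy[i - n_sta:i]) / n_sta
        lta = sum(energy[i - n_lta:i]) / n_lta
        if lta > 0 and sta / lta > threshold:
            triggers.append(i)
    return triggers

# Low-amplitude deterministic background with strong shaking from sample 100
acc = [0.01 * math.sin(0.5 * i) for i in range(200)]
for i in range(100, 120):
    acc[i] += 0.5

trig = sta_lta_trigger(acc)
print(trig[0])  # first trigger falls just after the onset of shaking
```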

  16. The Puerto Rico Seismic Network Broadcast System: A user friendly GUI to broadcast earthquake messages, to generate shakemaps and to update catalogues

    NASA Astrophysics Data System (ADS)

    Velez, J.; Huerfano, V.; von Hillebrandt, C.

    2007-12-01

    The Puerto Rico Seismic Network (PRSN) has historically provided locations and magnitudes for earthquakes in the Puerto Rico and Virgin Islands (PRVI) region. PRSN is the reporting authority for the region bounded by latitudes 17.0N to 20.0N and longitudes 63.5W to 69.0W. The main objective of the PRSN is to record, process, analyze, provide information on, and research local, regional, and teleseismic earthquakes, delivering high-quality data and information to respond to the needs of the emergency management, academic, and research communities and the general public. The PRSN runs Earthworm software (Johnson et al., 1995) to acquire waveforms and write them to disk for permanent archival. Automatic locations and alerts are generated for events in Puerto Rico, the Intra-America Seas, and the Atlantic by the EarlyBird system (Whitmore and Sokolowski, 2002), which monitors PRSN stations as well as some 40 additional stations run by networks operating in North, Central and South America and other sites in the Caribbean. PRDANIS (Puerto Rico Data Analysis and Information System) software, developed by PRSN, supports manual locations and analyst review of automatic locations of events within the PRSN area of responsibility (AOR), using all the broadband, strong-motion, and short-period waveforms. Rapidly available information on the geographic distribution of ground shaking, in relation to the population and infrastructure at risk, can assist emergency response communities in efficient and optimized allocation of resources following a large earthquake. The ShakeMap system developed by the USGS provides near-real-time maps of instrumental ground motion and shaking intensity, and has proven effective in rapid assessment of the extent of shaking and potential damage after significant earthquakes (Wald, 2004). In Northern and Southern California, the Pacific Northwest, and the states of Utah and Nevada, ShakeMaps are used for emergency planning and response, loss

  17. A strain behavior before and after the 2009 Suruga-Bay earthquake (M6.5) in Tokai, Japan

    NASA Astrophysics Data System (ADS)

    Takanami, T.; Hirata, N.; Kitagawa, G.; Kamigaichi, O.; Linde, A. T.; Sacks, S. I.

    2012-12-01

    On 11 August 2009 an intraslab earthquake (M6.5) struck the Tokai area. The largest observed intensity was VI- on the JMA scale, and the earthquake was felt over a wide area including the Kanto and Koshin'etsu regions. Tsunamis were observed at and around Suruga Bay. In the Tokai area, the Japan Meteorological Agency (JMA) continuously monitors strain data by real-time automated processing in the Tokai network. According to JMA, the event is unconnected to the anticipated Tokai earthquake (M8) for sound reasons: for instance, it is an intraslab earthquake within the Philippine Sea plate, while the anticipated earthquake is a plate-boundary earthquake on the upper side of the Philippine Sea plate. We nevertheless consider it an appropriate event for validating the Tokai network, even though its character differs from that of the anticipated earthquake. We therefore investigated the strain behavior before and after the 2009 Suruga Bay earthquake, which occurred within the fault zone of the anticipated Tokai earthquake. The Tokai network of strainmeters has in fact been monitoring short-term slow slip events (SSE) synchronized with nearby low-frequency earthquakes or tremors since 2005 (Kobayashi et al., 2006). However, the Earth's surface is always under the continuous influence of a variety of natural forces such as earthquakes, waves, wind, tides, air pressure, and precipitation, as well as a variety of human-induced sources, which create noise when monitoring geodetic strain. Eliminating these noise inputs from the raw strain data requires proper statistical modeling for automatic processing of geodetic strain data. It is desirable to apply the state space method to the noisy Tokai strain data in order to detect precursors of the anticipated Tokai earthquake. The method is based on the general state space model and recursive filtering and smoothing algorithms (Kitagawa and Matsumoto, 1996). 
The first attempt to apply this method to actual strain data was made using

  18. Engineering models for catastrophe risk and their application to insurance

    NASA Astrophysics Data System (ADS)

    Dong, Weimin

    2002-06-01

    Internationally, earthquake insurance, like other lines of insurance (fire, auto), has historically used an actuarial approach: premium rates are determined from historical loss experience. Because earthquakes are rare events with severe consequences, poorly grounded premium rates and a limited understanding of the scale of potential losses left many insurance companies insolvent after the Northridge earthquake in 1994. With recent advances in earth science, computer science, and engineering, computerized loss estimation methodologies based on first principles have been developed to the point that losses from destructive earthquakes can be quantified with reasonable accuracy using scientific modeling techniques. This paper introduces how engineering models can help quantify earthquake risk and how the insurance industry can use this information to manage its risk in the United States and abroad.

  19. Data Delivery Latency Improvements And First Steps Towards The Distributed Computing Of The Caltech/USGS Southern California Seismic Network Earthquake Early Warning System

    NASA Astrophysics Data System (ADS)

    Stubailo, I.; Watkins, M.; Devora, A.; Bhadha, R. J.; Hauksson, E.; Thomas, V. I.

    2016-12-01

    The USGS/Caltech Southern California Seismic Network (SCSN) is a modern digital ground-motion seismic network. It develops and maintains Earthquake Early Warning (EEW) data collection and delivery systems in southern California, as well as real-time EEW algorithms. Recently, Behr et al. (SRL, 2016) analyzed data from several regional seismic networks deployed around the globe and showed that the SCSN was the network with the smallest data communication delays, or latency. Since then, we have further reduced the telemetry delays for many of the 330 current sites. The latency has been reduced on average from 2-6 sec to 0.4 sec by tuning datalogger parameters and/or deploying software upgrades. Recognizing latency as one of the crucial parameters in EEW, we have started archiving the per-packet latencies in mseed format for all participating sites, much as is traditionally done for the seismic waveform data. The archived latency values enable us to understand and document long-term changes in the performance of the telemetry links. We can also retroactively investigate how latent the waveform data were during a specific event or a specific time period. In addition, the near-real-time latency values are useful for monitoring and displaying real-time station latency, in particular to compare different telemetry technologies. A further step to reduce latency is to deploy the algorithms on the dataloggers at the seismic stations and transmit either the final solutions or intermediate parameters to a central processing center. To implement this approach, we are developing a stand-alone version of the OnSite algorithm to run on the dataloggers in the field. This will increase the resiliency of the SCSN to potential telemetry restrictions in the immediate aftermath of a large earthquake, either by allowing local alarming by a single station or by permitting transmission of lightweight parametric information rather than continuous

  20. The Key Role of Eyewitnesses in Rapid Impact Assessment of Global Earthquake

    NASA Astrophysics Data System (ADS)

    Bossu, R.; Steed, R.; Mazet-Roux, G.; Roussel, F.; Etivant, C.; Frobert, L.; Godey, S.

    2014-12-01

    Uncertainties in rapid impact assessments of global earthquakes are intrinsically large because they rely on three main elements (ground motion prediction models, building stock inventory, and the related vulnerability) whose values and/or spatial variations are poorly constrained. Furthermore, variations of hypocentral location and magnitude within their respective uncertainty domains can lead to significantly different shaking levels for centers of population and change the scope of the disaster. We present the strategy and methods implemented at the Euro-Med Seismological Centre (EMSC) to rapidly collect in-situ observations of earthquake effects from eyewitnesses in order to reduce the uncertainties of rapid earthquake impact assessment. It comprises crowdsourced information (online questionnaires, pictures) as well as information derived from real-time analysis of web traffic (the flashsourcing technique) and, more recently, deployment of QCN (Quake Catcher Network) low-cost sensors. We underline the importance of merging the results of different methods to improve the performance and reliability of the collected data. We try to better understand and respond to public demands and expectations after earthquakes through improved information services and a diversification of information tools (social networks, smartphone apps, browser add-ons…), which, in turn, drive more eyewitnesses to our services and improve data collection. We will notably present our LastQuake Twitter feed (Quakebot) and smartphone applications (iOS and Android), which report only earthquakes that matter to the public and authorities, i.e. felt and damaging earthquakes identified thanks to citizen-generated information.

  1. Earthquake Preparedness and Education: A Collective Impact Approach to Improving Awareness and Resiliency

    NASA Astrophysics Data System (ADS)

    Benthien, M. L.; Wood, M. M.; Ballmann, J. E.; DeGroot, R. M.

    2017-12-01

    The Southern California Earthquake Center (SCEC), headquartered at the University of Southern California, is a collaboration of more than 1000 scientists and students from 70+ institutions. SCEC's Communication, Education, and Outreach (CEO) program translates earthquake science into products and activities in order to increase scientific literacy, develop a diverse scientific workforce, and reduce earthquake risk to life and property. SCEC CEO staff coordinate these efforts through partnership collaborations it has established to engage subject matter experts, reduce duplication of effort, and achieve greater results. Several of SCEC's collaborative networks began within Southern California and have since grown statewide (Earthquake Country Alliance, a public-private-grassroots partnership), national ("EPIcenter" Network of museums, parks, libraries, etc.), and international (Great ShakeOut Earthquake Drills with millions of participants each year). These networks have benefitted greatly from partnerships with national (FEMA), state, and local emergency managers. Other activities leverage SCEC's networks in new ways and with national earth science organizations, such as the EarthConnections Program (with IRIS, NAGT, and many others), Quake Catcher Network (with IRIS) and the GeoHazards Messaging Collaboratory (with IRIS, UNAVCO, and USGS). Each of these partnerships share a commitment to service, collaborative development, and the application of research (including social science theory for motivating preparedness behaviors). SCEC CEO is developing new evaluative structures and adapting the Collective Impact framework to better understand what has worked well or what can be improved, according to the framework's five key elements: create a common agenda; share common indicators and measurement; engage diverse stakeholders to coordinate mutually reinforcing activities; initiate continuous communication; and provide "backbone" support. 
This presentation will provide

  2. Using Long-Short-Term-Memory Recurrent Neural Networks to Predict Aviation Engine Vibrations

    NASA Astrophysics Data System (ADS)

    ElSaid, AbdElRahman Ahmed

    This thesis examines building viable Recurrent Neural Networks (RNN) using Long Short Term Memory (LSTM) neurons to predict aircraft engine vibrations. The different networks are trained on a large database of flight data records obtained from an airline, containing flights that suffered from excessive vibration. RNNs can provide a more generalizable and robust method for prediction over analytical calculations of engine vibration, as analytical calculations must be solved iteratively based on specific empirical engine parameters, and this database contains multiple types of engines. Further, LSTM RNNs provide a "memory" of the contribution of previous time series data, which can further improve predictions of future vibration values. LSTM RNNs were used over traditional RNNs, as the latter suffer from vanishing/exploding gradients when trained with back propagation. The study managed to predict vibration values for 1, 5, 10, and 20 seconds in the future, with 2.84%, 3.3%, 5.51% and 10.19% mean absolute error, respectively. These neural networks provide a promising means for the future development of warning systems, so that suitable actions can be taken before the occurrence of excess vibration to avoid unfavorable situations during flight.
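    As an illustration of the memory mechanism this abstract relies on, the following sketch implements a single LSTM cell forward step in plain NumPy. The hidden size, weight initialization, and random stand-in for a flight-data window are all hypothetical; a trained readout layer (not shown) would map the final hidden state to a predicted vibration value.

    ```python
    import numpy as np

    def lstm_step(x, h, c, W, U, b):
        """One forward step of a single LSTM cell (gates stacked as i, f, o, g)."""
        z = W @ x + U @ h + b
        H = h.size
        sig = lambda v: 1.0 / (1.0 + np.exp(-v))
        i, f, o = sig(z[:H]), sig(z[H:2 * H]), sig(z[2 * H:3 * H])
        g = np.tanh(z[3 * H:])
        c_new = f * c + i * g          # cell state carries the long-term "memory"
        h_new = o * np.tanh(c_new)     # hidden state is the per-step output
        return h_new, c_new

    rng = np.random.default_rng(0)
    H, D = 8, 3                        # hidden size, input size (hypothetical)
    W = rng.normal(0, 0.1, (4 * H, D))
    U = rng.normal(0, 0.1, (4 * H, H))
    b = np.zeros(4 * H)

    h, c = np.zeros(H), np.zeros(H)
    series = rng.normal(0, 1, (20, D))  # stand-in for a window of flight-data records
    for x in series:
        h, c = lstm_step(x, h, c, W, U, b)
    # a trained dense layer would map h to the vibration prediction
    ```

    Because the cell state is updated multiplicatively through the forget gate rather than through repeated weight multiplication, gradients degrade far more slowly than in a plain RNN, which is the vanishing/exploding-gradient point made above.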

  3. Earthquake detection through computationally efficient similarity search

    PubMed Central

    Yoon, Clara E.; O’Reilly, Ossian; Bergen, Karianne J.; Beroza, Gregory C.

    2015-01-01

    Seismology is experiencing rapid growth in the quantity of data, which has outpaced the development of processing algorithms. Earthquake detection—identification of seismic events in continuous data—is a fundamental operation for observational seismology. We developed an efficient method to detect earthquakes using waveform similarity that overcomes the disadvantages of existing detection methods. Our method, called Fingerprint And Similarity Thresholding (FAST), can analyze a week of continuous seismic waveform data in less than 2 hours, or 140 times faster than autocorrelation. FAST adapts a data mining algorithm, originally designed to identify similar audio clips within large databases; it first creates compact “fingerprints” of waveforms by extracting key discriminative features, then groups similar fingerprints together within a database to facilitate fast, scalable search for similar fingerprint pairs, and finally generates a list of earthquake detections. FAST detected most (21 of 24) cataloged earthquakes and 68 uncataloged earthquakes in 1 week of continuous data from a station located near the Calaveras Fault in central California, achieving detection performance comparable to that of autocorrelation, with some additional false detections. FAST is expected to realize its full potential when applied to extremely long duration data sets over a distributed network of seismic stations. The widespread application of FAST has the potential to aid in the discovery of unexpected seismic signals, improve seismic monitoring, and promote a greater understanding of a variety of earthquake processes. PMID:26665176
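    The fingerprint-and-group idea behind FAST can be sketched as follows. The energy-difference fingerprint, window sizes, and synthetic trace below are illustrative stand-ins for FAST's actual spectral fingerprints and similarity-search database; the repeated event is inserted as an exact copy so the sketch stays deterministic.

    ```python
    import numpy as np

    def fingerprint(window, n_feat=16):
        """Compact binary fingerprint: signs of adjacent segment-energy differences."""
        segments = np.array_split(window, n_feat)
        energies = np.array([float(np.dot(s, s)) for s in segments])
        return tuple((np.diff(energies) > 0).astype(int))  # amplitude-free shape code

    rng = np.random.default_rng(42)
    wlen = 200
    trace = rng.normal(0.0, 0.05, 3000)
    event = np.sin(2 * np.pi * 5 * np.arange(wlen) / 100.0) * np.exp(-np.arange(wlen) / 60.0)
    trace[500:700] = event      # the "earthquake", repeated at two times;
    trace[2200:2400] = event    # exact copies keep the demonstration deterministic

    db = {}                     # fingerprint -> list of window start samples
    for start in range(0, len(trace) - wlen + 1, 100):
        db.setdefault(fingerprint(trace[start:start + wlen]), []).append(start)

    repeats = [starts for starts in db.values() if len(starts) > 1]
    ```

    Grouping identical fingerprints in a hash table is what makes the search scale: candidate pairs are found by dictionary lookup rather than by cross-correlating every window against every other, mirroring FAST's advantage over autocorrelation.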

  4. Seismomagnetic effects from the long-awaited 28 September 2004 M 6.0 Parkfield earthquake

    USGS Publications Warehouse

    Johnston, M.J.S.; Sasai, Y.; Egbert, G.D.; Mueller, R.J.

    2006-01-01

    Precise measurements of local magnetic fields have been obtained with a differentially connected array of seven synchronized proton magnetometers located along 60 km of the locked-to-creeping transition region of the San Andreas fault at Parkfield, California, since 1976. The M 6.0 Parkfield earthquake on 28 September 2004 occurred within this array and generated coseismic magnetic field changes of between 0.2 and 0.5 nT at five sites in the network. No preseismic magnetic field changes exceeding background noise levels are apparent in the magnetic data during the month, week, and days before the earthquake (or expected in light of the absence of measurable precursory deformation, seismicity, or pore pressure changes). Observations of electric and magnetic fields from 0.01 to 20 Hz are also made at one site near the end of the earthquake rupture and corrected for common-mode signals from the ionosphere/magnetosphere using a second site some 115 km to the northwest along the fault. These magnetic data show no indications of unusual noise before the earthquake in the ULF band (0.01-20 Hz), such as was suggested to have preceded the 1989 ML 7.1 Loma Prieta earthquake. Nor do we see electric field changes similar to those suggested, from data in Greece, to occur before earthquakes of this magnitude. Uniform and variable slip piezomagnetic models of the earthquake, derived from strain, displacement, and seismic data, generate magnetic field perturbations that are consistent with those observed by the magnetometer array. A higher rate of longer-term magnetic field change, consistent with increased loading in the region, is apparent since 1993. This accompanied an increased rate of secular shear strain observed on a two-color EDM network and a small network of borehole tensor strainmeters, and increased seismicity dominated by three M 4.5-5 earthquakes roughly a year apart in 1992, 1993, and 1994. 
Models incorporating all of these data indicate increased slip at depth in the region

  5. Evaluating the Real-time and Offline Performance of the Virtual Seismologist Earthquake Early Warning Algorithm

    NASA Astrophysics Data System (ADS)

    Cua, G.; Fischer, M.; Heaton, T.; Wiemer, S.

    2009-04-01

    The Virtual Seismologist (VS) algorithm is a Bayesian approach to regional, network-based earthquake early warning (EEW). Bayes' theorem as applied in the VS algorithm states that the most probable source estimate at any given time is a combination of contributions from relatively static prior information that does not change over the timescale of earthquake rupture and a likelihood function that evolves with time to take into account incoming pick and amplitude observations from the ongoing earthquake. Potentially useful types of prior information include network topology or station health status, regional hazard maps, earthquake forecasts, and the Gutenberg-Richter magnitude-frequency relationship. The VS codes provide magnitude and location estimates once picks are available at 4 stations; these source estimates are subsequently updated each second. The algorithm predicts the geographical distribution of peak ground acceleration and velocity using the estimated magnitude and location and appropriate ground motion prediction equations; the peak ground motion estimates are also updated each second. Implementation of the VS algorithm in California and Switzerland is funded by the Seismic Early Warning for Europe (SAFER) project. The VS method is one of three EEW algorithms whose real-time performance is being evaluated and tested by the California Integrated Seismic Network (CISN) EEW project. A crucial component of operational EEW algorithms is the ability to distinguish between noise and earthquake-related signals in real time. We discuss various empirical approaches that allow the VS algorithm to operate in the presence of noise. Real-time operation of the VS codes at the Southern California Seismic Network (SCSN) began in July 2008. On average, the VS algorithm provides initial magnitude, location, origin time, and ground motion distribution estimates within 17 seconds of the earthquake origin time. 
These initial estimate times are dominated by the time for 4
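    The Bayesian combination of a static prior with an evolving likelihood can be illustrated on a discretized magnitude axis. The Gutenberg-Richter prior follows the abstract; the amplitude-magnitude scaling and its scatter below are hypothetical placeholders for a real ground motion prediction equation.

    ```python
    import numpy as np

    # discretized magnitude axis
    m = np.arange(3.0, 8.0, 0.01)

    # prior: Gutenberg-Richter frequency-magnitude law, p(m) ~ 10**(-b*m), b ~ 1
    b_value = 1.0
    prior = 10.0 ** (-b_value * m)
    prior /= prior.sum()

    # likelihood: one observed log peak amplitude, modeled (illustratively) as
    # Gaussian around a hypothetical linear scaling log A = m - 2, scatter 0.3
    log_amp_obs = 3.5
    sigma = 0.3
    likelihood = np.exp(-0.5 * ((log_amp_obs - (m - 2.0)) / sigma) ** 2)

    # posterior combines both; with each new pick/amplitude the likelihood term
    # is updated and the posterior re-normalized, as in the per-second VS updates
    posterior = prior * likelihood
    posterior /= posterior.sum()
    m_est = (m * posterior).sum()   # posterior-mean magnitude estimate
    ```

    Note how the posterior mean sits below the amplitude-implied magnitude of 5.5: the Gutenberg-Richter prior pulls the estimate toward smaller events, which is exactly the role of static prior information in the algorithm described above.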

  6. The 2004 Parkfield, CA Earthquake: A Teachable Moment for Exploring Earthquake Processes, Probability, and Earthquake Prediction

    NASA Astrophysics Data System (ADS)

    Kafka, A.; Barnett, M.; Ebel, J.; Bellegarde, H.; Campbell, L.

    2004-12-01

    The occurrence of the 2004 Parkfield earthquake provided a unique "teachable moment" for students in our science course for teacher education majors. The course uses seismology as a medium for teaching a wide variety of science topics appropriate for future teachers. The 2004 Parkfield earthquake occurred just 15 minutes after our students completed a lab on earthquake processes and earthquake prediction. That lab included a discussion of the Parkfield Earthquake Prediction Experiment as a motivation for the exercises they were working on that day. Furthermore, this earthquake was recorded on an AS1 seismograph right in their lab, just minutes after the students left. About an hour after we recorded the earthquake, the students were able to see their own seismogram of the event in the lecture part of the course, which provided an excellent teachable moment for a lecture/discussion on how the occurrence of the 2004 Parkfield earthquake might affect seismologists' ideas about earthquake prediction. The specific lab exercise that the students were working on just before we recorded this earthquake was a "sliding block" experiment that simulates earthquakes in the classroom. The experimental apparatus includes a flat board on top of which are blocks of wood attached to a bungee cord and a string wrapped around a hand crank. Plate motion is modeled by slowly turning the crank, and earthquakes are modeled as events in which the block slips ("blockquakes"). We scaled the earthquake data and the blockquake data (using how much the string moved as a proxy for time) so that we could compare blockquakes and earthquakes. This provided an opportunity to use interevent-time histograms to teach about earthquake processes, probability, and earthquake prediction, and to compare earthquake sequences with blockquake sequences. We were able to show the students, using data obtained directly from their own lab, how global earthquake data fit a Poisson exponential distribution better
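    The interevent-time comparison the lab builds on can be reproduced numerically. The event rate below is hypothetical; the histogram simply exhibits the constant-factor decay per bin that characterizes a Poisson process with exponentially distributed interevent times.

    ```python
    import random

    random.seed(1)
    rate = 0.5                    # hypothetical event rate (events per day)
    n = 20000

    # interevent times of a Poisson process are exponentially distributed
    interevent = [random.expovariate(rate) for _ in range(n)]
    mean_dt = sum(interevent) / n  # should approach 1/rate = 2 days

    # interevent-time histogram: an exponential falls by a constant factor per bin
    bins = [0] * 10
    width = 1.0                    # bin width in days
    for dt in interevent:
        k = int(dt / width)
        if k < len(bins):
            bins[k] += 1
    ```

    Plotting `bins` gives the monotonically decaying interevent-time histogram that the students compared against their blockquake sequences.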

  7. A global building inventory for earthquake loss estimation and risk management

    USGS Publications Warehouse

    Jaiswal, K.; Wald, D.; Porter, K.

    2010-01-01

    We develop a global database of building inventories using taxonomy of global building types for use in near-real-time post-earthquake loss estimation and pre-earthquake risk analysis, for the U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER) program. The database is available for public use, subject to peer review, scrutiny, and open enhancement. On a country-by-country level, it contains estimates of the distribution of building types categorized by material, lateral force resisting system, and occupancy type (residential or nonresidential, urban or rural). The database draws on and harmonizes numerous sources: (1) UN statistics, (2) UN Habitat's demographic and health survey (DHS) database, (3) national housing censuses, (4) the World Housing Encyclopedia and (5) other literature. ?? 2010, Earthquake Engineering Research Institute.

  8. a Collaborative Cyberinfrastructure for Earthquake Seismology

    NASA Astrophysics Data System (ADS)

    Bossu, R.; Roussel, F.; Mazet-Roux, G.; Lefebvre, S.; Steed, R.

    2013-12-01

    One of the challenges in real-time seismology is the prediction of an earthquake's impact. This is particularly true for moderate earthquakes (around magnitude 6) located close to urbanised areas, where the slightest uncertainty in event location, depth, or magnitude estimates, and/or misevaluation of propagation characteristics, site effects, and building vulnerability, can dramatically change the impact scenario. The Euro-Med Seismological Centre (EMSC) has developed a cyberinfrastructure to collect observations from eyewitnesses in order to provide in-situ constraints on actual damage. This cyberinfrastructure takes advantage of the natural convergence of earthquake eyewitnesses on the EMSC website (www.emsc-csem.org), the second-ranked global earthquake information website, within tens of seconds of the occurrence of a felt event. It includes classical crowdsourcing tools such as online questionnaires available in 39 languages, and tools to collect geolocated pictures. It also comprises information derived from real-time analysis of the traffic on the EMSC website, a method named flashsourcing: in the case of a felt earthquake, eyewitnesses reach the EMSC website within tens of seconds to find out the cause of the shaking they have just experienced. By analysing their geographical origin through their IP addresses, we automatically detect felt earthquakes and in some cases map the damaged areas through the loss of Internet visitors. We recently implemented a Quake Catcher Network (QCN) server in collaboration with Stanford University and the USGS, to collect ground motion records made by volunteers, and are also involved in a project to detect earthquakes from ground motion sensors in smartphones. Strategies have been developed for several social media platforms (Facebook, Twitter...) not only to distribute earthquake information, but also to engage with citizens and optimise data collection. A smartphone application is currently under development. We will present an overview of this

  9. Impact of Earthquake Preparation Process On Hydrodeformation Field Evolution In The Caucasus

    NASA Astrophysics Data System (ADS)

    Melikadze, G.; Aliev, A.; Bendukidze, G.; Biagi, P. F.; Garalov, B.; Mirianashvili, V.

    The paper studies the relation between geodeformation regime variations of underground water observed in boreholes and deformation processes in the Earth's crust associated with the formation of earthquakes with M=3 and higher. Monitoring of the hydrogeodeformation field (HGDF) has been carried out thanks to the on-purpose general network of Armenia, Azerbaijan, Georgia and Russia. The wells are uniformly distributed throughout the Caucasus and cover all principal geological blocks of the region. The paper deals with results associated with several earthquakes that occurred in Georgia and one in Azerbaijan. As the network comprises boreholes of different depths, varying from 250 m down to 3,500 m, preliminary calibration of the boreholes involved was carried out, based on evaluation of the water level variation due to the known Earth tide effect. This was necessary for sensitivity evaluation and normalization of hydrodynamic signals. The obtained data have been processed by means of spectral analysis to separate the background field of disturbances from the valid signal. The processed data covered the period of 1991-1993, comprising the following 4 strong earthquakes of the Caucasus, namely in: Racha (1991, M=6.9), Java (1991, M=6.2), Barisakho (1992, M=6.5) and Talish (1993, M=5.6). Formation of a compression zone in the east Caucasus and an extension zone in western Georgia and the north Caucasus was observed 7 months prior to the Racha quake. The boundary between the above 2 zones passed along the known submeridional fault. The area where the maximal gradient was observed coincided with the joint of deep faults and appeared to be the place of origination of the earthquake. After the quake occurred, the zone of maximal gradient started to migrate towards the east, and residual deformations in the HGDF outlined the source first of the Java quake (on 15.06.1991), then that of the Barisakho (on 23.10.1992) and Talish (on 2.10.1993) ones. 
Thus, HGDF indicated migration of the deformation field along the slope of

  10. Local Deformation Precursors of Large Earthquakes Derived from GNSS Observation Data

    NASA Astrophysics Data System (ADS)

    Kaftan, Vladimir; Melnikov, Andrey

    2017-12-01

    Research on deformation precursors of earthquakes attracted considerable interest from the middle to the end of the previous century. Repeated conventional geodetic measurements, such as precise levelling and linear-angular networks, were used for such studies. Many examples of studies of strong seismic events using conventional geodetic techniques are presented in [T. Rikitake, 1976]. One of the first case studies of geodetic earthquake precursors was done by Yu.A. Meshcheryakov [1968]. Rare repetitions and insufficient densities and coverage of control geodetic networks made it difficult to predict the places and times of future earthquakes. The intensive development of Global Navigation Satellite Systems (GNSS) during recent decades has made this research more effective. The results of GNSS observations in the areas of three large earthquakes (Napa M6.1, USA, 2014; El Mayor Cucapah M7.2, USA, 2010; and Parkfield M6.0, USA, 2004) are treated and presented in the paper. The characteristics of land surface deformation before, during, and after the earthquakes have been obtained. The results prove the presence of anomalous deformations near their epicentres. The temporal character of dilatation and shear strain changes shows spatial heterogeneity of deformation of the Earth's surface from months to years before the main shock, both close to it and at some distance from it. The revealed heterogeneities can be considered as deformation precursors of strong earthquakes. Based on historical data and the present research, values of critical deformation were obtained with reference to the mentioned large earthquakes; these are proposed for use in creating a seismic danger scale based on continuous GNSS observations. It is shown that the approach has restrictions owing to uncertainty in the moment at which deformation accumulation begins and in the expected location of the next seismic event. Verification and clarification of the derived conclusions are proposed.

  11. Products and Services Available from the Southern California Earthquake Data Center (SCEDC) and the Southern California Seismic Network (SCSN)

    NASA Astrophysics Data System (ADS)

    Yu, E.; Chen, S.; Chowdhury, F.; Bhaskaran, A.; Hutton, K.; Given, D.; Hauksson, E.; Clayton, R. W.

    2009-12-01

    The SCEDC archives continuous and triggered data from nearly 3000 data channels from 375 SCSN recorded stations. The SCSN and SCEDC process and archive an average of 12,000 earthquakes each year, contributing to the southern California earthquake catalog that spans from 1932 to present. The SCEDC provides public, searchable access to these earthquake parametric and waveform data through its website www.data.scec.org and through client applications such as STP, NETDC and DHI.
    New data products:
    ● The SCEDC is distributing synthetic waveform data from the 2008 ShakeOut scenario (Jones et al., USGS Open File Rep., 2008-1150; Graves et al. 2008, Geophys. Res. Lett.), a M 7.8 earthquake on the southern San Andreas fault. Users can download 40 sps velocity waveforms in SAC format from the SCEDC website. The SCEDC is also distributing synthetic GPS data (Crowell et al., 2009, Seismol. Res. Lett.) for this scenario as well.
    ● The SCEDC has added a new web page to show the latest tomographic model of Southern California. This model is based on Tape et al., 2009, Science.
    New data services:
    ● The SCEDC is exporting data in QuakeML format, an XML format that has been adopted by the Advanced National Seismic System (ANSS). These data will also be available as a web service.
    ● The SCEDC is exporting data in StationXML format, an XML format created by the SCEDC and adopted by ANSS to fully describe station metadata. These data will also be available as a web service.
    ● The stp 1.6 client can now access both the SCEDC and the Northern California Earthquake Data Center (NCEDC) earthquake and waveform archives.
    In progress - SCEDC to distribute 1 sps GPS data in miniSEED format:
    ● As part of a NASA Advanced Information Systems Technology project in collaboration with the Jet Propulsion Laboratory and Scripps Institution of Oceanography, the SCEDC will receive real time 1 sps streams of GPS displacement solutions from the California
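    The XML packaging idea behind event formats such as QuakeML can be sketched with the standard library. This is emphatically not a schema-valid QuakeML document (real QuakeML requires namespaces, publicIDs, and many more elements); it is only an illustration of structuring event parameters as XML, with hypothetical values.

    ```python
    import xml.etree.ElementTree as ET

    # schematic event record in the spirit of QuakeML (values are hypothetical)
    event = ET.Element("event")
    origin = ET.SubElement(event, "origin")
    ET.SubElement(origin, "time").text = "2008-11-13T18:00:00Z"
    ET.SubElement(origin, "latitude").text = "34.00"
    ET.SubElement(origin, "longitude").text = "-118.00"
    magnitude = ET.SubElement(event, "magnitude")
    ET.SubElement(magnitude, "mag").text = "7.8"

    xml_text = ET.tostring(event, encoding="unicode")
    ```

    Serving such structured records over HTTP is what "available as a web service" amounts to: clients parse the XML rather than scraping fixed-width text.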

  12. Creating global networks through an online engineering graduate programme

    NASA Astrophysics Data System (ADS)

    Murray, M. H.

    2011-03-01

    Internationally, the railway industry is facing a severe shortage of engineers with high-level, relevant, professional and technical knowledge and abilities, in particular amongst engineers involved in the design, construction and maintenance of railway infrastructure. A unique graduate level programme has been created to meet that global need via a fully online, distance education format. The development and operation of this Master of Engineering degree is proposed as a model of the process needed for industry-relevance, flexible delivery, international networking and professional development required for a successful graduate engineering programme in the twenty-first century. In particular, this paper demonstrates how a mix of new and more familiar technologies are utilised through a variety of tasks to overcome the huge distances and multiple time zones that separate the participants across a growing number of countries, successfully achieving close and sustained interaction amongst the participants and railway experts.

  13. Charles Darwin's earthquake reports

    NASA Astrophysics Data System (ADS)

    Galiev, Shamil

    2010-05-01

    …problems which have begun to be discussed only recently. Earthquakes often precede volcanic eruptions. According to Darwin, the earthquake-induced shock may be a common mechanism of the simultaneous eruptions of volcanoes separated by long distances. In particular, Darwin wrote that ‘… the elevation of many hundred square miles of territory near Concepcion is part of the same phenomenon, with that splashing up, if I may so call it, of volcanic matter through the orifices in the Cordillera at the moment of the shock;…'. According to Darwin, the crust is a system in which fractured zones and zones of seismic and volcanic activity interact. Darwin formulated the task of considering together the processes now studied as seismology and volcanology. However, the difficulties are such that the study of interactions between earthquakes and volcanoes began only recently, and his works on this had relatively little impact on the development of the geosciences. In this report, we discuss how the latest data on seismic and volcanic events support Darwin's observations and ideas about the 1835 Chilean earthquake. The material from researchspace.auckland.ac.nz/handle/2292/4474 is used. We show how modern mechanical tests from impact engineering and simple experiments with weakly-cohesive materials also support his observations and ideas. On the other hand, we have developed a mathematical theory of earthquake-induced catastrophic wave phenomena. This theory allows us to explain the most important aspects of Darwin's earthquake reports. This is achieved through the simplification of the fundamental governing equations of the problems considered into strongly-nonlinear wave equations. Solutions of these equations are constructed with the help of analytic and numerical techniques. The solutions can model different strongly-nonlinear wave phenomena that arise in a variety of physical contexts. A comparison with relevant experimental observations is also presented.

  14. Improved Detection of Local Earthquakes in the Vienna Basin (Austria), using Subspace Detectors

    NASA Astrophysics Data System (ADS)

    Apoloner, Maria-Theresia; Caffagni, Enrico; Bokelmann, Götz

    2016-04-01

    The Vienna Basin in Eastern Austria is densely populated and highly developed; it is also a region of low to moderate seismicity, yet the seismological network coverage is relatively sparse. This demands that we improve our earthquake detection capability by testing new methods, enlarging the existing local earthquake catalogue. This contributes to imaging tectonic fault zones for a better understanding of seismic hazard, also through improved earthquake statistics (b-value, magnitude of completeness). Detection of low-magnitude earthquakes, or of events whose highest amplitudes only slightly exceed the noise level (low signal-to-noise ratio, SNR), may be possible using standard methods like the short-term over long-term average (STA/LTA). However, due to sparse network coverage and high background noise, such a technique may not detect all potentially recoverable events. Yet earthquakes originating from the same source region, relatively close to each other, should be characterized by similar seismic waveforms at a given station. This waveform similarity can be exploited by specific techniques such as correlation-template methods (also known as matched filtering) or subspace detection methods (based on subspace theory). Matching techniques basically require a reference or template event, usually characterized by high waveform coherence across the array receivers and high SNR, which is cross-correlated with the continuous data. Subspace detection methods, in contrast, overcome in principle the need to define single template events, instead using a subspace extracted from multiple events. This approach should theoretically be more robust in detecting signals that exhibit strong variability (e.g. in source or magnitude). In this study we scan the continuous data recorded in the Vienna Basin with a subspace detector to identify additional events. This will allow us to estimate the increase of the seismicity rate in the local earthquake catalogue
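    The STA/LTA baseline mentioned above can be sketched in a few lines. Window lengths, the trigger threshold, and the synthetic transient below are illustrative choices, not values from the study.

    ```python
    import numpy as np

    def sta_lta(trace, n_sta, n_lta):
        """Classic STA/LTA ratio computed on squared amplitudes (signal energy)."""
        energy = trace ** 2
        csum = np.concatenate(([0.0], np.cumsum(energy)))
        sta = (csum[n_sta:] - csum[:-n_sta]) / n_sta   # short-term average
        lta = (csum[n_lta:] - csum[:-n_lta]) / n_lta   # long-term average
        m = min(len(sta), len(lta))                    # align window end times
        return sta[-m:] / (lta[-m:] + 1e-12)

    rng = np.random.default_rng(7)
    trace = rng.normal(0.0, 1.0, 5000)
    trace[3000:3100] += 20.0 * np.sin(np.arange(100) * 0.3)  # transient "event"

    ratio = sta_lta(trace, n_sta=50, n_lta=500)
    trigger_on = 5.0
    detections = np.where(ratio > trigger_on)[0]   # indices where trigger declares
    ```

    The weakness described in the abstract is visible in this construction: the trigger fires only while the short-term energy clearly exceeds the long-term background, so events barely above the noise never cross the threshold, which is what similarity-based detectors are meant to recover.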

  15. MyShake: Building a smartphone seismic network

    NASA Astrophysics Data System (ADS)

    Kong, Q.; Allen, R. M.; Schreier, L.

    2014-12-01

    We are in the process of building a smartphone seismic network. In order to build this network, we performed shake table tests to evaluate the performance of smartphones as seismic recording instruments. We also conducted noise floor tests to find the minimum earthquake signal we can record using smartphones. We added phone noise to strong motion data from past earthquakes, and used these as an analogue dataset to test algorithms and to understand the differences between using the smartphone network and a traditional seismic network. We also built a prototype system to trigger the smartphones from our server to record signals, which can be sent back to the server in near real time. The phones can also be triggered by our algorithm running locally on the phone; if an earthquake occurs and triggers the phones, the recorded signal will be sent back to the server. We expect to turn the prototype system into a real smartphone seismic network that works as a supplement to existing traditional seismic networks.

  16. Community Seismic Network (CSN)

    NASA Astrophysics Data System (ADS)

    Clayton, R. W.; Heaton, T. H.; Kohler, M. D.; Cheng, M.; Guy, R.; Chandy, M.; Krause, A.; Bunn, J.; Olson, M.; Faulkner, M.

    2011-12-01

    The CSN is a network of low-cost accelerometers deployed in the Pasadena, CA region. It is a prototype network with the goal of demonstrating the importance of dense measurements in determining the rapid lateral variations in ground motion due to earthquakes. The main product of the CSN is a map of peak ground motion, produced within seconds of significant local earthquakes, that can be used as a proxy for damage. Examples of this are shown using data from a temporary network in Long Beach, CA. Dense measurements in buildings are also being used to determine the state of health of structures. In addition to fixed sensors, portable sensors such as smart phones are also used in the network. The CSN has necessitated several changes in the standard design of a seismic network. The first is that data collection and processing are done in the "cloud" (Google cloud in this case) for robustness and the ability to handle large impulsive loads (earthquakes). Second, the database is highly de-normalized (i.e. station locations are part of waveform and event-detection metadata) because of the mobile nature of the sensors. Third, since the sensors are hosted and/or owned by individuals, the privacy of the data is very important. The locations of fixed sensors are displayed on maps as sensor counts in block-wide cells, and mobile sensors are shown in a similar way, with the additional requirement, to inhibit tracking, that at least two must be present in a particular cell before any are shown. The raw waveform data are only released to users outside of the network after a felt earthquake.
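    The block-cell privacy rule described above can be sketched directly; the cell size, threshold, and coordinates below are hypothetical.

    ```python
    import math
    from collections import Counter

    def visible_cells(sensor_coords, cell_size=0.001, min_count=2):
        """Aggregate sensor locations into grid cells and hide sparse cells,
        mirroring the rule that at least two sensors must share a cell."""
        counts = Counter(
            (math.floor(lat / cell_size), math.floor(lon / cell_size))
            for lat, lon in sensor_coords
        )
        return {cell: n for cell, n in counts.items() if n >= min_count}

    # hypothetical sensor coordinates (degrees); the first two share a cell
    sensors = [(34.1471, -118.1442), (34.1473, -118.1449), (34.1502, -118.1501)]
    shown = visible_cells(sensors)   # the lone third sensor is suppressed
    ```

    Publishing only cell counts above a minimum occupancy is a simple k-anonymity-style aggregation: no displayed cell can be traced back to a single host.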

  17. The Self-Organising Seismic Early Warning Information Network

    NASA Astrophysics Data System (ADS)

    Kühnlenz, F.; Eveslage, I.; Fischer, J.; Fleming, K. M.; Lichtblau, B.; Milkereit, C.; Picozzi, M.

    2009-12-01

    … the job in a shorter time and with less manpower compared to using common seismic stations, as we saw during the L'Aquila earthquake, where SOSEWIN was used to monitor damaged buildings. We present here the graphical front-end of SOSEWIN and its usage in different scenarios. It belongs to a management infrastructure based on GIS and database technologies, so coupling with existing infrastructures should be simplified. Connecting the domain expert's laptop running the management software to a SOSEWIN may be accomplished via any arbitrary node in the network (on-site access) or via a gateway node from a remote location using the internet. The scenarios focus on the needs of certain domain experts (seismologists or engineers) and include the planning of a network installation, support during the installation process, and testing of the installation. Another scenario covers monitoring aspects of an already installed SOSEWIN, and a final scenario deals with the visualization of the alarm protocol for detecting an earthquake event and issuing an early warning.

  18. Using Earthquake Analysis to Expand the Oklahoma Fault Database

    NASA Astrophysics Data System (ADS)

    Chang, J. C.; Evans, S. C.; Walter, J. I.

    2017-12-01

    The Oklahoma Geological Survey (OGS) is compiling a comprehensive Oklahoma Fault Database (OFD), which includes faults mapped in OGS publications, university thesis maps, and industry-contributed shapefiles. The OFD includes nearly 20,000 fault segments, but the work is far from complete. The OGS plans on incorporating other sources of data into the OFD, such as new faults from earthquake sequence analyses, geologic field mapping, active-source seismic surveys, and potential fields modeling. A comparison of Oklahoma seismicity and the OFD reveals that earthquakes in the state appear to nucleate on mostly unmapped or unknown faults. Here, we present faults derived from earthquake sequence analyses. From 2015 to present, there has been a five-fold increase in real-time seismic stations in Oklahoma, which has greatly expanded and densified the state's seismic network. The current seismic network not only improves our threshold for locating weaker earthquakes, but also allows us to better constrain focal plane solutions (FPS) from first motion analyses. Using nodal planes from the FPS, HypoDD relocation, and historic seismic data, we can elucidate these previously unmapped seismogenic faults. As the OFD is a primary resource for various scientific investigations, the inclusion of seismogenic faults improves further derivative studies, particularly with respect to seismic hazards. Our primary focus is on four areas of interest, which have had M5+ earthquakes in recent Oklahoma history: Pawnee (M5.8), Prague (M5.7), Fairview (M5.1), and Cushing (M5.0). Subsequent areas of interest will include seismically active data-rich areas, such as the central and north-central parts of the state.

  19. Earthquakes

    MedlinePlus

    An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a populated area, it may cause ...

  20. Quantifying capability of a local seismic network in terms of locations and focal mechanism solutions of weak earthquakes

    NASA Astrophysics Data System (ADS)

    Fojtíková, Lucia; Kristeková, Miriam; Málek, Jiří; Sokos, Efthimios; Csicsay, Kristián; Zahradník, Jiří

    2016-01-01

    The extension of a permanent seismic network is usually governed by a number of technical, economic, logistic, and other factors. A planned upgrade of the network can be justified by a theoretical assessment of the network's capability to reliably estimate key earthquake parameters (e.g., location and focal mechanism). Such an assessment is useful not only for scientific purposes but also as concrete support when seeking the funding needed for the upgrade and operation of the network. Moreover, the theoretical assessment can also identify configurations where no improvement can be achieved with additional stations, establishing a tradeoff between improvement and additional expense. This paper suggests a combination of suitable methods and applies them to the Little Carpathians local seismic network (Slovakia, Central Europe), which monitors an epicentral zone important from the point of view of seismic hazard. Three configurations of the network are considered: the 13 stations existing before 2011, 3 stations added in 2011, and 7 new planned stations. Theoretical errors of the relative location are estimated by a new method, specifically developed in this paper. The resolvability of focal mechanisms determined by waveform inversion is analyzed by a recent approach based on 6-D moment-tensor error ellipsoids. We consider potential seismic events situated anywhere in the studied region, thus enabling "mapping" of the expected errors. The results clearly demonstrate that the network extension remarkably decreases the errors, mainly in the planned 23-station configuration. The three-station extension already made in 2011 allowed for a few real-data examples. Free software made available by the authors enables similar applications to any other existing or planned network.
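    The location-error "mapping" described above can be approximated by a Monte Carlo experiment: perturb synthetic arrival-time picks with Gaussian noise and measure the scatter of grid-search epicenters. This is a minimal sketch of the idea only, not the paper's method; the station layout, uniform velocity, and pick-noise level are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)
v = 6.0                                                      # km/s, uniform velocity (assumed)
stations = np.array([[0, 0], [30, 5], [10, 25], [-15, 20]])  # station coordinates, km (assumed)
source = np.array([8.0, 12.0])                               # true epicenter, km

def travel_times(src):
    """P travel times from a trial epicenter to every station."""
    return np.linalg.norm(stations - src, axis=1) / v

# Precompute travel times on a trial-epicenter grid (0.5 km spacing).
gx, gy = np.meshgrid(np.linspace(-20, 40, 121), np.linspace(-10, 40, 101))
grid = np.stack([gx.ravel(), gy.ravel()], axis=1)
grid_tt = np.array([travel_times(g) for g in grid])

def locate(obs_tt):
    """Grid-search epicenter; demeaning removes the unknown origin time."""
    resid = (grid_tt - grid_tt.mean(axis=1, keepdims=True)) - (obs_tt - obs_tt.mean())
    return grid[np.argmin((resid ** 2).sum(axis=1))]

# 100 realizations with 0.05 s Gaussian pick noise.
locs = np.array([locate(travel_times(source) + rng.normal(0, 0.05, 4))
                 for _ in range(100)])
err_km = np.linalg.norm(locs - source, axis=1)
```

Repeating the experiment for sources across the whole region, and for each candidate station configuration, yields the kind of expected-error map the abstract refers to.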

  1. Machine Learning Seismic Wave Discrimination: Application to Earthquake Early Warning

    NASA Astrophysics Data System (ADS)

    Li, Zefeng; Meier, Men-Andrin; Hauksson, Egill; Zhan, Zhongwen; Andrews, Jennifer

    2018-05-01

    Performance of earthquake early warning systems suffers from false alerts caused by local impulsive noise from natural or anthropogenic sources. To mitigate this problem, we train a generative adversarial network (GAN) to learn the characteristics of first-arrival earthquake P waves, using 300,000 waveforms recorded in southern California and Japan. We apply the GAN critic as an automatic feature extractor and train a Random Forest classifier with about 700,000 earthquake and noise waveforms. We show that the discriminator can recognize 99.2% of the earthquake P waves and 98.4% of the noise signals. This state-of-the-art performance is expected to significantly reduce the number of false triggers from local impulsive noise. Our study demonstrates that GANs can discover a compact and effective representation of seismic waves, which has the potential for wide applications in seismology.
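    The pipeline (fixed feature extractor feeding a Random Forest discriminator) can be sketched as follows. Training a real GAN critic is out of scope here, so a hand-crafted statistic function stands in for the critic's activations, and the synthetic "P waves" and "impulsive noise" are purely illustrative:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

def critic_features(w):
    # Stand-in for the GAN critic's hidden-layer activations:
    # a few summary statistics of the waveform window.
    return np.array([w.max(), w.min(), w.std(), np.abs(np.diff(w)).mean()])

# Synthetic stand-ins: "P waves" as decaying sinusoids, "noise" as spikes.
n, length = 500, 400
t = np.arange(length)
quakes = [np.sin(0.2 * t + rng.uniform(0, 6)) * np.exp(-t / 200.0)
          + 0.05 * rng.standard_normal(length) for _ in range(n)]
noise = [np.where(t == rng.integers(length), rng.uniform(2, 5), 0.0)
         + 0.05 * rng.standard_normal(length) for _ in range(n)]

X = np.array([critic_features(w) for w in quakes + noise])
y = np.array([1] * n + [0] * n)  # 1 = earthquake P wave, 0 = impulsive noise

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X[::2], y[::2])              # train on half the examples
score = clf.score(X[1::2], y[1::2])  # classification accuracy on the rest
```

The design choice mirrors the abstract: the (frozen) critic supplies a compact representation, and a cheap classical classifier does the discrimination.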

  2. GPS detection of ionospheric perturbation before the 13 February 2001, El Salvador earthquake

    NASA Astrophysics Data System (ADS)

    Plotkin, V. V.

    A large earthquake of M6.6 occurred on 13 February 2001 at 14:22:05 UT in El Salvador. We detected an ionospheric perturbation before this earthquake using GPS data received from the CORS network. Systematic decreases of ionospheric total electron content (TEC) during the two days before the earthquake onset were observed at a set of stations near the earthquake location, and probably within a region of about 1000 km from the epicenter. This result is consistent with those of other investigators who studied these phenomena with several observational techniques. However, it is possible that such TEC changes are simultaneously accompanied by changes due to solar wind parameters and the Kp index.
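    Detecting a systematic TEC decrease of this kind amounts to flagging samples that fall well below a trailing baseline. A minimal sketch, with an invented 30-minute-sampled series and an assumed threshold of 2.5 sigma (not the author's actual procedure):

```python
import numpy as np

def tec_anomaly(tec, window=48, k=2.5):
    """Flag epochs where TEC drops more than k sigma below the
    trailing-window mean (30-min sampling -> window=48 is ~1 day)."""
    flags = np.zeros(len(tec), dtype=bool)
    for i in range(window, len(tec)):
        ref = tec[i - window:i]
        mu, sd = ref.mean(), ref.std()
        if sd > 0 and tec[i] < mu - k * sd:
            flags[i] = True
    return flags

# Synthetic series: quiet background, then a sustained depletion.
rng = np.random.default_rng(1)
tec = 30 + rng.normal(0, 0.5, 480)   # ~10 days of TEC units at 30-min sampling
tec[330:430] -= 6.0                  # two-day systematic decrease
flags = tec_anomaly(tec)
```

In practice solar and geomagnetic drivers (e.g., the Kp index mentioned above) would have to be checked before interpreting such flags as precursory.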

  3. Reliability analysis of C-130 turboprop engine components using artificial neural network

    NASA Astrophysics Data System (ADS)

    Qattan, Nizar A.

    In this study, we predict the failure rate of the Lockheed C-130 engine turbine. More than thirty years of local operational field data were used for failure rate prediction and validation. The Weibull regression model and artificial neural network models (feed-forward back-propagation, radial basis neural network, and multilayer perceptron) were utilized to perform this study. For this purpose, the thesis is divided into five major parts. The first part deals with the Weibull regression model to predict the turbine's general failure rate and the rate of failures that require overhaul maintenance. The second part covers the artificial neural network (ANN) model utilizing the feed-forward back-propagation algorithm as a learning rule. The MATLAB package was used to build and design a code to simulate the given data; the inputs to the neural network are the independent variables, and the outputs are the general failure rate of the turbine and the failures that required overhaul maintenance. In the third part we predict the general failure rate of the turbine and the failures that require overhaul maintenance using a radial basis neural network model in the MATLAB toolbox. In the fourth part we compare the predictions of the feed-forward back-propagation model with those of the Weibull regression model and the radial basis neural network model. The results show that the failure rate predicted by the feed-forward back-propagation artificial neural network model agrees more closely with the radial basis neural network model and the actual field data than the failure rate predicted by the Weibull model. Finally, we forecast the general failure rate of the Lockheed C-130 engine turbine, the failures that required overhaul maintenance, and six categorical failures using a multilayer perceptron (MLP) neural network model in the DTREG commercial software. The results also give insight into the reliability of the engine.
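    The Weibull-regression step above can be illustrated compactly: fit the shape and scale parameters by least squares on the linearized Weibull CDF, then evaluate the hazard (failure) rate. This is a generic textbook sketch on synthetic failure times, not the thesis's MATLAB code, and the sample parameters are invented:

```python
import numpy as np

def weibull_fit(times):
    """Estimate Weibull shape (beta) and scale (eta) from failure times
    by least squares on the linearized CDF:
        ln(-ln(1 - F)) = beta * ln(t) - beta * ln(eta)."""
    t = np.sort(np.asarray(times, float))
    n = len(t)
    F = (np.arange(1, n + 1) - 0.3) / (n + 0.4)   # median-rank plotting positions
    beta, c = np.polyfit(np.log(t), np.log(-np.log(1.0 - F)), 1)
    eta = np.exp(-c / beta)
    return beta, eta

def hazard_rate(t, beta, eta):
    """Weibull failure (hazard) rate h(t) = (beta/eta) * (t/eta)**(beta-1)."""
    return (beta / eta) * (t / eta) ** (beta - 1)

# Synthetic overhaul intervals in hours; true shape 2.0, scale 1000.
rng = np.random.default_rng(2)
sample = 1000.0 * rng.weibull(2.0, 400)
beta, eta = weibull_fit(sample)
```

With shape beta > 1 the hazard rate increases with operating time, the wear-out behavior an overhaul-maintenance model would look for.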

  4. Materials, Processes, and Environmental Engineering Network

    NASA Technical Reports Server (NTRS)

    White, Margo M.

    1993-01-01

    Attention is given to the Materials, Processes, and Environmental Engineering Network (MPEEN), which was developed as a central holding facility for materials testing information generated by the Materials and Processes Laboratory of NASA-Marshall. It contains information from other NASA centers and outside agencies, and also includes the NASA Environmental Information System (NEIS) and Failure Analysis Information System (FAIS) data. The database is NEIS, which is accessible through MPEEN. Environmental concerns are addressed regarding materials identified by the NASA Operational Environment Team (NOET) to be hazardous to the environment. The database also contains the usage and performance characteristics of these materials.

  5. The 2011 Tohoku-oki Earthquake related to a large velocity gradient within the Pacific plate

    NASA Astrophysics Data System (ADS)

    Matsubara, Makoto; Obara, Kazushige

    2015-04-01

    We conduct seismic tomography using arrival-time data picked by the high-sensitivity seismograph network (Hi-net) operated by the National Research Institute for Earth Science and Disaster Prevention (NIED). We used earthquakes off the coast, outside the seismic network, around the source region of the 2011 Tohoku-oki Earthquake, with centroid depths estimated from moment tensor inversion by NIED F-net (broadband seismograph network), as well as earthquakes within the seismic network located by Hi-net. The target region, 20-48N and 120-148E, covers the Japanese Islands from Hokkaido to Okinawa. A total of 4,622,346 manually picked P-wave and 3,062,846 S-wave arrival times for 100,733 earthquakes recorded at 1,212 stations from October 2000 to August 2009 are available for use in the tomographic method. In the final iteration, we estimate the P-wave slowness at 458,234 nodes and the S-wave slowness at 347,037 nodes. The inversion reduces the root mean square of the P-wave traveltime residual from 0.455 s to 0.187 s and that of the S-wave data from 0.692 s to 0.228 s after eight iterations (Matsubara and Obara, 2011). Centroid depths are determined using a Green's function approach (Okada et al., 2004), as in NIED F-net. For events distant from the seismic network, the centroid depth is more reliable than the depth determined by NIED Hi-net, since there are no stations above the hypocenter. We determine the upper boundary of the Pacific plate based on the velocity structure and earthquake hypocentral distribution. The upper boundary of the low-velocity (low-V) oceanic crust corresponds to the plate boundary, where thrust earthquakes are expected to occur. Where we do not observe low-V oceanic crust, we determine the upper boundary of the upper layer of the double seismic zone within the high-V Pacific plate. We assume a depth of 7 km at the Japan Trench.
We can investigate the velocity structure within the Pacific plate such as 10 km beneath the plate boundary since the

  6. First-Grade Engineers

    ERIC Educational Resources Information Center

    Bautista, Nazan Uludag; Peters, Kari Nichole

    2010-01-01

    Can students build a house that is cost effective and strong enough to survive strong winds, heavy rains, and earthquakes? First graders in Ms. Peter's classroom worked like engineers to answer this question. They participated in a design challenge that required them to plan like engineers and build strong and cost-effective houses that would fit…

  7. A radon-thoron isotope pair as a reliable earthquake precursor

    PubMed Central

    Hwa Oh, Yong; Kim, Guebuem

    2015-01-01

    Abnormal increases in radon (222Rn, half-life = 3.82 days) activity have occasionally been observed in underground environments before major earthquakes. However, 222Rn alone could not be used to forecast earthquakes, since it can also increase due to diffusive inputs over its lifetime. Here, we show that a very short-lived isotope, thoron (220Rn, half-life = 55.6 s; mean life = 80 s), in a cave can record earthquake signals without interference from other environmental effects. We monitored 220Rn together with 222Rn in the air of a limestone cave in Korea for one year. Unusually large 220Rn peaks were observed only in February 2011, preceding the 2011 M9.0 Tohoku-Oki Earthquake, Japan, while large 222Rn peaks were observed both in February 2011 and in the summer. Based on our analyses, we suggest that the anomalous peaks of 222Rn and 220Rn activities observed in February were precursory signals related to the Tohoku-Oki Earthquake. Thus, the 220Rn-222Rn combined isotope pair method can present new opportunities for earthquake forecasting if the technique is extensively employed in earthquake monitoring networks around the world. PMID:26269105
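    The screening logic implied by the abstract is that only coincident anomalies in both isotopes count as candidate precursors, since 222Rn alone also peaks for environmental reasons. A minimal sketch on invented daily activity series, using a robust (MAD-based) threshold that is an assumption of this example, not the paper's method:

```python
import numpy as np

def anomalous(series, k=3.0):
    """Boolean mask of samples exceeding median + k * MAD-based sigma."""
    s = np.asarray(series, float)
    med = np.median(s)
    sigma = 1.4826 * np.median(np.abs(s - med))  # robust sigma estimate
    return s > med + k * sigma

# Synthetic daily activities (arbitrary units): Rn-222 has a seasonal
# summer rise plus a February peak; Rn-220 peaks only in February.
rng = np.random.default_rng(3)
days = np.arange(365)
rn222 = 50 + 20 * np.exp(-((days - 200) / 40.0) ** 2) + rng.normal(0, 2, 365)
rn220 = 10 + rng.normal(0, 1, 365)
rn222[40:45] += 60   # February anomaly in both isotopes
rn220[40:45] += 30

both = anomalous(rn222) & anomalous(rn220)   # candidate precursory window
```

The summer 222Rn rise alone never triggers the combined flag, which is exactly the discrimination the isotope pair provides.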

  8. Earthquakes of Garhwal Himalaya region of NW Himalaya, India: A study of relocated earthquakes and their seismogenic source and stress

    NASA Astrophysics Data System (ADS)

    R, A. P.; Paul, A.; Singh, S.

    2017-12-01

    Since the continent-continent collision began 55 Ma, the Himalaya has accommodated about 2000 km of convergence along its arc. Strain energy is accumulated at a rate of 37-44 mm/yr and is released at times as earthquakes. The Garhwal Himalaya is located on the western side of a seismic gap, where a great earthquake has been overdue for at least 200 years. This seismic gap (the Central Seismic Gap, CSG), with a 52% probability of a future great earthquake, is located between the rupture zones of two significant/great earthquakes, viz. the 1905 Kangra earthquake of M 7.8 and the 1934 Bihar-Nepal earthquake of M 8.0; the most recent one, the 2015 Gorkha earthquake of M 7.8, lies on the eastern side of this seismic gap (CSG). The Garhwal Himalaya is one of the ideal locations in the Himalaya where all the major Himalayan structures and the Himalayan Seismicity Belt (HSB) can be described and studied. In the present study, we present a spatio-temporal analysis of relocated local micro- to moderate earthquakes recorded by a seismicity monitoring network that has been operational since 2007. The earthquakes are relocated using the HypoDD (double-difference hypocenter method for earthquake relocation) program. The dataset from July 2007 to September 2015 has been used in this study to estimate spatio-temporal relationships, moment tensor (MT) solutions for earthquakes of M > 3.0, stress tensors, and their interactions. We have also used composite focal mechanism solutions for small earthquakes. The majority of the MT solutions show a thrust-type mechanism and are located near the mid-crustal-ramp (MCR) structure of the detachment surface at 8-15 km depth beneath the outer Lesser Himalaya and Higher Himalaya regions. The prevailing stress has been identified as compressional toward NNE-SSW, which is the direction of relative plate motion between the Indian and Eurasian continental plates.
The low friction coefficient estimated along with the stress inversions

  9. The 7.9 Denali Fault, Alaska Earthquake of November 3, 2002: Aftershock Locations, Moment Tensors and Focal Mechanisms from the Regional Seismic Network Data

    NASA Astrophysics Data System (ADS)

    Ratchkovski, N. A.; Hansen, R. A.; Kore, K. R.

    2003-04-01

    The largest earthquake ever recorded on the Denali fault system (magnitude 7.9) struck central Alaska on November 3, 2002. It was preceded by a magnitude 6.7 earthquake on October 23. This earlier earthquake and its zone of aftershocks were located ~20 km to the west of the 7.9 quake. Aftershock locations and surface slip observations from the 7.9 quake indicate that the rupture was predominantly unilateral in the eastward direction. Geologists mapped a ~300-km-long rupture and measured maximum offsets of 8.8 meters. The 7.9 event ruptured three different faults. The rupture began on the northeast-trending Susitna Glacier Thrust fault, a splay fault south of the Denali fault. The rupture then transferred to the Denali fault and propagated eastward for 220 km. At about 143W the rupture moved onto the adjacent southeast-trending Totschunda fault and propagated for another 55 km. The cumulative length of the 6.7 and 7.9 aftershock zones along the Denali and Totschunda faults is about 380 km. The earthquakes were recorded and processed by the Alaska Earthquake Information Center (AEIC). The AEIC acquires and processes data from the Alaska Seismic Network, consisting of over 350 seismograph stations. Nearly 40 of these sites are equipped with broadband sensors, some of which also have strong-motion sensors. The rest of the stations are either one- or three-component short-period instruments. The data from these stations are collected, processed, and archived at the AEIC. The AEIC staff installed a temporary seismic network of 6 instruments following the 6.7 earthquake and an additional 20 stations following the 7.9 earthquake. Prior to the 7.9 Denali Fault event, the AEIC was locating 35 to 50 events per day. After the event, the processing load increased to over 300 events per day during the first week following the event.
In this presentation, we will present and interpret the aftershock location patterns, first motion focal mechanism solutions, and regional seismic

  10. Twin ruptures grew to build up the giant 2011 Tohoku, Japan, earthquake.

    PubMed

    Maercklin, Nils; Festa, Gaetano; Colombelli, Simona; Zollo, Aldo

    2012-01-01

    The 2011 Tohoku megathrust earthquake had an unexpected size for the region. To image the earthquake rupture in detail, we applied a novel backprojection technique to waveforms from local accelerometer networks. The earthquake began as a small-size twin rupture, slowly propagating mainly updip and triggering the break of a larger-size asperity at shallower depths, resulting in up to 50 m slip and causing high-amplitude tsunami waves. For a long time the rupture remained in a 100-150 km wide slab segment delimited by oceanic fractures, before propagating further to the southwest. The occurrence of large slip at shallow depths likely favored the propagation across contiguous slab segments and contributed to build up a giant earthquake. The lateral variations in the slab geometry may act as geometrical or mechanical barriers finally controlling the earthquake rupture nucleation, evolution and arrest.

  11. Magnitude Estimation for Large Earthquakes from Borehole Recordings

    NASA Astrophysics Data System (ADS)

    Eshaghi, A.; Tiampo, K. F.; Ghofrani, H.; Atkinson, G.

    2012-12-01

    We present a simple and fast magnitude determination technique for earthquake and tsunami early warning systems, based on strong ground motion prediction equations (GMPEs) in Japan. The method incorporates borehole strong-motion records provided by the Kiban Kyoshin network (KiK-net) stations. We analyzed strong ground motion data from large-magnitude earthquakes (5.0 ≤ M ≤ 8.1) with focal depths < 50 km and epicentral distances of up to 400 km from 1996 to 2010. Using both peak ground acceleration (PGA) and peak ground velocity (PGV), we derived GMPEs for Japan. These GMPEs are used as the basis for regional magnitude determination. Magnitudes predicted from PGA values (Mpga) and from PGV values (Mpgv) were defined. Mpga and Mpgv correlate strongly with the moment magnitude of the event, provided sufficient records for each event are available. The results show that Mpgv has a smaller standard deviation than Mpga when compared with the estimated magnitudes and provides a more accurate early assessment of earthquake magnitude. We test this new method by estimating the magnitude of the 2011 Tohoku earthquake and present the results of this estimation. PGA and PGV from borehole recordings allow us to estimate the magnitude of this event 156 s and 105 s after the earthquake onset, respectively. We demonstrate that incorporating borehole strong ground-motion records immediately available after the occurrence of large earthquakes significantly increases the accuracy of earthquake magnitude estimation and improves the performance of earthquake and tsunami early warning systems. (Figure: moment magnitude versus predicted magnitudes Mpga and Mpgv.)
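    The magnitude-inversion step can be sketched generically: given a GMPE of the common functional form log10(PGV) = a*M + b*log10(R) + c, each station's observation is solved for M and the estimates are averaged. The coefficients below are purely illustrative placeholders, not the KiK-net regression of this study:

```python
import numpy as np

# Hypothetical GMPE coefficients for log10(PGV) = A*M + B*log10(R) + C.
A, B, C = 0.8, -1.5, -1.0

def magnitude_from_pgv(pgv_cm_s, dist_km):
    """Invert the GMPE for magnitude at each station, then average."""
    pgv = np.asarray(pgv_cm_s, float)
    r = np.asarray(dist_km, float)
    m = (np.log10(pgv) - B * np.log10(r) - C) / A
    return m.mean()

# Forward-model noise-free synthetic observations for an M 7.0 event,
# then recover the magnitude (round-trip check).
true_m = 7.0
dists = np.array([60.0, 120.0, 250.0])          # hypocentral distances, km
pgv = 10 ** (A * true_m + B * np.log10(dists) + C)
m_est = magnitude_from_pgv(pgv, dists)
```

Averaging over stations is what makes the estimate stabilize as more records arrive, which is why the estimate quality in the abstract depends on sufficient records per event.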

  12. Stability assessment of structures under earthquake hazard through GRID technology

    NASA Astrophysics Data System (ADS)

    Prieto Castrillo, F.; Boton Fernandez, M.

    2009-04-01

    This work presents a GRID framework to estimate the vulnerability of structures under earthquake hazard. The tool has been designed to cover the needs of a typical earthquake engineering stability analysis: preparation of input data (pre-processing), response computation, and stability analysis (post-processing). In order to validate the application over GRID, a simplified model of a structure under artificially generated earthquake records has been implemented. To achieve this goal, the proposed scheme exploits the GRID technology and its main advantages (parallel intensive computing, huge storage capacity, and collaborative analysis among institutions) through intensive interaction among the GRID elements (Computing Element, Storage Element, LHC File Catalogue, federated database, etc.). The dynamical model is described by a set of ordinary differential equations (ODEs) and by a set of parameters. Both elements, along with the integration engine, are encapsulated into Java classes. With this high-level design, subsequent improvements/changes of the model can be addressed with little effort. In the procedure, an earthquake record database is prepared and stored (pre-processing) in the GRID Storage Element (SE). The metadata of these records is also stored in the GRID federated database. This metadata contains both relevant information about the earthquake (as is usual in a seismic repository) and the Logical File Name (LFN) of the record for its later retrieval. Then, from the available set of accelerograms in the SE, the user can specify a range of earthquake parameters to carry out a dynamic analysis. This way, a GRID job is created for each selected accelerogram in the database. At the GRID Computing Element (CE), displacements are then obtained by numerical integration of the ODEs over time. The resulting response for that configuration is stored in the GRID Storage Element (SE) and the maximum structure displacement is computed. Then, the corresponding
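    The per-job computation (integrate the structural ODEs under one accelerogram, then extract the maximum displacement) can be sketched with the simplest possible model: a damped single-degree-of-freedom oscillator driven by ground acceleration. The time stepper, natural frequency, and damping ratio are assumptions of this sketch, not the paper's model:

```python
import numpy as np

def sdof_response(ag, dt, freq_hz=2.0, damping=0.05):
    """Displacement history of a damped single-degree-of-freedom oscillator,
    u'' + 2*z*w*u' + w^2*u = -ag(t), via semi-implicit Euler time stepping."""
    w = 2 * np.pi * freq_hz
    n = len(ag)
    u, v, a = np.zeros(n), np.zeros(n), np.zeros(n)
    a[0] = -ag[0]
    for i in range(n - 1):
        v[i + 1] = v[i] + dt * a[i]
        u[i + 1] = u[i] + dt * v[i + 1]
        a[i + 1] = -ag[i + 1] - 2 * damping * w * v[i + 1] - w * w * u[i + 1]
    return u

# Artificial accelerogram: broadband noise with a decaying envelope,
# standing in for the generated records stored in the SE.
rng = np.random.default_rng(4)
dt = 0.01
t = np.arange(0, 20, dt)
ag = rng.standard_normal(len(t)) * np.exp(-t / 5.0)

u = sdof_response(ag, dt)
peak = np.abs(u).max()   # the per-job quantity reported back to storage
```

Each GRID job would run this kernel on one accelerogram and store `peak`, so the ensemble over the record database builds the vulnerability statistics.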

  13. A new strategy for earthquake focal mechanisms using waveform-correlation-derived relative polarities and cluster analysis: Application to the 2014 Long Valley Caldera earthquake swarm

    USGS Publications Warehouse

    Shelly, David R.; Hardebeck, Jeanne L.; Ellsworth, William L.; Hill, David P.

    2016-01-01

    In microseismicity analyses, reliable focal mechanisms can typically be obtained for only a small subset of located events. We address this limitation here, presenting a framework for determining robust focal mechanisms for entire populations of very small events. To achieve this, we resolve relative P and S wave polarities between pairs of waveforms by using their signed correlation coefficients—a by-product of previously performed precise earthquake relocation. We then use cluster analysis to group events with similar patterns of polarities across the network. Finally, we apply a standard mechanism inversion to the grouped data, using either catalog or correlation-derived P wave polarity data sets. This approach has great potential for enhancing analyses of spatially concentrated microseismicity such as earthquake swarms, mainshock-aftershock sequences, and industrial reservoir stimulation or injection-induced seismic sequences. To demonstrate its utility, we apply this technique to the 2014 Long Valley Caldera earthquake swarm. In our analysis, 85% of the events (7212 out of 8494 located by Shelly et al. [2016]) fall within five well-constrained mechanism clusters, more than 12 times the number with network-determined mechanisms. Of the earthquakes we characterize, 3023 (42%) have magnitudes smaller than 0.0. We find that mechanism variations are strongly associated with corresponding hypocentral structure, yet mechanism heterogeneity also occurs where it cannot be resolved by hypocentral patterns, often confined to small-magnitude events. Small (5–20°) rotations between mechanism orientations and earthquake location trends persist when we apply 3-D velocity models and might reflect a geometry of en echelon, interlinked shear, and dilational faulting.
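    The core trick described above, using the sign of the waveform correlation coefficient as a relative polarity and then grouping events with matching polarity patterns across the network, can be illustrated on synthetic data. The two "mechanism families" and noise level below are invented, and the grouping is done by exact pattern matching rather than the paper's cluster analysis:

```python
import numpy as np

def relative_polarity(w1, w2):
    """Sign of the zero-lag correlation between two event waveforms
    recorded at the same station."""
    return np.sign(np.dot(w1, w2))

rng = np.random.default_rng(5)
n_sta, npts = 6, 200
base = rng.standard_normal((n_sta, npts))  # one template trace per station

# Two mechanism families: family 1 has flipped first motions at stations 3-5.
flip = np.array([1, 1, 1, -1, -1, -1])
events, labels = [], []
for k in range(40):
    fam = k % 2
    sgn = np.ones(n_sta) if fam == 0 else flip
    events.append(sgn[:, None] * base + 0.2 * rng.standard_normal((n_sta, npts)))
    labels.append(fam)

# Polarity pattern of each event relative to event 0, then group events
# whose patterns agree at every station.
patterns = np.array([[relative_polarity(ev[s], events[0][s])
                      for s in range(n_sta)] for ev in events])
groups = {}
for i, p in enumerate(patterns):
    groups.setdefault(tuple(p), []).append(i)
```

Each recovered group could then be passed to a standard focal mechanism inversion, which is how the approach extends mechanisms to events far too small for station-by-station polarity picking.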

  14. Understanding intraplate earthquakes in Sweden: the where and why

    NASA Astrophysics Data System (ADS)

    Lund, Björn; Tryggvason, Ari; Chan, NeXun; Högdahl, Karin; Buhcheva, Darina; Bödvarsson, Reynir

    2016-04-01

    The Swedish National Seismic Network (SNSN) underwent a rapid expansion and modernization between the years 2000 and 2010. The number of stations increased from 6 to 65, all broadband or semi-broadband with higher than standard sensitivity and all transmitting data in real time. This has led to a significant increase in the number of detected earthquakes, with the magnitude of completeness being approximately ML 0.5 within the network. During the last 15 years some 7,300 earthquakes have been detected and located, compared to approximately 1,800 earthquakes in the Swedish catalog from 1375 to 1999. We have used the recent earthquake catalog and various anthropogenic sources (e.g., mine blasts, quarry blasts, and infrastructure construction blasts) to derive low-resolution 3-D P- and S-wave velocity models for all of Sweden. Including the blasts provides a more even geographical distribution of sources as well as good constraints on the locations. The resolution of the derived velocity models is in the 20 km range in the well-resolved areas. A fairly robust feature of the Vp/Vs ratio in the derived models is a difference between the Paleoproterozoic rocks belonging to the TIB (Transscandinavian Igneous Belt) and the Svecofennian rocks east and north of this region (a Vp/Vs ratio of about 1.72 prevails in the former, compared to a value below 1.70 in the latter) at depths down to 15 km. All earthquakes occurring since 2000 have been relocated in the 3-D velocity model. The results show very clear differences in how earthquakes occur in different parts of Sweden. In the north, north of approximately 64 degrees latitude, most earthquakes occur on or in the vicinity of the Holocene postglacial faults. From 64N to approximately 60N, earthquake activity is concentrated along the northeast coastline, with some relation to the offset in the bedrock from the onshore area to the offshore Bay of Bothnia. In southern Sweden earthquake activity is more widely

  15. Seismotectonics of the May 19, 2011 Simav- Kutahya Earthquake Activity

    NASA Astrophysics Data System (ADS)

    Komec Mutlu, Ahu

    2014-05-01

    The aftershock sequence of the May 19, 2011 Simav earthquake (Mw = 5.8) is relocated with a new 1-D seismic velocity model, and focal mechanisms of the largest aftershocks are determined. The May 19, 2011 Simav-Kutahya earthquake occurred in the most seismically active region of western Turkey. During the six months after the mainshock, more than 5000 earthquakes were recorded, and aftershocks followed over a period of almost two years. In this study, more than 7600 aftershocks with magnitudes greater than 1.8 that occurred between 2011 and 2012 are relocated. Waveform data were collected by 13 three-component seismic stations from three different networks (Kandilli Observatory and Earthquake Research Institute (NEMC - National Earthquake Monitoring Center); the Prime Ministry Disaster and Emergency Management Presidency, Department of Earthquake; and the Canakkale Onsekiz Mart University Geophysics Department). These seismic stations are deployed within 80 km epicentral distance in the Simav-Kutahya region. The average crustal velocity and average crustal thickness for the region are computed as 5.68 km/s and 37.6 km, respectively. The source mechanisms of fifty aftershocks with magnitudes greater than 4.0 are derived from first-motion P phases. Analysis of the focal mechanisms indicates mainly normal fault motions with oblique slip.

  16. NRIAG's Effort to Mitigate Earthquake Disasters in Egypt Using GPS and Seismic Data

    NASA Astrophysics Data System (ADS)

    Mahmoud, Salah

    It has been estimated that during historical time more than 50 million people have lost their lives in earthquakes, either during the ground shaking itself (through effects such as soil amplification and/or liquefaction, landslides, and tsunamis) or in its immediate aftereffects, such as fires. The distribution of population generally takes no account of earthquake risk, at least on a large scale. An earthquake may be large but not destructive; on the other hand, an earthquake may be destructive but not large. The absence of correlation is due to the great number of other factors entering into consideration: first of all, the location of the earthquake in relation to populated areas, and also soil conditions and building construction. Soil liquefaction has been identified as the underlying phenomenon for many ground failures, settlements, and lateral spreads, which are a major cause of damage to soil structures and building foundations in many events. Egypt has suffered numerous destructive earthquakes, such as the Kalabsha earthquake (1981, Mag 5.4) near Aswan city and the High Dam, the Dahshour earthquake (1992, Mag 5.9) near Cairo city, and the Aqaba earthquake (1995, Mag 7.2). As the category of earthquake damage includes all the phenomena related to direct and indirect damage, the Egyptian authorities make a great effort to mitigate earthquake disasters. The seismicity, especially in the zones of high activity, is investigated in detail in order to delineate the active source zones, not only by the Egyptian National Seismic Network (ENSN) but also by the local seismic networks at Aswan, Hurghada, Aqaba, Abu Dabbab, and Dabbaa. On the other hand, studies of soil conditions, soil amplification, soil-structure interaction, liquefaction, and seismic hazard are carried out, in particular for urbanized areas and regions near the source zones. All these parameters are integrated to produce the Egyptian building code, which makes it possible to construct buildings that resist damage and consequently mitigate the earthquake

  17. Earthquake Response of Concrete Gravity Dams Including Hydrodynamic and Foundation Interaction Effects,

    DTIC Science & Technology

    1980-01-01

    standard procedure for analysis of all types of civil engineering structures. Early in its development, it became apparent that this method had...unique potentialities in the evaluation of stress in dams, and many of its earliest civil engineering applications concerned special problems associated...with such structures [3,4]. The earliest dynamic finite element analyses of civil engineering structures involved the earthquake response analysis of

  18. The use of waveform shapes to automatically determine earthquake focal depth

    USGS Publications Warehouse

    Sipkin, S.A.

    2000-01-01

    Earthquake focal depth is an important parameter for rapidly determining probable damage caused by a large earthquake. In addition, it is significant both for discriminating between natural events and explosions and for discriminating between tsunamigenic and nontsunamigenic earthquakes. For the purpose of notifying emergency management and disaster relief organizations as well as issuing tsunami warnings, potential time delays in determining source parameters are particularly detrimental. We present a method for determining earthquake focal depth that is well suited for implementation in an automated system that utilizes the wealth of broadband teleseismic data that is now available in real time from the global seismograph networks. This method uses waveform shapes to determine focal depth and is demonstrated to be valid for events with magnitudes as low as approximately 5.5.

  19. Creating a Global Building Inventory for Earthquake Loss Assessment and Risk Management

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David J.

    2008-01-01

    Earthquakes have claimed approximately 8 million lives over the last 2,000 years (Dunbar, Lockridge and others, 1992), and fatality rates are likely to continue to rise with increased population and urbanization of global settlements, especially in developing countries. More than 75% of earthquake-related human casualties are caused by the collapse of buildings or structures (Coburn and Spence, 2002). It is disheartening to note that large fractions of the world's population still reside in informal, poorly constructed, and non-engineered dwellings that are highly susceptible to collapse during earthquakes. Moreover, with increasing urbanization, half of the world's population now lives in urban areas (United Nations, 2001), and half of these urban centers are located in earthquake-prone regions (Bilham, 2004). The poor performance of most building stocks during earthquakes remains a primary societal concern. However, despite this dark history and bleaker future trends, there are no comprehensive global building inventories of sufficient quality and coverage to adequately address and characterize future earthquake losses. Such an inventory is vital both for earthquake loss mitigation and for earthquake disaster response purposes. While the latter purpose is the motivation for this work, we hope that the global building inventory database described herein will find widespread use in other mitigation efforts as well. For a real-time earthquake impact alert system, such as the U.S. Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) (Wald, Earle and others, 2006), we seek to rapidly evaluate potential casualties associated with earthquake ground shaking for any region of the world. The casualty estimation is based primarily on (1) rapid estimation of the ground shaking hazard, (2) aggregating the population exposure within different building types, and (3) estimating the casualties from the collapse of vulnerable buildings. Thus, the
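    The three-step casualty estimate described above reduces, for a single shaking level, to a sum over building types of exposure times collapse probability times fatality rate. The building types and all rates below are invented placeholders for illustration, not PAGER's calibrated values:

```python
# Population exposed at a given shaking intensity, by building type
# (step 2 of the abstract's pipeline; values are hypothetical).
exposure = {"unreinforced_masonry": 50_000,
            "reinforced_concrete": 120_000,
            "wood_frame": 30_000}

# P(collapse | shaking) and P(death | collapse), purely illustrative.
collapse_prob = {"unreinforced_masonry": 0.08,
                 "reinforced_concrete": 0.01,
                 "wood_frame": 0.002}
fatality_rate = {"unreinforced_masonry": 0.10,
                 "reinforced_concrete": 0.15,
                 "wood_frame": 0.05}

# Step 3: expected fatalities aggregated over the building inventory.
expected_fatalities = sum(exposure[b] * collapse_prob[b] * fatality_rate[b]
                          for b in exposure)
```

This structure is exactly why the global building inventory matters: without the per-type exposure split, the collapse and fatality models have nothing to act on.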

  20. Twitter Seismology: Earthquake Monitoring and Response in a Social World

    NASA Astrophysics Data System (ADS)

    Bowden, D. C.; Earle, P. S.; Guy, M.; Smoczyk, G.

    2011-12-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. The potential uses of Twitter for earthquake response include broadcasting earthquake alerts, rapidly detecting widely felt events, qualitatively assessing earthquake damage effects, communicating with the public, and participating in post-event collaboration. Several seismic networks and agencies are currently distributing Twitter earthquake alerts, including the European-Mediterranean Seismological Centre (@LastQuake), Natural Resources Canada (@CANADAquakes), and the Indonesian meteorological agency (@infogempabmg); the USGS will soon distribute alerts via the @USGSted and @USGSbigquakes Twitter accounts. Beyond broadcasting alerts, the USGS is investigating how to use tweets that originate near the epicenter to detect and characterize shaking events. This is possible because people begin tweeting immediately after feeling an earthquake, and their short narratives and exclamations are available for analysis within tens of seconds of the origin time. Using five months of tweets that contain the word "earthquake" and its equivalent in other languages, we generate a tweet-frequency time series. The time series clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a simple Short-Term-Average / Long-Term-Average (STA/LTA) algorithm similar to that commonly used to detect seismic phases. As with most auto-detection algorithms, the parameters can be tuned to catch more or fewer events at the cost of more or fewer false triggers. When tuned to a moderate sensitivity, the detector found 48 globally distributed, confirmed seismic events with only 2 false triggers; a space-shuttle landing and "The Great California ShakeOut" caused the false triggers. This number of
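    The STA/LTA detection idea described above can be sketched on a synthetic tweet-count series. This is a minimal illustration, not the USGS implementation; the window lengths, threshold, and Poisson background rate are hypothetical.

```python
import numpy as np

def sta_lta(series, n_sta, n_lta):
    """Classic Short-Term-Average / Long-Term-Average ratio.

    series: 1-D array of tweet counts per time bin.
    n_sta, n_lta: window lengths in bins (n_sta < n_lta).
    Returns the STA/LTA ratio per bin (0 where the LTA window is incomplete).
    """
    series = np.asarray(series, dtype=float)
    csum = np.cumsum(np.concatenate(([0.0], series)))
    ratio = np.zeros_like(series)
    for i in range(n_lta, len(series)):
        sta = (csum[i + 1] - csum[i + 1 - n_sta]) / n_sta
        lta = (csum[i + 1] - csum[i + 1 - n_lta]) / n_lta
        if lta > 0:
            ratio[i] = sta / lta
    return ratio

def detect(series, n_sta=3, n_lta=30, threshold=5.0):
    """Return bin indices where the STA/LTA ratio first crosses the threshold."""
    r = sta_lta(series, n_sta, n_lta)
    above = r >= threshold
    onsets = np.flatnonzero(above & ~np.roll(above, 1))
    return onsets.tolist()

# Synthetic tweet-rate series: background ~2 tweets/bin with a burst at bin 60.
rng = np.random.default_rng(0)
counts = rng.poisson(2, 120)
counts[60:66] += 40            # widely felt event -> sudden spike in tweets
picks = detect(counts)
```

    Raising the threshold trades missed detections for fewer false triggers, which is the tuning trade-off the abstract describes.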

  1. Evaluation of seismic performance of reinforced concrete (RC) buildings under near-field earthquakes

    NASA Astrophysics Data System (ADS)

    Moniri, Hassan

    2017-03-01

    Near-field ground motions affect the seismic response of structures far more severely than far-field ground motions, because near-source forward-directivity motions contain long-period pulses; by comparison, the cumulative effects of far-fault records are minor. The damage and collapse of engineered structures observed in the earthquakes of recent decades show the damage potential of existing structures under near-field ground motions. One important subject studied by earthquake engineers as part of a performance-based approach is the determination of demand and collapse capacity under near-field earthquakes. Different methods for evaluating seismic structural performance have been suggested along with, and as part of, the development of performance-based earthquake engineering. This study investigated the effects of the distinctive characteristics of near-fault ground motions on the seismic response of reinforced concrete (RC) structures using the Incremental Nonlinear Dynamic Analysis (IDA) method. Because different ground motions produce different intensity-versus-response curves, the analysis is repeated under various ground motions to obtain meaningful statistical averages. The OpenSees software was used to conduct the nonlinear structural evaluations. Numerical modelling showed that near-source effects cause most of the seismic energy from the rupture to arrive in a single coherent long-period pulse of motion, accompanied by permanent ground displacements. Finally, the vulnerability of RC buildings can be evaluated against the effects of pulse-like near-fault ground motions.

  2. Lessons Learned from Creating the Public Earthquake Resource Center at CERI

    NASA Astrophysics Data System (ADS)

    Patterson, G. L.; Michelle, D.; Johnston, A.

    2004-12-01

    The Center for Earthquake Research and Information (CERI) at the University of Memphis opened the Public Earthquake Resource Center (PERC) in May 2004. The PERC is an interactive display area that was designed to increase awareness of seismology, Earth science, earthquake hazards, and earthquake engineering among the general public and K-12 teachers and students. Funding for the PERC is provided by the US Geological Survey, the NSF-funded Mid America Earthquake Center, and the University of Memphis, with input from the Incorporated Research Institutions for Seismology. Additional space at the facility houses local offices of the US Geological Survey. PERC exhibits are housed in a remodeled residential structure at CERI that was donated by the University of Memphis and the State of Tennessee. Exhibits were designed and built by CERI and US Geological Survey staff and faculty with the help of experienced museum display subcontractors. The 600-square-foot display area interactively introduces the basic concepts of seismology, real-time seismic information, seismic network operations, paleoseismology, building response, and historical earthquakes. Display components include three 22" flat-screen monitors, a touch-sensitive monitor, three helicorder elements, an oscilloscope, an AS-1 seismometer, a life-sized liquefaction trench, a liquefaction shake table, and a building-response shake table. All displays include custom graphics, text, and handouts. The PERC website at www.ceri.memphis.edu/perc also provides useful information such as tour scheduling, an "ask a geologist" feature, and links to other institutions, and will soon include a virtual tour of the facility. Special consideration was given to addressing state science standards for teaching and learning in the design of the displays and handouts. We feel this consideration is pivotal to the success of any grassroots Earth science education and outreach program and represents a valuable lesson that has been learned at CERI over the last several

  3. Chapter B. The Loma Prieta, California, Earthquake of October 17, 1989 - Highway Systems

    USGS Publications Warehouse

    Yashinsky, Mark

    1998-01-01

    This paper summarizes the impact of the Loma Prieta earthquake on highway systems. City streets, urban freeways, county roads, state routes, and the national highway system were all affected, with damage to bridges, roads, tunnels, and other highway structures. The most serious damage occurred in the cities of San Francisco and Oakland, 60 miles from the fault rupture. The cost to repair and replace highways damaged by this earthquake was $2 billion; about half of this was to replace the Cypress Viaduct, a long, elevated double-deck expressway whose devastating collapse resulted in 42 deaths and 108 injuries. The earthquake also brought some positive changes for highway systems: research on bridges and earthquakes began to be funded at a much higher level, and retrofit programs were started to upgrade the seismic performance of the nation's highways. The Loma Prieta earthquake changed earthquake policy and engineering practice for highway departments not only in California, but all over the world.

  4. Seismological investigation of earthquakes in the New Madrid Seismic Zone. Final report, September 1986--December 1992

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herrmann, R.B.; Nguyen, B.

    Earthquake activity in the New Madrid Seismic Zone has been monitored by regional seismic networks since 1975. During this period, over 3,700 earthquakes have been located within the region bounded by latitudes 35°-39°N and longitudes 87°-92°W. Most of these earthquakes occur within a 1.5° x 2° zone centered on the Missouri Bootheel. Source parameters of larger earthquakes in the zone and in eastern North America are determined from surface-wave spectral amplitudes and broadband waveforms in order to establish the focal mechanism, source depth, and seismic moment. Waveform modeling of broadband data is shown to be a powerful tool for defining these source parameters when used in complement with regional seismic network data, and also for verifying the correctness of previously published focal-mechanism solutions.

  5. Accelerations from the September 5, 2012 (Mw=7.6) Nicoya, Costa Rica Earthquake

    NASA Astrophysics Data System (ADS)

    Simila, G. W.; Quintero, R.; Burgoa, B.; Mohammadebrahim, E.; Segura, J.

    2013-05-01

    Since 1984, the Seismic Network of the Volcanological and Seismological Observatory of Costa Rica, Universidad Nacional (OVSICORI-UNA) has been recording the seismicity of Costa Rica. Before September 2012, the earthquakes registered by this network in northwestern Costa Rica were moderate to small, except for the Cóbano earthquake of March 25, 1990, 13:23 (Mw 7.3, lat. 9.648, long. 84.913, depth 20 km), a subduction earthquake at the entrance of the Gulf of Nicoya that generated peak intensities of MM VIII near the epicentral area and VI-VII in the Central Valley of Costa Rica. Six years before the installation of the seismic network, two subduction earthquakes occurred in northwestern Costa Rica, on August 23, 1978, at 00:38:32 and 00:50:29, with magnitudes Mw 7.0 (HRVD), Ms 7.0 (ISC) and depths of 58 and 69 km, respectively (EHB Bulletin). On September 5, 2012, at 14:42:02.8 UTC, the OVSICORI-UNA network registered another large subduction earthquake in the Nicoya Peninsula, northwestern Costa Rica, located 29 km south of Samara at a depth of 21 km, with magnitude Mw 7.6 (lat. 9.6392, long. 85.6167). This earthquake was caused by the subduction of the Cocos plate under the Caribbean plate and was felt throughout the country and in much of Nicaragua. The instrumental intensity map indicates that the earthquake was felt with an intensity of VII-VIII in the Puntarenas and Nicoya Peninsulas, in an area between Liberia, Cañas, Puntarenas, Cabo Blanco, Carrillo, Garza, Sardinal, and Tamarindo in Guanacaste, with the maximum reported intensity of VIII in the city of Nicoya. An intensity of VIII indicates moderate to severe damage; an intensity of VII indicates moderate damage.
According to the National Emergency Commission of Costa Rica, 371 affected communities were reported; most

  6. The 7.9 Denali Fault Earthquake: Aftershock Locations, Moment Tensors and Focal Mechanisms from the Regional Seismic Network Data

    NASA Astrophysics Data System (ADS)

    Ratchkovski, N. A.; Hansen, R. A.; Christensen, D.; Kore, K.

    2002-12-01

    The largest earthquake ever recorded on the Denali fault system (magnitude 7.9) struck central Alaska on November 3, 2002. It was preceded by a magnitude 6.7 foreshock on October 23; this earlier earthquake and its zone of aftershocks were located slightly to the west of the 7.9 quake. Aftershock locations and surface slip observations from the 7.9 quake indicate that the rupture was predominantly unilateral in the eastward direction. Near Mentasta Lake, a village that experienced some of the worst damage in the quake, the surface rupture scar turns from the Denali fault onto the adjacent Totschunda fault, which trends more southeasterly toward the Canadian border. Overall, geologists found that measurable scarps indicate that the north side of the Denali fault moved east and vertically up relative to the south side. Maximum offsets were 8.8 meters on the Denali fault at the Tok Highway cutoff and 2.2 meters on the Totschunda fault. The Alaska regional seismic network consists of over 250 station sites operated by the Alaska Earthquake Information Center (AEIC), the Alaska Volcano Observatory (AVO), and the Pacific Tsunami Warning Center (PTWC). Over 25 sites are equipped with broadband sensors, some of which also have strong-motion sensors; the remaining stations are either one- or three-component short-period instruments. The data from these stations are collected, processed, and archived at the AEIC. The AEIC staff installed a temporary network of over 20 instruments following the 6.7 Nenana Mountain and 7.9 events. Prior to the M 7.9 Denali fault event, the automatic earthquake detection system at AEIC was locating between 15 and 30 events per day; after the event, the system produced 200-400 automatic locations per day for at least 10 days. Processing of the data is ongoing, with priority given to the larger events. The cumulative length of the 6.7 and 7.9 aftershock locations along the Denali

  7. Disaster mitigation science for Earthquakes and Tsunamis -For resilience society against natural disasters-

    NASA Astrophysics Data System (ADS)

    Kaneda, Y.; Takahashi, N.; Hori, T.; Kawaguchi, K.; Isouchi, C.; Fujisawa, K.

    2017-12-01

    Destructive natural disasters such as earthquakes and tsunamis occur frequently around the world: the 2004 Sumatra earthquake in Indonesia, the 2008 Wenchuan earthquake in China, the 2010 Chile earthquake, and the 2011 Tohoku earthquake in Japan, among others, all caused very severe damage. To reduce and mitigate the damage from such disasters, early detection and rapid, orderly evacuation are indispensable, and both hardware and software preparations are essential. In Japan, DONET, a real-time ocean-floor monitoring system, has been developed and deployed around the Nankai Trough seismogenic zone in southwestern Japan, where it is expected to enable early detection of earthquakes and tsunamis. Integrating its real-time data with advanced simulation research should reduce damage; a resilient society, however, also requires methods of restoration and revival after disasters. We therefore propose a natural disaster mitigation science encompassing early detection, evacuation, and restoration against destructive natural disasters. Disaster mitigation science spans many research fields, including natural science, engineering, medicine, social science, and the humanities and arts. Natural science, engineering, and medicine are the fundamental fields for disaster mitigation, while social sciences such as sociology, geography, and psychology are crucial for restoration after natural disasters. Finally, to realize and advance disaster mitigation science, the cultivation of human resources is indispensable. We have already carried out disaster mitigation science under the `new disaster mitigation research project on Mega

  8. Knowledge engineering for temporal dependency networks as operations procedures. [in space communication

    NASA Technical Reports Server (NTRS)

    Fayyad, Kristina E.; Hill, Randall W., Jr.; Wyatt, E. J.

    1993-01-01

    This paper presents a case study of the knowledge engineering process employed to support the Link Monitor and Control Operator Assistant (LMCOA). The LMCOA is a prototype system which automates the configuration, calibration, test, and operation (referred to as precalibration) of the communications, data processing, metric data, antenna, and other equipment used to support space-ground communications with deep space spacecraft in NASA's Deep Space Network (DSN). The primary knowledge base in the LMCOA is the Temporal Dependency Network (TDN), a directed graph which provides a procedural representation of the precalibration operation. The TDN incorporates precedence, temporal, and state constraints and uses several supporting knowledge bases and data bases. The paper provides a brief background on the DSN, and describes the evolution of the TDN and supporting knowledge bases, the process used for knowledge engineering, and an analysis of the successes and problems of the knowledge engineering effort.
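    The precedence constraints of a TDN can be illustrated as a directed graph whose topological order gives one valid execution sequence. The step names below are hypothetical stand-ins for DSN precalibration tasks, not taken from the LMCOA itself.

```python
from collections import deque

# Hypothetical precalibration steps; an edge A -> B means A must run before B.
edges = {
    "configure_antenna": ["calibrate_receiver"],
    "configure_receiver": ["calibrate_receiver"],
    "calibrate_receiver": ["run_test_signal"],
    "run_test_signal": ["begin_track"],
}

def topological_order(edges):
    """Kahn's algorithm: one execution order respecting all precedence edges."""
    nodes = set(edges) | {v for vs in edges.values() for v in vs}
    indeg = {n: 0 for n in nodes}
    for vs in edges.values():
        for v in vs:
            indeg[v] += 1
    queue = deque(sorted(n for n in nodes if indeg[n] == 0))
    order = []
    while queue:
        n = queue.popleft()
        order.append(n)
        for v in edges.get(n, []):
            indeg[v] -= 1
            if indeg[v] == 0:
                queue.append(v)
    if len(order) != len(nodes):
        raise ValueError("cycle in dependency network")
    return order

plan = topological_order(edges)
```

    A real TDN layers temporal windows and equipment-state constraints on top of this precedence skeleton; the sort only captures the ordering component.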

  9. The 1868 Hayward Earthquake Alliance: A Case Study - Using an Earthquake Anniversary to Promote Earthquake Preparedness

    NASA Astrophysics Data System (ADS)

    Brocher, T. M.; Garcia, S.; Aagaard, B. T.; Boatwright, J. J.; Dawson, T.; Hellweg, M.; Knudsen, K. L.; Perkins, J.; Schwartz, D. P.; Stoffer, P. W.; Zoback, M.

    2008-12-01

    Last October 21st marked the 140th anniversary of the M6.8 1868 Hayward earthquake, the last damaging earthquake on the southern Hayward Fault. This anniversary was used to help publicize the seismic hazards associated with the fault because (1) the past five such earthquakes on the Hayward Fault occurred about 140 years apart on average, and (2) the Hayward-Rodgers Creek Fault system is the most likely fault in the Bay Area (with a 31 percent probability) to produce a M6.7 or greater earthquake in the next 30 years. To promote earthquake awareness and preparedness, over 140 public and private agencies and companies and many individuals joined the public-private nonprofit 1868 Hayward Earthquake Alliance (1868alliance.org). The Alliance sponsored many activities, including a public commemoration at Mission San Jose in Fremont, which survived the 1868 earthquake. This event was followed by an earthquake drill at Bay Area schools involving more than 70,000 students. The anniversary also prompted the Silver Sentinel, an earthquake response exercise based on the scenario of a Hayward Fault earthquake, conducted by Bay Area County Offices of Emergency Services; 60 other public and private agencies also participated in this exercise. The California Seismic Safety Commission and KPIX (CBS affiliate) produced professional videos, designed for school classrooms, promoting "Drop, Cover, and Hold On." Starting in October 2007, the Alliance and the U.S. Geological Survey held a sequence of press conferences to announce the release of new research on the Hayward Fault as well as new loss estimates for a Hayward Fault earthquake. These included: (1) a ShakeMap for the 1868 Hayward earthquake, (2) a report by the U.S. Bureau of Labor Statistics forecasting the number of employees, employers, and wages predicted to be within the areas most strongly shaken by a Hayward Fault earthquake, (3) new estimates of the losses associated with a Hayward Fault earthquake, (4) new ground motion

  10. An efficient repeating signal detector to investigate earthquake swarms

    NASA Astrophysics Data System (ADS)

    Skoumal, Robert J.; Brudzinski, Michael R.; Currie, Brian S.

    2016-08-01

    Repetitive earthquake swarms have been recognized as key signatures in fluid injection induced seismicity, precursors to volcanic eruptions, and slow slip events preceding megathrust earthquakes. We investigate earthquake swarms by developing a Repeating Signal Detector (RSD), a computationally efficient algorithm utilizing agglomerative clustering to identify similar waveforms buried in years of seismic recordings using a single seismometer. Instead of relying on existing earthquake catalogs of larger earthquakes, RSD identifies characteristic repetitive waveforms by rapidly identifying signals of interest above a low signal-to-noise ratio and then grouping based on spectral and time domain characteristics, resulting in dramatically shorter processing time than more exhaustive autocorrelation approaches. We investigate seismicity in four regions using RSD: (1) volcanic seismicity at Mammoth Mountain, California, (2) subduction-related seismicity in Oaxaca, Mexico, (3) induced seismicity in Central Alberta, Canada, and (4) induced seismicity in Harrison County, Ohio. In each case, RSD detects a similar or larger number of earthquakes than existing catalogs created using more time intensive methods. In Harrison County, RSD identifies 18 seismic sequences that correlate temporally and spatially to separate hydraulic fracturing operations, 15 of which were previously unreported. RSD utilizes a single seismometer for earthquake detection which enables seismicity to be quickly identified in poorly instrumented regions at the expense of relying on another method to locate the new detections. Due to the smaller computation overhead and success at distances up to ~50 km, RSD is well suited for real-time detection of low-magnitude earthquake swarms with permanent regional networks.
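    As a rough sketch of the grouping idea behind RSD (not the published algorithm itself), the following single-linkage clustering joins waveform windows whose peak normalized cross-correlation exceeds a threshold; the synthetic waveforms and the 0.8 threshold are illustrative assumptions.

```python
import numpy as np

def max_norm_xcorr(a, b):
    """Peak normalized cross-correlation between two equal-length windows."""
    a = (a - a.mean()) / (np.linalg.norm(a - a.mean()) + 1e-12)
    b = (b - b.mean()) / (np.linalg.norm(b - b.mean()) + 1e-12)
    return float(np.max(np.correlate(a, b, mode="full")))

def cluster_waveforms(windows, threshold=0.8):
    """Single-linkage grouping via union-find: windows whose peak correlation
    exceeds the threshold end up in the same cluster."""
    n = len(windows)
    parent = list(range(n))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i
    for i in range(n):
        for j in range(i + 1, n):
            if max_norm_xcorr(windows[i], windows[j]) >= threshold:
                parent[find(i)] = find(j)
    labels = [find(i) for i in range(n)]
    remap = {}  # relabel clusters 0..k-1 in order of first appearance
    return [remap.setdefault(l, len(remap)) for l in labels]

# Demo: three noisy copies of one source pulse plus one unrelated waveform.
rng = np.random.default_rng(1)
t = np.linspace(0, 1, 200)
pulse = np.exp(-60 * (t - 0.3) ** 2) * np.sin(2 * np.pi * 15 * t)
other = np.sin(2 * np.pi * 3 * t)
windows = [pulse + 0.05 * rng.standard_normal(200) for _ in range(3)]
windows.append(other + 0.05 * rng.standard_normal(200))
labels = cluster_waveforms(windows)
```

    RSD additionally screens candidates by spectral characteristics before grouping, which keeps the pairwise comparisons far cheaper than exhaustive autocorrelation of the full record.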

  11. Sand Volcano Following Earthquake

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Sand boil or sand volcano measuring 2 m (6.6 ft.) in length erupted in the median of Interstate Highway 80 west of the Bay Bridge toll plaza when ground shaking transformed a loose, water-saturated deposit of subsurface sand into a sand-water slurry (liquefaction) in the October 17, 1989, Loma Prieta earthquake. The vented sand contains marine-shell fragments. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or can even stick and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions, such as earthquakes, or when powders are handled in industrial processes. Mechanics of Granular Materials (MGM) experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grained materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights; another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. (Credit: J.C. Tinsley, U.S. Geological Survey)

  12. Aftershock stress analysis of the April 2015 Mw 7.8 Gorkha earthquake from the NAMASTE project

    NASA Astrophysics Data System (ADS)

    Pant, M.; Velasco, A. A.; Karplus, M. S.; Patlan, E.; Ghosh, A.; Nabelek, J.; Kuna, V. M.; Sapkota, S. N.; Adhikari, L. B.; Klemperer, S. L.

    2016-12-01

    Continental collision between the Indian and Eurasian plates, converging at 45 mm/yr, has uplifted northern Nepal, forming the Himalaya. Because of this convergence, the region has experienced large, devastating earthquakes, including the 1934 Mw 8.4 Nepal-Bihar earthquake and two recent earthquakes on April 25, 2015 (Mw 7.8, the Gorkha earthquake) and May 12, 2015 (Mw 7.2). These quakes killed thousands of people and caused billions of dollars in property loss. Despite some recent geologic and geophysical studies of this area, many tectonic questions remain unanswered. Shortly after the Gorkha earthquake, we deployed a seismic network, NAMASTE (Nepal Array Measuring Aftershock Seismicity Trailing Earthquake), to study the aftershocks of these two large events. The network included 45 seismic stations (16 short-period, 25 broadband, and 4 strong-motion sensors) spanning the Gorkha rupture area. The deployment extended from south of the Main Frontal Thrust (MFT) to the Main Central Thrust (MCT) region, and it recorded aftershocks for more than ten months, from June 2015 to May 2016. We are leveraging high-precision earthquake locations and picking P-wave first-motion polarities to develop a catalog of focal mechanisms for the larger aftershocks. We will use this catalog to relate seismicity to the stress state of the India-Eurasia plate margin, with the aim of addressing questions about the complex fault geometries and future earthquake hazards at this plate margin.

  13. Reverse Engineering Cellular Networks with Information Theoretic Methods

    PubMed Central

    Villaverde, Alejandro F.; Ross, John; Banga, Julio R.

    2013-01-01

    Building mathematical models of cellular networks lies at the core of systems biology. It involves, among other tasks, the reconstruction of the structure of interactions between molecular components, which is known as network inference or reverse engineering. Information theory can help in the goal of extracting as much information as possible from the available data. A large number of methods founded on these concepts have been proposed in the literature, not only in biology journals, but in a wide range of areas. Their critical comparison is difficult due to the different focuses and the adoption of different terminologies. Here we attempt to review some of the existing information theoretic methodologies for network inference, and clarify their differences. While some of these methods have achieved notable success, many challenges remain, among which we can mention dealing with incomplete measurements, noisy data, counterintuitive behaviour emerging from nonlinear relations or feedback loops, and computational burden of dealing with large data sets. PMID:24709703
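    A minimal instance of the information-theoretic approach is a "relevance network" that connects variable pairs whose estimated mutual information exceeds a threshold. The histogram MI estimator, the threshold, and the toy "expression" data below are illustrative assumptions, not a method from the review.

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Histogram estimate of mutual information I(X;Y) in nats."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    mask = pxy > 0
    return float(np.sum(pxy[mask] * np.log(pxy[mask] / (px @ py)[mask])))

def infer_edges(data, threshold):
    """Relevance-network inference: connect pairs whose MI exceeds threshold."""
    names = sorted(data)
    edges = set()
    for i, a in enumerate(names):
        for b in names[i + 1:]:
            if mutual_information(data[a], data[b]) >= threshold:
                edges.add((a, b))
    return edges

# Toy data: y depends nonlinearly on x; z is independent noise.
rng = np.random.default_rng(2)
x = rng.normal(size=2000)
data = {
    "x": x,
    "y": np.sin(x) + 0.1 * rng.normal(size=2000),
    "z": rng.normal(size=2000),
}
edges = infer_edges(data, threshold=0.3)
```

    The nonlinear x-y link is exactly the kind of dependency a correlation-based method can miss but an MI-based method captures; the threshold must sit above the estimator's small positive bias on independent pairs.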

  14. Site Effects Study In Athens (greece) Using The 7th September 1999 Earthquake Aftershock Sequence

    NASA Astrophysics Data System (ADS)

    Serpetsidaki, A.; Sokos, E.

    On 7 September 1999 at 11:56:50 GMT, an earthquake of Mw = 5.9 occurred near Athens, the capital of Greece. The epicenter was located in the northwest area of Parnitha Mountain, 18 km from the city centre. This earthquake was one of the most destructive in Greece in modern times. The intensity of the earthquake reached IX in the northwestern districts of the city, and the earthquake caused the deaths of 143 people and serious structural damage to many buildings. On 13 September the Seismological Laboratory of Patras University installed a seismic network of 30 stations to observe the evolution of the aftershock sequence. This temporary network remained in the Attika area for 50 days and recorded a significant part of the aftershock sequence. In this paper we use the high-quality recordings of this network to investigate the influence of surface geology on seismic motion at sites within the epicentral area that suffered the most during this earthquake. We applied the horizontal-to-vertical (H/V) spectral ratio method to noise and to earthquake records, and the two sets of results agree very well. Finally, we compare the results with the geological conditions of the study area and the damage distribution. Most of the obtained amplification levels were low, with an exception at the site of Ano Liosia, where significant damage was observed and the results indicate that the earthquake motion was amplified by a factor of four. Based on the above, we conclude that the damage in the city of Athens was due to source effects rather than site effects.
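    The H/V method compares the horizontal and vertical amplitude spectra of the same record; a site resonance appears as a peak in the ratio. The following is a minimal single-window sketch on synthetic data (real applications average smoothed spectra over many windows); the 5 Hz resonance and noise levels are invented for the demo.

```python
import numpy as np

def hv_ratio(north, east, vertical, fs):
    """Single-window H/V spectral ratio (Nakamura-style, no smoothing).

    Returns (freqs, ratio), where ratio is the quadratic-mean horizontal
    amplitude spectrum divided by the vertical amplitude spectrum."""
    taper = np.hanning(len(vertical))
    N = np.abs(np.fft.rfft(north * taper))
    E = np.abs(np.fft.rfft(east * taper))
    V = np.abs(np.fft.rfft(vertical * taper))
    freqs = np.fft.rfftfreq(len(vertical), d=1.0 / fs)
    ratio = np.sqrt((N**2 + E**2) / 2.0) / (V + 1e-12)
    return freqs, ratio

# Synthetic record: horizontal components resonate near 5 Hz, vertical does not.
fs = 100.0
t = np.arange(0, 20.0, 1.0 / fs)
rng = np.random.default_rng(3)
noise = lambda: 0.1 * rng.standard_normal(t.size)
north = np.sin(2 * np.pi * 5.0 * t) + noise()
east = np.cos(2 * np.pi * 5.0 * t) + noise()
vertical = 0.5 * np.sin(2 * np.pi * 15.0 * t) + noise()
freqs, ratio = hv_ratio(north, east, vertical, fs)
band = (freqs > 0.5) & (freqs < 40.0)
peak_freq = freqs[band][np.argmax(ratio[band])]
```

    The peak of the ratio recovers the assumed horizontal resonance near 5 Hz, which is how the method flags sites (like Ano Liosia in the abstract) where surface geology amplifies shaking.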

  15. RICHTER: A Smartphone Application for Rapid Collection of Geo-Tagged Pictures of Earthquake Damage

    NASA Astrophysics Data System (ADS)

    Skinnemoen, H.; Bossu, R.; Furuheim, K.; Bjorgo, E.

    2010-12-01

    RICHTER (Rapid geo-Images for Collaborative Help Targeting Earthquake Response) is a smartphone version of a professional application developed to provide high-quality geo-tagged image communication over challenging network links, such as satellites and poor mobile links. Developed for Android mobile phones, it allows eyewitnesses to share their pictures of earthquake damage easily and without cost with the Euro-Mediterranean Seismological Centre (EMSC). The goal is to engage citizens in the collection of the most up-to-date visual information on local damage for improved rapid impact assessment. RICHTER integrates the innovative and award-winning ASIGN protocol, initially developed for satellite communication between cameras / computers / satcom terminals and servers at HQ. ASIGN is a robust and optimal image and video communication management solution for bandwidth-limited communication networks, developed for use particularly in emergency and disaster situations. Contrary to a simple Multimedia Messaging System (MMS), RICHTER allows access to high-definition images with embedded location information. Location is automatically assigned from the internal GPS, derived from the mobile network (triangulation), or taken from the current Wi-Fi domain, in that order, as this corresponds to the expected positioning accuracy. Pictures are typically compressed to 20-30 KB for fast transfer and to avoid network overload. Full-size images can be requested by the EMSC either fully automatically or on a case-by-case basis, depending on the user preferences. ASIGN was initially developed in coordination with INMARSAT and the European Space Agency. It was used by the Rapid Mapping Unit of the United Nations, notably for the damage assessment of the January 12, 2010 Haiti earthquake, where more than 700 photos were collected. RICHTER will be freely distributed on the EMSC website to eyewitnesses in the event of significantly damaging earthquakes. 
The EMSC is the second

  16. Relocation of micro-earthquakes in the Yeongdeok offshore area, Korea using local and Ocean bottom seismometers

    NASA Astrophysics Data System (ADS)

    HAN, M.; Kim, K. H.; Park, S. C.; Lin, P. P.; Chen, P.; Chang, H.; Jang, J. P.; Kuo, B. Y.; Liao, Y. C.

    2016-12-01

    Seismicity in the East Sea of Korea has been relatively high during the last four decades of the instrumental earthquake observation period, and the Yeongdeok offshore area is probably the most seismically active area in the East Sea. This study analyzes seismic signals to detect micro-earthquakes and determine their precise hypocenters in the Yeongdeok offshore area, using data recorded by the Korea National Seismic Network (KNSN) and a temporary ocean-bottom seismographic network (OBSN-PNU), operated by the Korea Meteorological Administration and Pusan National University, respectively. Continuous waveform data recorded between January 2007 and July 2016 at four KNSN stations in the study area are inspected for repeating earthquakes by applying a waveform cross-correlation detector. More than 1,600 events are triggered; events outside the study area or with poor waveform quality are removed from further analysis. Approximately 500 earthquakes are selected, most of which had gone unreported because their magnitudes are below the detection threshold of routine earthquake monitoring. Events in the study area also suffer from poor azimuthal coverage, because all stations are located on land and thus biased to the west. OBSN-PNU comprised three ocean-bottom seismometers and operated between February and August 2016 to observe micro-earthquakes in the study area; the same detection technique applied to the KNSN data has been applied to the OBSN-PNU data. Precise earthquake hypocenters are determined using phase arrival times and waveform similarities. The resulting hypocenters cluster to form a few lineaments, which are compared to local geological and geophysical features to understand micro-earthquake activity in the area.

  17. Source Mechanisms of Destructive Tsunamigenic Earthquakes occurred along the Major Subduction Zones

    NASA Astrophysics Data System (ADS)

    Yolsal-Çevikbilen, Seda; Taymaz, Tuncay; Ulutaş, Ergin

    2016-04-01

    Subduction zones, where an oceanic plate is thrust down into the mantle by tectonic forces, are potential tsunami source regions. Many large, destructive tsunamigenic earthquakes (Mw > 7.5) and high-amplitude tsunami waves have been observed along the major subduction zones, particularly near Indonesia, Japan, the Kuril and Aleutian Islands, the Gulf of Alaska, and South America. Not all earthquakes are tsunamigenic: to generate a tsunami, an earthquake must occur under or near the ocean, be large, and create significant vertical movement of the seafloor. Tsunamigenic earthquakes are also known to release their energy over a couple of minutes, with long source time functions and slow, smooth ruptures. In this study, we performed point-source inversions using teleseismic long-period P- and SH-waveforms and broadband P-waveforms recorded by the Federation of Digital Seismograph Networks (FDSN) and the Global Digital Seismograph Network (GDSN) stations. We obtained source mechanism parameters and finite-fault slip distributions of ten recent destructive earthquakes (Mw ≥ 7.5) by comparing the shapes and amplitudes of long-period P- and SH-waveforms, recorded at distances of 30° - 90°, with synthetic waveforms. We further obtained finite-fault rupture histories of those earthquakes to determine the faulting area (fault length and width), maximum displacement, rupture duration, and stress drop. We also applied a new back-projection method that uses teleseismic P-waveforms to integrate the direct P-phase with phases reflected from structural discontinuities near the source, customized to estimate the spatio-temporal distribution of the seismic energy release of earthquakes. The inversion results show that recent tsunamigenic earthquakes have dominantly thrust-faulting mechanisms with small strike-slip components, and their focal depths are relatively shallow (h < 40 km). As an example, the September 16, 2015 Illapel (Chile) earthquake (Mw: 8.3; h: 26 km

  18. Impacts of Social Network on Therapeutic Community Participation: A Follow-up Survey of Data Gathered after Ya'an Earthquake.

    PubMed

    Li, Zhichao; Chen, Yao; Suo, Liming

    2015-01-01

    In recent years, natural disasters and the accompanying health risks have become more frequent, and rehabilitation work has become an important part of government performance. On one hand, social networks play an important role in participants' therapeutic community participation and physical and mental recovery. On the other hand, therapeutic communities with widespread participation can also contribute to community recovery after disaster. This paper describes a field study in an earthquake-stricken area of Ya'an. A set of 3-stage follow-up data was obtained concerning the villagers' participation in therapeutic communities, social network status, demographic background, and other factors. The Hierarchical Linear Model (HLM) method was used to investigate the effects of social networks on therapeutic community participation. First, social networks had significant impacts on the annual changes in therapeutic community participation. Second, there were clear differences in education between the groups mobilized by self-organizations and by local government; however, both exerted their mobilization force through acquaintance networks. Third, villagers' local cadre networks negatively influenced the activities of self-organized therapeutic communities, while positively influencing government-organized therapeutic activities. This paper suggests that relevant government departments need to focus more on the reconstruction and cultivation of villagers' social networks and social capital in the process of post-disaster recovery. These findings contribute to a better understanding of how social networks influence therapeutic community participation, and of the role local government can play in post-disaster recovery and public health improvement after natural disasters.

  20. Rapid Characterization of Large Earthquakes in Chile

    NASA Astrophysics Data System (ADS)

    Barrientos, S. E.; Team, C.

    2015-12-01

    Chile, along 3000 km of its 4200 km long coast, is regularly affected by very large earthquakes (up to magnitude 9.5) resulting from the convergence and subduction of the Nazca plate beneath the South American plate. These megathrust earthquakes exhibit rupture zones several hundred kilometers long, with fault displacements of several tens of meters. Characterizing these giant events with minimum delay, to establish their rupture extent and slip distribution, is of the utmost importance for rapid estimation of the shaking area and evaluation of their tsunamigenic potential, particularly when there are only a few minutes to warn the coastal population to take immediate action. In Chile, this task of rapid evaluation of large earthquakes is accomplished through a network of sensors being implemented by the National Seismological Center of the University of Chile. The network is composed of approximately one hundred broad-band and strong-motion instruments and 130 GNSS devices; all will be connected in real time. Forty units have an optional RTX capability, in which satellite orbit and clock corrections are sent to the field device, producing a 1-Hz position stream at the 4-cm accuracy level. Tests are being conducted to stream the real-time raw data for later processing at the central facility. Hypocentral locations and magnitudes are estimated within a few minutes by automatic processing software based on wave arrivals; for magnitudes below 7.0 this rapid estimation works within acceptable bounds. For larger events, we are currently developing automatic detectors and amplitude estimators of displacement derived from the real-time GNSS streams. This software has been tested on several cases, showing that for plate-interface events the minimum detectable magnitude lies between 6.2 and 6.5 (1-2 cm of coastal displacement), providing an excellent tool for early earthquake characterization from a tsunamigenic perspective.
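
    The displacement-amplitude detection described above can be illustrated with a toy static-offset detector on a 1-Hz GNSS position stream. This is only a sketch: the window lengths and the 2 cm threshold are illustrative choices, not the operational settings of the Chilean system.

```python
import numpy as np

def static_offset(series, pre=60, post=60):
    """Difference between post-event and pre-event window means (metres)."""
    return float(np.mean(series[-post:]) - np.mean(series[:pre]))

# Synthetic 1-Hz position stream: ~4 mm scatter, then a 5 cm coseismic step.
rng = np.random.default_rng(0)
series = rng.normal(0.0, 0.004, 300)
series[180:] += 0.05

offset = static_offset(series)
triggered = abs(offset) > 0.02   # illustrative 2 cm detection threshold
```

    Averaging over a minute of 1-Hz samples suppresses the high-frequency position scatter, so even a centimetre-level static step stands out clearly against millimetre-level noise.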

  1. Integrated Land- and Underwater-Based Sensors for a Subduction Zone Earthquake Early Warning System

    NASA Astrophysics Data System (ADS)

    Pirenne, B.; Rosenberger, A.; Rogers, G. C.; Henton, J.; Lu, Y.; Moore, T.

    2016-12-01

    Ocean Networks Canada (ONC — oceannetworks.ca/ ) operates cabled ocean observatories off the coast of British Columbia (BC) to support research and operational oceanography. Recently, ONC has been funded by the Province of BC to deliver an earthquake early warning (EEW) system that integrates offshore and land-based sensors to deliver alerts of incoming ground shaking from the Cascadia Subduction Zone. ONC's cabled seismic network has the unique advantage of being located offshore on either side of the surface expression of the subduction zone. The proximity of ONC's sensors to the fault can result in faster, more effective warnings, which translates into more lives saved, injuries avoided, and greater opportunity for mitigative actions. ONC delivers near-real-time data from various instrument types simultaneously, providing distinct advantages for seismic monitoring and earthquake early warning. The EEW system consists of a network of sensors, located on the ocean floor and on land, that detect and analyze the initial P-wave of an earthquake as well as the crustal deformation on land during the earthquake sequence. Once the P-wave is detected and characterized, software systems correlate the data streams of the various sensors and deliver alerts to clients in a Common Alerting Protocol-compliant data package. This presentation will focus on the development of the earthquake early warning capacity at ONC. It will describe the seismic sensors and their distribution, the P-wave detection algorithms selected, and the overall architecture of the system. It will also outline the plan to achieve operational readiness at project completion.
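
    A classic STA/LTA energy-ratio trigger is one simple way to sketch the P-wave detection step. ONC's actual detection algorithms are not specified in the abstract; the window lengths and threshold below are illustrative assumptions.

```python
import numpy as np

def moving_mean(a, n):
    """Causal running mean over the last n samples (shorter at the start)."""
    s = np.convolve(a, np.ones(n), mode="full")[:len(a)]
    return s / np.minimum(np.arange(1, len(a) + 1), n)

def sta_lta(x, fs, sta_win=1.0, lta_win=30.0):
    """Short-term over long-term average of signal energy."""
    e = x ** 2
    sta = moving_mean(e, int(sta_win * fs))
    lta = moving_mean(e, int(lta_win * fs))
    return sta / np.maximum(lta, 1e-12)

fs = 100
rng = np.random.default_rng(1)
x = rng.normal(0.0, 1.0, 60 * fs)
x[3000:3200] += 10.0 * rng.normal(0.0, 1.0, 200)   # strong P arrival at t = 30 s

ratio = sta_lta(x, fs)
trigger = int(np.argmax(ratio > 3.0))              # first sample above threshold
```

    The short window reacts within a second of the P onset while the long window still reflects the pre-event noise level, so the ratio jumps sharply at the arrival.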

  2. Revisiting Notable Earthquakes and Seismic Patterns of the Past Decade in Alaska

    NASA Astrophysics Data System (ADS)

    Ruppert, N. A.; Macpherson, K. A.; Holtkamp, S. G.

    2015-12-01

    Alaska, the most seismically active region of the United States, has produced five earthquakes with magnitudes greater than seven since 2005. The 2007 M7.2 and 2013 M7.0 Andreanof Islands earthquakes were representative of the most common source of significant seismic activity in the region, the Alaska-Aleutian megathrust. The 2013 M7.5 Craig earthquake, a strike-slip event on the Queen-Charlotte fault, occurred along the transform plate boundary in southeast Alaska. The largest earthquake of the past decade, the 2014 M7.9 Little Sitkin event in the western Aleutians, occurred at an intermediate depth and ruptured along a gently dipping fault through nearly the entire thickness of the subducted Pacific plate. Along with these major earthquakes, the Alaska Earthquake Center reported over 250,000 seismic events in the state over the last decade, and its earthquake catalog surpassed 500,000 events in mid-2015. Improvements in monitoring networks and processing techniques allowed an unprecedented glimpse into earthquake patterns in Alaska. Some notable recent earthquake sequences include the 2008 Kasatochi eruption, the 2006-2008 M6+ crustal earthquakes in the central and western Aleutians, the 2010 and 2015 Bering Sea earthquakes, the 2014 Noatak swarm, and the 2014 Minto earthquake sequence. In 2013, the EarthScope USArray project made its way into Alaska. There are now almost 40 new Transportable Array stations in Alaska along with over 20 upgraded sites. This project is changing the earthquake-monitoring scene in Alaska, lowering the magnitude of completeness across large, newly instrumented parts of the state.

  3. The CE3R Network: current status and future perspectives

    NASA Astrophysics Data System (ADS)

    Lenhardt, Wolfgang; Pesaresi, Damiano; Živčić, Mladen; Costa, Giovanni; Kuk, Kresimir; Bondár, István; Duni, Llambro; Spacek, Petr

    2016-04-01

    In order to improve the monitoring of seismic activity in border regions and to enhance collaboration between countries and seismological institutions in Central Europe, the Environment Agency of the Republic of Slovenia (ARSO), the Italian National Institute for Oceanography and Experimental Geophysics (OGS), the University of Trieste (UniTS) and the Austrian Central Institute for Meteorology and Geodynamics (ZAMG) established the "South Eastern Alps Transfrontier Seismological Network" in 2001. In May 2014, ARSO, OGS, UniTS and ZAMG agreed to formalize the transfrontier network and to rename it the "Central and East European Earthquake Research Network" (CE3RN or CE3R Network), both to identify it geographically, since cross-border networks can be established in other areas of the world, and to open the cooperation to institutions in other countries. The University of Zagreb (UniZG) joined CE3RN in October 2014, the Kövesligethy Radó Seismological Observatory (KRSZO) of the Hungarian Academy of Sciences in October 2015, the Institute of Geosciences, Energy, Water and Environment (IGEWE) of the Polytechnic University of Tirana in November 2015, and the Institute of Physics of the Earth (IPE) of Masaryk University in Brno in November 2015. The CE3RN parties intend to formalize and possibly extend their ongoing cooperation in seismological data acquisition, exchange, and use for seismological, earthquake engineering, and civil protection purposes. The purpose of this cooperation is to retain and expand the existing cross-border network, specify rules of conduct for network management, improvements, extensions, and enlargements, enhance seismological research in the region, and support civil protection activities. Since the formal establishment of CE3RN, several common projects have been completed, such as the SeismoSAT project for connecting the seismic data centers over satellite, funded by the Interreg

  4. Changes in the Seismicity and Focal Mechanism of Small Earthquakes Prior to an MS 6.7 Earthquake in the Central Aleutian Island Arc

    USGS Publications Warehouse

    Billington, Serena; Engdahl, E.R.; Price, Stephanie

    1981-01-01

    On November 4, 1977, a magnitude Ms 6.7 (mb 5.7) shallow-focus thrust earthquake occurred in the vicinity of the Adak seismographic network in the central Aleutian island arc. The earthquake and its aftershock sequence occurred in an area that had not experienced a similar sequence since at least 1964. About 13 1/2 months before the main shock, the rate of occurrence of very small-magnitude earthquakes increased abruptly in the immediate vicinity of the impending main shock. To search for possible variations in the focal mechanisms of small events preceding the main shock, a method was developed that objectively combines first-motion data to generate composite focal-mechanism information for events occurring within a small source region. The method could not be successfully applied to the whole study area, but the results show that, starting about 10 1/2 months before the November 1977 earthquake, there was a change in the mechanism of small- to moderate-sized earthquakes in the immediate vicinity of the hypocenter, and possibly in other parts of the eventual aftershock zone, but not in the surrounding regions.

  5. W phase source inversion for moderate to large earthquakes (1990-2010)

    USGS Publications Warehouse

    Duputel, Zacharie; Rivera, Luis; Kanamori, Hiroo; Hayes, Gavin P.

    2012-01-01

    Rapid characterization of the earthquake source and of its effects is a growing field of interest. Until recently, it still took several hours to determine the first-order attributes of a great earthquake (e.g. Mw ≥ 7.5), even in a well-instrumented region. The main limiting factors were data saturation, the interference of different phases, and the time duration and spatial extent of the source rupture. To accelerate centroid moment tensor (CMT) determinations, we have developed a source inversion algorithm based on modelling of the W phase, a very long period phase (100–1000 s) arriving at the same time as the P wave. The purpose of this work is to finely tune and validate the algorithm for large-to-moderate-sized earthquakes using three components of W phase ground motion at teleseismic distances. To that end, the point source parameters of all Mw ≥ 6.5 earthquakes that occurred between 1990 and 2010 (815 events) are determined using Federation of Digital Seismograph Networks and Global Seismographic Network broad-band stations and STS1 global virtual networks of the Incorporated Research Institutions for Seismology Data Management Center. For each event, a preliminary magnitude obtained from W phase amplitudes is used to estimate the initial moment rate function half duration and to define the corner frequencies of the passband filter that will be applied to the waveforms. Starting from these initial parameters, the seismic moment tensor is calculated using a preliminary location as a first approximation of the centroid. A full CMT inversion is then conducted for centroid timing and location determination. Comparisons with Harvard and Global CMT solutions highlight the robustness of W phase CMT solutions at teleseismic distances. The differences in Mw rarely exceed 0.2 and the source mechanisms are very similar to one another. Difficulties arise when a target earthquake is shortly (e.g. within 10 hr) preceded by another large earthquake, which disturbs the
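
    The initialization step, deriving a source half duration and a filter band from the preliminary magnitude, can be sketched as follows. The moment-magnitude relation is the standard one; the cube-root half-duration coefficient and the passband table below are assumptions for illustration, not the algorithm's exact settings.

```python
def moment_from_mw(mw):
    """Scalar seismic moment in N*m from moment magnitude (standard relation)."""
    return 10 ** (1.5 * mw + 9.1)

def half_duration(mw, c=1.2e-8):
    """Source half duration (s) from a moment-cube-root scaling.
    The moment is expressed in dyne-cm; the coefficient c is an assumed value."""
    m0_dyne_cm = moment_from_mw(mw) * 1e7   # 1 N*m = 1e7 dyne-cm
    return c * m0_dyne_cm ** (1.0 / 3.0)

def passband(mw):
    """Hypothetical magnitude-dependent corner periods (s) for the bandpass
    filter; the actual algorithm tabulates these differently."""
    if mw >= 8.0:
        return (200.0, 1000.0)
    if mw >= 7.0:
        return (120.0, 600.0)
    return (100.0, 360.0)

# A preliminary Mw 9.0 gives a half duration on the order of a minute.
hd = half_duration(9.0)
```

    Because the half duration scales with the cube root of the moment, a great earthquake automatically gets a longer assumed source time function and a longer-period passband before the full CMT inversion is run.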

  6. Regional Moment Tensor Analysis of Earthquakes in Iran for 2010 to 2017 Using In-Country Data

    NASA Astrophysics Data System (ADS)

    Graybeal, D.; Braunmiller, J.

    2017-12-01

    Located in the middle of the Arabia-Eurasia continental collision, Iran is one of the most tectonically diverse and seismically active countries in the world. Until recently, however, seismic source parameter studies had to rely on teleseismic data or on data from temporary local arrays, which limited the scope of investigations. Relatively new broadband seismic networks operated by the Iranian Institute of Engineering Seismology (IIEES) and the Iranian Seismological Center (IRSC) currently consist of more than 100 stations and allow, for the first time, routine three-component full-waveform regional moment tensor analysis of the numerous M≥4.0 earthquakes that occur throughout the country. We use openly available, in-country data and include data from nearby permanent broadband stations available through IRIS and EIDA to improve azimuthal coverage for events in border regions. For the period from 2010 to 2017, we have obtained about 500 moment tensors for earthquakes ranging from Mw=3.6 to 7.8. The resulting database provides a unique, detailed view of deformation styles and earthquake depths in Iran. Overall, we find mainly thrust and strike-slip mechanisms as expected considering the convergent tectonic setting. Our magnitudes (Mw) are slightly smaller than ML and mb but comparable to Mw as reported in global catalogs (USGS ANSS). Event depths average about 3 km shallower than in global catalogs and are well constrained considering the capability of regional waveforms to resolve earthquake depth. Our dataset also contains several large magnitude main shock-aftershock sequences from different tectonic provinces, including the 2012 Ahar-Varzeghan (Mw=6.4), 2013 Kaki (Mw=6.5), and 2014 Murmuri (Mw=6.2) earthquakes. The most significant result in terms of seismogenesis and seismic hazard is that the vast majority of earthquakes occur at shallow depth, not in deeper basement. Our findings indicate that more than 80% of crustal seismicity in Iran likely occurs at

  7. Anomalies of rupture velocity in deep earthquakes

    NASA Astrophysics Data System (ADS)

    Suzuki, M.; Yagi, Y.

    2010-12-01

    Explaining deep seismicity is a long-standing challenge in earth science. Below 300 km, the occurrence rate of earthquakes with depth remains at a low level until ~530 km depth, then rises until ~600 km, and finally terminates near 700 km. Given the difficulty of estimating fracture properties and observing the stress field in the mantle transition zone (410-660 km), the seismic source processes of deep earthquakes are the most important information for understanding the distribution of deep seismicity. However, in compilations of seismic source models of deep earthquakes, the source parameters for individual events are quite varied [Frohlich, 2006]. Rupture velocities for deep earthquakes estimated using seismic waveforms range from 0.3 to 0.9 Vs, where Vs is the shear wave velocity, a considerably wider range than for shallow earthquakes. The uncertainty of seismic source models prevents us from determining the main characteristics of the rupture process and understanding the physical mechanisms of deep earthquakes. Recently, the back projection method has been used to derive detailed and stable seismic source images from dense seismic network observations [e.g., Ishii et al., 2005; Walker et al., 2005]. Using this method, we can obtain an image of the seismic source process from the observed data without a priori constraints or discarded parameters. We applied the back projection method to teleseismic P-waveforms of 24 large, deep earthquakes (moment magnitude Mw ≥ 7.0, depth ≥ 300 km) recorded since 1994 by the Data Management Center of the Incorporated Research Institutions for Seismology (IRIS-DMC) and reported in the U.S. Geological Survey (USGS) catalog, and constructed seismic source models of deep earthquakes. By imaging the seismic rupture process for a set of recent deep earthquakes, we found that the rupture velocities are less than about 0.6 Vs except in the depth range of 530 to 600 km. This is consistent with the depth
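
    The back-projection idea, shifting each station's waveform by the travel time predicted for a candidate source and stacking, can be sketched in a 1-D constant-velocity toy. Real applications use teleseismic travel-time tables and dense arrays; the geometry and velocity here are invented for illustration.

```python
import numpy as np

def backproject(waveforms, dt, station_x, candidates, v=6.0):
    """Stack power for each candidate source position (1-D toy geometry)."""
    n = waveforms.shape[1]
    power = np.zeros(len(candidates))
    for i, c in enumerate(candidates):
        stack = np.zeros(n)
        for sx, w in zip(station_x, waveforms):
            shift = int(round(abs(sx - c) / v / dt))   # predicted travel time
            stack[:n - shift] += w[shift:]             # undo the delay
        power[i] = np.max(stack ** 2)
    return power

dt, n = 0.1, 600
station_x = np.array([-200.0, -120.0, 150.0, 240.0])
src = 30.0
waveforms = np.zeros((4, n))
for k, sx in enumerate(station_x):
    waveforms[k, int(round(abs(sx - src) / 6.0 / dt))] = 1.0  # impulsive P

candidates = np.arange(-50.0, 110.0, 10.0)
power = backproject(waveforms, dt, station_x, candidates)
best = candidates[int(np.argmax(power))]
```

    Only at the true source position do all four arrivals align coherently, so the stack power peaks there; no source parameterization or a priori constraint is needed, which is the appeal of the method for deep events.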

  8. Earthquake Early Warning: A Prospective User's Perspective (Invited)

    NASA Astrophysics Data System (ADS)

    Nishenko, S. P.; Savage, W. U.; Johnson, T.

    2009-12-01

    With more than 25 million people at risk from high hazard faults in California alone, Earthquake Early Warning (EEW) presents a promising public safety and emergency response tool. EEW represents the real-time end of an earthquake information spectrum which also includes near real-time notifications of earthquake location, magnitude, and shaking levels; as well as geographic information system (GIS)-based products for compiling and visually displaying processed earthquake data such as ShakeMap and ShakeCast. Improvements to and increased multi-national implementation of EEW have stimulated interest in how such information products could be used in the future. Lifeline organizations, consisting of utilities and transportation systems, can use both onsite and regional EEW information as part of their risk management and public safety programs. Regional EEW information can provide improved situational awareness to system operators before automatic system protection devices activate, and allow trained personnel to take precautionary measures. On-site EEW is used for earthquake-actuated automatic gas shutoff valves, triggered garage door openers at fire stations, system controls, etc. While there is no public policy framework for preemptive, precautionary electricity or gas service shutdowns by utilities in the United States, gas shut-off devices are being required at the building owner level by some local governments. In the transportation sector, high-speed rail systems have already demonstrated the ‘proof of concept’ for EEW in several countries, and more EEW systems are being installed. Recently the Bay Area Rapid Transit District (BART) began collaborating with the California Integrated Seismic Network (CISN) and others to assess the potential benefits of EEW technology to mass transit operations and emergency response in the San Francisco Bay region. A key issue in this assessment is that significant earthquakes are likely to occur close to or within the BART

  9. Twin ruptures grew to build up the giant 2011 Tohoku, Japan, earthquake

    PubMed Central

    Maercklin, Nils; Festa, Gaetano; Colombelli, Simona; Zollo, Aldo

    2012-01-01

    The 2011 Tohoku megathrust earthquake had an unexpected size for the region. To image the earthquake rupture in detail, we applied a novel backprojection technique to waveforms from local accelerometer networks. The earthquake began as a small-size twin rupture, slowly propagating mainly updip and triggering the break of a larger asperity at shallower depths, resulting in up to 50 m of slip and causing high-amplitude tsunami waves. For a long time the rupture remained within a 100–150 km wide slab segment delimited by oceanic fractures, before propagating further to the southwest. The occurrence of large slip at shallow depths likely favored the propagation across contiguous slab segments and contributed to the build-up of a giant earthquake. Lateral variations in the slab geometry may act as geometrical or mechanical barriers, ultimately controlling the nucleation, evolution, and arrest of the earthquake rupture. PMID:23050093

  10. Metabolic network flux analysis for engineering plant systems.

    PubMed

    Shachar-Hill, Yair

    2013-04-01

    Metabolic network flux analysis (NFA) tools have proven to be powerful aids to the metabolic engineering of microbes by providing quantitative insights into the flows of material and energy through cellular systems. The development and application of NFA tools to plant systems has advanced in recent years and is yielding significant insights and testable predictions. Plants present substantial opportunities for the practical application of NFA, but they also pose serious challenges related to the complexity of plant metabolic networks and to deficiencies in our knowledge of their structure and regulation. By considering the available tools and selected examples, this article attempts to assess where and how NFA is most likely to have a real impact on plant biotechnology.
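
    The core of steady-state flux analysis, solving S v = 0 for unmeasured fluxes given the stoichiometric matrix S and a few measured fluxes, can be sketched on an invented 3-reaction toy network (the reactions and values are illustrative, not from any real organism):

```python
import numpy as np

def solve_fluxes(S, known):
    """Fill in unknown steady-state fluxes given S v = 0 and measured fluxes."""
    n = S.shape[1]
    kidx = sorted(known)
    uidx = [j for j in range(n) if j not in known]
    vk = np.array([known[j] for j in kidx])
    vu, *_ = np.linalg.lstsq(S[:, uidx], -S[:, kidx] @ vk, rcond=None)
    v = np.zeros(n)
    v[kidx] = vk
    v[uidx] = vu
    return v

# Reactions: v0: -> A, v1: A -> B, v2: A -> C; one balance row for internal
# metabolite A. Measuring uptake v0 and B secretion v1 determines v2.
S = np.array([[1.0, -1.0, -1.0]])
v = solve_fluxes(S, {0: 10.0, 1: 6.0})
```

    Real plant networks have hundreds of such balance rows and are often underdetermined, which is why isotope labelling and optimization-based methods supplement the plain linear algebra shown here.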

  11. Using Inspiration from Synaptic Plasticity Rules to Optimize Traffic Flow in Distributed Engineered Networks.

    PubMed

    Suen, Jonathan Y; Navlakha, Saket

    2017-05-01

    Controlling the flow and routing of data is a fundamental problem in many distributed networks, including transportation systems, integrated circuits, and the Internet. In the brain, synaptic plasticity rules have been discovered that regulate network activity in response to environmental inputs, which enable circuits to be stable yet flexible. Here, we develop a new neuro-inspired model for network flow control that depends only on modifying edge weights in an activity-dependent manner. We show how two fundamental plasticity rules, long-term potentiation and long-term depression, can be cast as a distributed gradient descent algorithm for regulating traffic flow in engineered networks. We then characterize, both by simulation and analytically, how different forms of edge-weight-update rules affect network routing efficiency and robustness. We find a close correspondence between certain classes of synaptic weight update rules derived experimentally in the brain and rules commonly used in engineering, suggesting common principles to both.
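
    A minimal sketch of an LTP/LTD-style edge-weight rule on a toy routing problem (the parameters and the multiplicative-decay form are illustrative assumptions, not the paper's exact update equations):

```python
import numpy as np

def update_weights(w, served, eta=0.2, decay=0.05):
    """LTP-like reinforcement of edges that carried traffic, plus an
    LTD-like global decay of all edge weights."""
    return (w + eta * served) * (1.0 - decay)

w = np.ones(3)                                     # three parallel routes
for _ in range(50):
    traffic = w / w.sum()                          # share one unit of demand
    served = traffic * np.array([0.5, 1.0, 1.0])   # route 0 is congested
    w = update_weights(w, served)
# Traffic shifts away from the congested route as its weight decays.
```

    Because the congested route delivers less traffic, it receives less potentiation per round, and the decay term gradually routes demand around it, the flavor of the activity-dependent control the abstract describes.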

  12. Robust method to detect and locate local earthquakes by means of amplitude measurements.

    NASA Astrophysics Data System (ADS)

    del Puy Papí Isaba, María; Brückl, Ewald

    2016-04-01

    In this study we present a robust new method to detect and locate medium and low magnitude local earthquakes. This method is based on an empirical model of the ground motion obtained from amplitude data of earthquakes in the area of interest, which were located using traditional methods. The first step of our method is the computation of maximum resultant ground velocities in sliding time windows covering the whole period of interest. In the second step, these maximum resultant ground velocities are back-projected to every point of a grid covering the whole area of interest while applying the empirical amplitude-distance relations. We refer to these back-projected ground velocities as pseudo-magnitudes. The number of operating seismic stations in the local network equals the number of pseudo-magnitudes at each grid-point. Our method introduces the new idea of selecting the minimum pseudo-magnitude at each grid-point for further analysis, instead of searching for a minimum of the L2 or L1 norm. In case no detectable earthquake occurred, the spatial distribution of the minimum pseudo-magnitudes constrains the magnitude of weak earthquakes hidden in the ambient noise. In the case of a detectable local earthquake, the spatial distribution of the minimum pseudo-magnitudes shows a significant maximum at the grid-point nearest to the actual epicenter. The application of our method is restricted to the area confined by the convex hull of the seismic station network. Additionally, one must ensure that there are no dead traces involved in the processing. Compared to methods based on the L2 and even the L1 norm, our new method is almost wholly insensitive to outliers (data from locally disturbed seismic stations). A further advantage is the fast determination of the epicenter and magnitude of a seismic event located within the seismic network. This is possible due to the method of obtaining and storing a back-projected matrix, independent of the registered amplitude, for each seismic
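
    The minimum-pseudo-magnitude selection can be sketched as follows. The amplitude-distance coefficients and the network geometry are invented for illustration, not the calibrated relations of the study.

```python
import numpy as np

# Placeholder amplitude-distance relation; A, B, C are illustrative values.
A, B, C = 1.0, 1.6, 3.0

def pseudo_magnitude(v, r):
    """Magnitude implied at distance r (km) by peak ground velocity v (m/s)."""
    return A * np.log10(v) + B * np.log10(r + 1.0) + C

def min_pseudo_field(station_xy, peak_v, grid_xy):
    """Minimum pseudo-magnitude over stations at each grid point; taking the
    minimum (rather than an L2/L1 misfit) makes the field robust to
    single-station outliers."""
    field = np.full(len(grid_xy), np.inf)
    for (sx, sy), v in zip(station_xy, peak_v):
        r = np.hypot(grid_xy[:, 0] - sx, grid_xy[:, 1] - sy)
        field = np.minimum(field, pseudo_magnitude(v, r))
    return field

# Toy network at the corners of a 100 km square; magnitude-3.0 event inside.
station_xy = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 100.0], [100.0, 100.0]])
epi = np.array([60.0, 40.0])
d = np.hypot(station_xy[:, 0] - epi[0], station_xy[:, 1] - epi[1])
peak_v = 10 ** ((3.0 - C - B * np.log10(d + 1.0)) / A)   # synthetic amplitudes

xs, ys = np.meshgrid(np.linspace(0, 100, 51), np.linspace(0, 100, 51))
grid_xy = np.column_stack([xs.ravel(), ys.ravel()])
field = min_pseudo_field(station_xy, peak_v, grid_xy)
best = grid_xy[int(np.argmax(field))]   # maximum of the min-field = epicenter
```

    At the true epicenter every station's back-projected amplitude implies the same magnitude, so the minimum there equals the event magnitude; at any other grid point at least one station implies a smaller value, which is why the maximum of the min-field marks the epicenter.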

  13. A N-S fossil transform fault reactivated by the March 2, 2016 Mw7.8 southwest of Sumatra, Indonesia earthquake

    NASA Astrophysics Data System (ADS)

    Zhang, H.; van der Lee, S.

    2016-12-01

    The Wharton Basin (WB) is characterized by N-S striking fossil transform faults and E-W trending extinct ridges. The 2016 Mw 7.8 southwest of Sumatra earthquake, near the WB's center, was first imaged by back-projecting P-waves from three regional seismic networks in Europe, Japan, and Australia. Next, the rupture direction of the earthquake was further determined by applying rupture directivity analysis to P-waves from the Global Seismic Network (GSN). Finally, we inverted these GSN waveforms on a defined N-S striking vertical fault for a kinematic source model. The results show that the earthquake reactivated an N-S striking (190° strike) vertical fossil transform fault and ruptured a 65 km by 30 km asperity asymmetrically and bilaterally over 35 s. Specifically, the earthquake first ruptured bilaterally northward and southward at a speed of 1.0 km/s over the first 12 s, and then ruptured mainly northward at a speed of 1.6 km/s. Compared with two previous M ≥ 7.8 WB earthquakes, the 2000 southern WB earthquake and the 2012 Mw 8.6 Sumatra earthquake, the lower seismic energy radiation efficiency and slower rupture velocity of the 2016 earthquake indicate that its rupture was probably controlled by the warmer ambient slab and the tectonic stress regime.

  14. A Crowdsourcing-based Taiwan Scientific Earthquake Reporting System

    NASA Astrophysics Data System (ADS)

    Liang, W. T.; Lee, J. C.; Lee, C. F.

    2017-12-01

    To immediately collect field observations of any earthquake-induced ground damage, such as surface fault rupture, landslides, rock falls, liquefaction, and landslide-dammed lakes, we are developing an earthquake damage reporting system that relies particularly on school teachers as volunteers, after they take a series of training courses organized by this project. This Taiwan Scientific Earthquake Reporting (TSER) system is based on the Ushahidi mapping platform, which has been widely used for crowdsourcing for different purposes. Participants may add an app-like icon for mobile devices to this website at https://ies-tser.iis.sinica.edu.tw. Right after a potentially damaging earthquake occurs in the Taiwan area, trained volunteers will be notified and dispatched to the source area to carry out field surveys and to describe the ground damage through this system. If the internet is available, they may also upload relevant images from the field right away. The collected information will be shared with the public after a quick screening by the on-duty scientists. To prepare for the next strong earthquake, we set up a specific project on TSER for sharing spectacular or remarkable geologic features wherever possible. This is to help volunteers get used to the system and share any teachable material on the platform. This experimental, science-oriented crowdsourcing system was launched early this year. Together with a DYFI-like intensity reporting system, the Taiwan Quake-Catcher Network, and some online games and teaching materials, citizen seismology has improved substantially in Taiwan in the last decade. All these products are now either operated or promoted at the Taiwan Earthquake Research Center (TEC). With these newly developed platforms and materials, we aim not only to raise earthquake awareness and preparedness, but also to encourage public participation in earthquake science in Taiwan.

  15. HYPOELLIPSE; a computer program for determining local earthquake hypocentral parameters, magnitude, and first-motion pattern

    USGS Publications Warehouse

    Lahr, John C.

    1999-01-01

    This report provides Fortran source code and program manuals for HYPOELLIPSE, a computer program for determining hypocenters and magnitudes of near regional earthquakes and the ellipsoids that enclose the 68-percent confidence volumes of the computed hypocenters. HYPOELLIPSE was developed to meet the needs of U.S. Geological Survey (USGS) scientists studying crustal and sub-crustal earthquakes recorded by a sparse regional seismograph network. The program was extended to locate hypocenters of volcanic earthquakes recorded by seismographs distributed on and around the volcanic edifice, at elevations above and below the hypocenter. HYPOELLIPSE was used to locate events recorded by the USGS southern Alaska seismograph network from October 1971 to the early 1990s. Both UNIX and PC/DOS versions of the source code of the program are provided along with sample runs.
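
    A bare-bones grid-search epicenter solver illustrates what hypocenter location programs do internally. HYPOELLIPSE itself uses iterative linearized least squares and computes error ellipsoids; a constant velocity and a 2-D grid keep this sketch short.

```python
import numpy as np

def locate(stations, picks, grid, v=6.0):
    """Grid search for the epicenter and origin time minimizing P-pick RMS."""
    best, best_rms = None, np.inf
    for gx, gy in grid:
        tt = np.hypot(stations[:, 0] - gx, stations[:, 1] - gy) / v
        t0 = np.mean(picks - tt)                     # best-fitting origin time
        rms = np.sqrt(np.mean((picks - tt - t0) ** 2))
        if rms < best_rms:
            best, best_rms = (gx, gy, t0), rms
    return best, best_rms

# Synthetic event at (20, 30) km, origin time 5 s, 6 km/s medium.
stations = np.array([[0.0, 0.0], [50.0, 0.0], [0.0, 50.0], [50.0, 50.0]])
true_epi = (20.0, 30.0)
picks = np.hypot(stations[:, 0] - true_epi[0],
                 stations[:, 1] - true_epi[1]) / 6.0 + 5.0

grid = [(x, y) for x in np.arange(0.0, 51.0, 1.0) for y in np.arange(0.0, 51.0, 1.0)]
(best_x, best_y, t0), rms = locate(stations, picks, grid)
```

    The real program replaces the exhaustive grid with Geiger-style iterative least squares in 3-D and a layered velocity model, and then maps the misfit curvature into the 68-percent confidence ellipsoid described above.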

  16. Geodetic Imaging of the Earthquake Cycle

    NASA Astrophysics Data System (ADS)

    Tong, Xiaopeng

    In this dissertation I used Interferometric Synthetic Aperture Radar (InSAR) and the Global Positioning System (GPS) to recover crustal deformation caused by earthquake cycle processes. The studied areas span three different types of tectonic boundaries: a continental thrust earthquake (M7.9 Wenchuan, China) at the eastern margin of the Tibet plateau, a mega-thrust earthquake (M8.8 Maule, Chile) at the Chile subduction zone, and the interseismic deformation of the San Andreas Fault System (SAFS). A new L-band radar aboard the Japanese satellite ALOS allows us to image high-resolution surface deformation in vegetated areas, which is not possible with older C-band radar systems. In particular, both the Wenchuan and Maule InSAR analyses involved L-band ScanSAR interferometry, which had not been attempted before. I integrated a large InSAR dataset with dense GPS networks over the entire SAFS. The integration approach combines the long-wavelength deformation from GPS with the short-wavelength deformation from InSAR through a physical model. The recovered fine-scale surface deformation leads us to better understand the underlying earthquake cycle processes. The geodetic slip inversion reveals that the fault slip of the Wenchuan earthquake is maximum near the surface and decreases with depth. The coseismic slip model of the Maule earthquake constrains the down-dip extent of the fault slip to be at 45 km depth, similar to the Moho depth. I inverted for the slip rate on 51 major faults of the SAFS using Green's functions for a 3-dimensional earthquake cycle model that includes kinematically prescribed slip events for past earthquakes since the year 1000. A 60-km-thick plate model with an effective viscosity of 10^19 Pa·s is preferred based on the geodetic and geological observations. The slip rates recovered from the plate models are compared to those from the half-space model. The InSAR observations reveal that the creeping section of the SAFS is partially locked. This high

  17. Main shock and aftershock records of the 1999 Izmit and Duzce, Turkey earthquakes

    USGS Publications Warehouse

    Celebi, M.; Akkar, Sinan; Gulerce, U.; Sanli, A.; Bundock, H.; Salkin, A.

    2001-01-01

    The August 17, 1999 Izmit (Turkey) earthquake (Mw=7.4) will be remembered as one of the largest earthquakes of recent times to affect a large urban environment (U.S. Geological Survey, 1999). This significant event was followed by many significant aftershocks and another main event (Mw=7.2) that occurred on November 12, 1999 near Duzce (Turkey). The shaking that caused the widespread damage and destruction was recorded by a handful of accelerographs (~30) in the earthquake area operated by different networks. The characteristics of these records show that the recorded peak accelerations, shown in Figure 1, even those from near-field stations, are smaller than expected (Çelebi, 1999, 2000). Following this main event, several organizations from Turkey, Japan, France and the USA deployed temporary accelerographs and other aftershock recording hardware, quadrupling the number of recording stations in the earthquake-affected area (~130). As a result, as seen in Figure 2, smaller-magnitude aftershocks yielded larger peak accelerations, indicating that, because of the sparse networks, recordings of larger motions during the main shock of August 17, 1999 were possibly missed.

  18. The Loma Prieta, California, Earthquake of October 17, 1989: Earthquake Occurrence

    USGS Publications Warehouse

    Coordinated by Bakun, William H.; Prescott, William H.

    1993-01-01

    Professional Paper 1550 seeks to understand the M6.9 Loma Prieta earthquake itself. It examines how the fault that generated the earthquake ruptured, searches for and evaluates precursors that may have indicated an earthquake was coming, reviews forecasts of the earthquake, and describes the geology of the earthquake area and the crustal forces that affect this geology. Some significant findings were:
    * Slip during the earthquake occurred on 35 km of fault at depths ranging from 7 to 20 km. Maximum slip was approximately 2.3 m. The earthquake may not have released all of the strain stored in rocks next to the fault, suggesting that the potential for another damaging earthquake in the Santa Cruz Mountains may still exist.
    * The earthquake involved a large amount of uplift on a dipping fault plane. Pre-earthquake conventional wisdom was that large earthquakes in the Bay area occurred as horizontal displacements on predominantly vertical faults.
    * The fault segment that ruptured approximately coincided with a fault segment identified in 1988 as having a 30% probability of generating a M7 earthquake in the next 30 years. This was one of more than 20 relevant earthquake forecasts made in the 83 years before the earthquake.
    * Calculations show that the Loma Prieta earthquake changed stresses on nearby faults in the Bay area. In particular, the earthquake reduced stresses on the Hayward Fault, which decreased the frequency of small earthquakes on it.
    * Geological and geophysical mapping indicate that, although the San Andreas Fault can be mapped as a throughgoing fault in the epicentral region, the southwest-dipping Loma Prieta rupture surface is a separate fault strand and one of several along this part of the San Andreas that may be capable of generating earthquakes.

  19. Earthquake sources near Uturuncu Volcano

    NASA Astrophysics Data System (ADS)

    Keyson, L.; West, M. E.

    2013-12-01

    Uturuncu, located in southern Bolivia near the Chile and Argentina borders, is a dacitic volcano that was last active 270 ka. It is part of the Altiplano-Puna Volcanic Complex, which spans 50,000 km² and comprises a series of ignimbrite flare-ups since ~23 Ma. Two lines of evidence suggest that the region is underlain by a significant magma body. First, seismic velocities show a low-velocity layer consistent with a magmatic sill below depths of 15-20 km. This inference is corroborated by high electrical conductivity between 10 km and 30 km depth. This magma body, the so-called Altiplano-Puna Magma Body (APMB), is the likely source of volcanic activity in the region. InSAR studies show that during the 1990s the volcano experienced an average uplift of about 1 to 2 cm per year, consistent with an expanding source at depth. Though the Uturuncu region exhibits high rates of crustal seismicity, any connection between the inflation and the seismicity is unclear. We investigate the root causes of these earthquakes using a temporary network of 33 seismic stations, part of the PLUTONS project. Our primary approach is based on hypocenter locations and magnitudes paired with correlation-based relative relocation techniques. We find a strong tendency toward earthquake swarms that cluster in space and time. These swarms often last a few days and consist of numerous earthquakes with similar source mechanisms. Most seismicity occurs in the top 10 kilometers of the crust and is characterized by well-defined phase arrivals and significant high-frequency content. The frequency-magnitude relationship of this seismicity shows b-values consistent with tectonic sources. There is strong clustering of earthquakes around the Uturuncu edifice. Earthquakes elsewhere in the region align in bands striking northwest-southeast, consistent with regional stresses.
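The b-value of the frequency-magnitude (Gutenberg-Richter) relationship discussed in this entry is commonly estimated with Aki's maximum-likelihood formula. A minimal sketch (the catalog magnitudes below are made up for illustration; the PLUTONS catalog is the authors' own):

```python
import math

def b_value(magnitudes, mc, dm=0.1):
    """Aki (1965) maximum-likelihood b-value estimate.
    magnitudes: catalog magnitudes; mc: magnitude of completeness;
    dm: bin width, used for Utsu's half-bin correction."""
    mags = [m for m in magnitudes if m >= mc]
    mean_m = sum(mags) / len(mags)
    return math.log10(math.e) / (mean_m - (mc - dm / 2))

# Hypothetical toy catalog; tectonic seismicity typically gives b near 1.
mags = [1.2, 1.5, 1.1, 2.0, 1.3, 1.8, 1.4, 1.1, 2.4, 1.6]
b = b_value(mags, mc=1.0)  # ~0.74 for this toy catalog
```

Comparing b-values like this against the ~1.0 typical of tectonic sources is the kind of check the abstract refers to when it says the swarms are "consistent with tectonic sources."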

  20. Inversion for slip distribution using teleseismic P waveforms: North Palm Springs, Borah Peak, and Michoacan earthquakes

    USGS Publications Warehouse

    Mendoza, C.; Hartzell, S.H.

    1988-01-01

    We have inverted the teleseismic P waveforms recorded by stations of the Global Digital Seismograph Network for the 8 July 1986 North Palm Springs, California, the 28 October 1983 Borah Peak, Idaho, and the 19 September 1985 Michoacan, Mexico, earthquakes to recover the distribution of slip on each of the faults, using a point-by-point inversion method with smoothing and positivity constraints. Results of the inversion indicate that the Global Digital Seismograph Network data are useful for deriving fault dislocation models for moderate to large events. However, a wide range of frequencies is necessary to infer the distribution of slip on the earthquake fault. Although the long-period waveforms define the size (dimensions and seismic moment) of the earthquake, data at shorter periods provide additional constraints on the variation of slip on the fault. Dislocation models obtained for all three earthquakes are consistent with a heterogeneous rupture process in which failure is controlled largely by the size and location of high-strength asperity regions. - from Authors
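The positivity constraint used in this kind of point-by-point slip inversion can be illustrated with a toy non-negative least-squares problem solved by projected gradient descent. This is an illustrative stand-in, not the authors' inversion code; the design matrix and data below are invented:

```python
def nnls_projected_gradient(A, b, iters=2000, lr=0.01):
    """Minimize ||A x - b||^2 subject to x >= 0 by projected gradient
    descent: a toy analogue of a positivity-constrained slip inversion,
    where x would be slip on each fault patch."""
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        # residual r = A x - b
        r = [sum(A[i][j] * x[j] for j in range(n)) - b[i] for i in range(m)]
        # gradient g = 2 A^T r
        g = [2 * sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
        # gradient step, then project onto the nonnegative orthant
        x = [max(0.0, x[j] - lr * g[j]) for j in range(n)]
    return x

# Toy "fault" with two patches; the unconstrained solution would put
# negative slip on the second patch, so the constraint clamps it to zero.
A = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
b = [1.0, -0.5, 0.5]
x = nnls_projected_gradient(A, b)
```

The constrained solution here is approximately [0.75, 0.0]: positivity trades a slightly larger misfit for a physically meaningful (non-negative) slip model, which is exactly why such constraints are imposed in slip inversions.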

  1. PAGER--Rapid assessment of an earthquake's impact

    USGS Publications Warehouse

    Wald, D.J.; Jaiswal, K.; Marano, K.D.; Bausch, D.; Hearne, M.

    2010-01-01

    PAGER (Prompt Assessment of Global Earthquakes for Response) is an automated system that produces content concerning the impact of significant earthquakes around the world, informing emergency responders, government and aid agencies, and the media of the scope of the potential disaster. PAGER rapidly assesses earthquake impacts by comparing the population exposed to each level of shaking intensity with models of economic and fatality losses based on past earthquakes in each country or region of the world. Earthquake alerts--which were formerly sent based only on event magnitude and location, or population exposure to shaking--now will also be generated based on the estimated range of fatalities and economic losses.

  2. Recent damaging earthquakes in Japan, 2003-2008

    USGS Publications Warehouse

    Kayen, Robert E

    2008-01-01

    landslides resulted, in part, from heavy rain associated with Typhoon Tokage. The earthquake forced more than 100,000 people into temporary shelters, and as many as 10,000 were displaced from their upland homes for several years. Total damage was estimated by Japanese authorities at US$40 billion, making this the second most costly disaster in history, after the 1995 Kobe earthquake. The 2003 Tokachi-Oki earthquake was the third event of magnitude 8.0+ to strike the southeastern portion of Hokkaido in the last 50 years. The event produced tsunami run-ups along the shoreline of southern Hokkaido that reached maximum heights of 4 meters. Accelerations recorded by the seismic networks of Hokkaido indicated a high-intensity motion region from the Hiroo area to Kushiro City, with PGA values in the range of 0.35 to 0.6 g. Despite the high acceleration levels, the observed ground-failure, liquefaction, structural, port, and lifeline damage was remarkably light.

  3. Romanian Educational Seismic Network Project

    NASA Astrophysics Data System (ADS)

    Tataru, Dragos; Ionescu, Constantin; Zaharia, Bogdan; Grecu, Bogdan; Tibu, Speranta; Popa, Mihaela; Borleanu, Felix; Toma, Dragos; Brisan, Nicoleta; Georgescu, Emil-Sever; Dobre, Daniela; Dragomir, Claudiu-Sorin

    2013-04-01

    Romania is one of the most seismically active countries in Europe, with more than 500 earthquakes occurring every year. The seismic hazard of Romania is relatively high, and thus understanding earthquake phenomena and their effects at the Earth's surface is an important step toward educating the population in earthquake-affected regions of the country and raising awareness of earthquake risk and possible mitigation actions. In this direction, the first national educational project in the field of seismology has recently started in Romania: the ROmanian EDUcational SEISmic NETwork (ROEDUSEIS-NET) project. It involves four partners: the National Institute for Earth Physics as coordinator, the National Institute for Research and Development in Construction, Urban Planning and Sustainable Spatial Development "URBAN-INCERC" Bucharest, the Babeş-Bolyai University (Faculty of Environmental Sciences and Engineering), and the software firm "BETA Software". The project has many educational, scientific, and social goals. The main educational objectives are: training students and teachers in the analysis and interpretation of seismological data, preparation of several comprehensive educational materials, and design and testing of didactic activities using informatics and web-oriented tools. The scientific objective is to introduce into schools the use of advanced instruments and experimental methods that are usually restricted to research laboratories, with the main product being the creation of an earthquake waveform archive. A large amount of such data will thus be used by students and teachers for educational purposes. As for the social objectives, the project represents an effective instrument for informing and creating awareness of seismic risk, for experimenting with the efficacy of scientific communication, and for increasing the direct involvement of schools and the general public.
A network of nine seismic stations with SEP seismometers

  4. The Accelerometric Networks in Istanbul

    NASA Astrophysics Data System (ADS)

    Zulfikar, Can; Alcik, Hakan; Mert, Aydin; Tahtasizoglu, Bahar; Kafadar, Nafiz; Korkmaz, Ahmet; Ozel, Oguz; Erdik, Mustafa

    2010-05-01

    In recent years, several strong-motion networks have been established in Istanbul in preparation for a probable future earthquake. This study introduces the current seismic networks and presents some recent records from them.

    Istanbul Earthquake Early Warning System: this system has ten strong-motion stations installed as close as possible to the main fault zone in the Marmara Sea. Continuous on-line data from these stations via digital radio modem provide early warning for potentially disastrous earthquakes. Considering the complexity of fault rupture and the short fault distances involved, a simple and robust early-warning algorithm, based on the exceedance of specified threshold time-domain amplitude levels, is implemented. The current algorithm compares the band-pass filtered accelerations and the cumulative absolute velocity (CAV) with specified threshold levels. The bracketed CAV window values to be put into practice are 0.20, 0.40 and 0.70 m/s for the three alarm levels, respectively.

    Istanbul Earthquake Rapid Response System: this system has one hundred 18-bit-resolution strong-motion accelerometers placed at quasi-free-field locations (basements of small buildings) in the populated areas of the city, within an area of approximately 50x30 km, to constitute a network that enables early damage assessment and rapid response information after a damaging earthquake. Early response information is achieved through fast acquisition and analysis of processed data obtained from the network. The stations are routinely interrogated on a regular basis by the main data center. After being triggered by an earthquake, each station processes the streaming strong-motion data to yield the spectral accelerations at specific periods and sends these parameters in the form of SMS messages every 20 s directly to the main data center through a
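The CAV-based alarm logic described for the early warning system can be sketched in a few lines. The sampling rate and accelerations below are hypothetical; the 0.20/0.40/0.70 m/s thresholds are the ones quoted in the abstract:

```python
def cumulative_absolute_velocity(acc, dt):
    """CAV = integral of |a(t)| dt, approximated by the rectangle rule.
    acc: accelerations in m/s^2 (assumed already band-pass filtered),
    dt: sample interval in seconds."""
    return sum(abs(a) for a in acc) * dt

def alarm_level(cav, thresholds=(0.20, 0.40, 0.70)):
    """Return 0-3: the number of CAV thresholds (m/s) exceeded,
    mirroring the three alarm levels described in the abstract."""
    return sum(cav >= t for t in thresholds)

# Hypothetical record: 1 s of |a| = 0.5 m/s^2 sampled at 200 Hz.
acc = [0.5] * 200
cav = cumulative_absolute_velocity(acc, dt=0.005)  # 0.5 m/s
level = alarm_level(cav)                           # exceeds 0.20 and 0.40
```

A real implementation would compute CAV over a bracketed time window on filtered data and combine it with the amplitude-threshold checks; this sketch only shows the threshold-counting idea.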

  5. Infrasound Signal Characteristics from Small Earthquakes

    DTIC Science & Technology

    2010-09-01

    INFRASOUND SIGNAL CHARACTERISTICS FROM SMALL EARTHQUAKES. J. Mark Hale, Stephen J. Arrowsmith, Chris Hayward, Relu Burlacu, Kristine L. Pankow. ABSTRACT (excerpts): Understanding the source properties responsible for infrasound generation is critical to developing a seismo-acoustic data discriminant ... mining in the Utah region create a unique setting for the study of near-field infrasound. The Utah network has been operating three permanent infrasound

  6. Prompt identification of tsunamigenic earthquakes from 3-component seismic data

    NASA Astrophysics Data System (ADS)

    Kundu, Ajit; Bhadauria, Y. S.; Basu, S.; Mukhopadhyay, S.

    2016-10-01

    An Artificial Neural Network (ANN) based algorithm for prompt identification of shallow-focus (depth < 70 km) tsunamigenic earthquakes at regional distances is proposed in the paper. Promptness here refers to decision making as fast as 5 min after the arrival of the LR phase in the seismogram. The root-mean-square amplitudes of seismic phases recorded by a single 3-component station are considered as inputs, besides location and magnitude. The trained ANN categorized 100% of new earthquakes successfully as tsunamigenic or non-tsunamigenic. The proposed method has been corroborated by an alternate mapping technique of earthquake category estimation. The second method involves computing focal parameters, estimating the water volume displaced at the source, and then deciding the category of the earthquake; it identified 95% of new earthquakes successfully. Both methods were tested using three-component broadband seismic data recorded at the PALK (Pallekele, Sri Lanka) station, provided by IRIS, for earthquakes of magnitude 6 and above originating in the Sumatra region. The good agreement between the two methods suggests that a prompt alert system could be developed based on the proposed method, which would be extremely useful for regions that are not adequately instrumented for azimuthal coverage.
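As a conceptual stand-in for the ANN classifier (the actual network architecture, inputs, and training data are the authors'), a minimal logistic classifier on two invented, normalized features, e.g. an RMS phase amplitude and a magnitude proxy, can be trained like this:

```python
import math

def train_logistic(X, y, epochs=500, lr=0.5):
    """Minimal logistic-regression stand-in for the ANN classifier.
    X: feature vectors (hypothetical normalized RMS amplitude and
    magnitude), y: labels (1 = tsunamigenic, 0 = not)."""
    n = len(X[0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            p = 1.0 / (1.0 + math.exp(-z))      # sigmoid output
            err = p - yi                        # cross-entropy gradient
            w = [wj - lr * err * xj for wj, xj in zip(w, xi)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    """Classify a new event: 1 = tsunamigenic, 0 = non-tsunamigenic."""
    z = b + sum(wj * xj for wj, xj in zip(w, x))
    return 1 if z > 0 else 0

# Toy, linearly separable data: label 1 = large/shallow-type features.
X = [[0.9, 0.8], [0.8, 0.9], [0.2, 0.1], [0.1, 0.3]]
y = [1, 1, 0, 0]
w, b = train_logistic(X, y)
```

The real system feeds a trained multi-layer ANN; this sketch only conveys the supervised two-class decision at its core.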

  7. Evaluation of the real-time earthquake information system in Japan

    NASA Astrophysics Data System (ADS)

    Nakamura, Hiromitsu; Horiuchi, Shigeki; Wu, Changjiang; Yamamoto, Shunroku; Rydelek, Paul A.

    2009-01-01

    The real-time earthquake information system (REIS) of the Japanese seismic network is designed to automatically determine earthquake parameters within a few seconds after the P-waves arrive at the closest stations, using both the P-wave arrival times and the timing data indicating that P-waves have not yet arrived at other stations. REIS results play a fundamental role in the real-time information for earthquake early warning in Japan. We show the rapidity and accuracy of REIS from the analysis of 4,050 earthquakes over the three years since 2005: 44 percent of first reports are issued within 5 seconds after the first P-wave arrival, and 80 percent of the events have an epicenter difference of less than 20 km relative to manually determined locations. We compared the formal catalog to the magnitudes estimated from the real-time analysis and found that 94 percent of the events had a magnitude difference within ±1.0 unit.

  8. Hydrothermal response to a volcano-tectonic earthquake swarm, Lassen, California

    USGS Publications Warehouse

    Ingebritsen, Steven E.; Shelly, David R.; Hsieh, Paul A.; Clor, Laura; P.H. Seward,; Evans, William C.

    2015-01-01

    The increasing capability of seismic, geodetic, and hydrothermal observation networks allows recognition of volcanic unrest that could previously have gone undetected, creating an imperative to diagnose and interpret unrest episodes. A November 2014 earthquake swarm near Lassen Volcanic National Park, California, which included the largest earthquake in the area in more than 60 years, was accompanied by a rarely observed outburst of hydrothermal fluids. Although the earthquake swarm likely reflects upward migration of endogenous H2O-CO2 fluids in the source region, there is no evidence that such fluids emerged at the surface. Instead, shaking from the modest sized (moment magnitude 3.85) but proximal earthquake caused near-vent permeability increases that triggered increased outflow of hydrothermal fluids already present and equilibrated in a local hydrothermal aquifer. Long-term, multiparametric monitoring at Lassen and other well-instrumented volcanoes enhances interpretation of unrest and can provide a basis for detailed physical modeling.

  9. Performance Parameters Analysis of an XD3P Peugeot Engine Using Artificial Neural Networks (ANN) Concept in MATLAB

    NASA Astrophysics Data System (ADS)

    Rangaswamy, T.; Vidhyashankar, S.; Madhusudan, M.; Bharath Shekar, H. R.

    2015-04-01

    Current trends in engineering follow the basic rule of innovation in mechanical engineering. For engineers to be efficient, problem solving needs to be viewed from a multidimensional perspective. One such methodology is the fusion of techniques from other disciplines in order to solve a problem. This paper deals with the application of neural networks to analyze the performance parameters of an XD3P Peugeot engine (used by the Ministry of Defence). The work is divided into two main stages. In the first stage, experiments are carried out on an IC engine in order to obtain the primary data. In the second stage, the resulting database is used to design and implement a predictive neural network in order to analyze how the output parameters vary with respect to each other. A mathematical governing equation for the neural network is obtained; this polynomial equation describes the characteristic behavior of the built neural network system. Finally, a comparative study of the results is carried out.

  10. Dynamic traffic grooming with Spectrum Engineering (TG-SE) in flexible grid optical networks

    NASA Astrophysics Data System (ADS)

    Yu, Xiaosong; Zhao, Yongli; Zhang, Jiawei; Wang, Jianping; Zhang, Guoying; Chen, Xue; Zhang, Jie

    2015-12-01

    Flexible grid has emerged as an evolutionary technology to satisfy the ever-increasing demand for higher spectrum efficiency and operational flexibility. To optimize spectrum resource utilization, this paper introduces the concept of Spectrum Engineering in flex-grid optical networks. The sliceable optical transponder has been proposed to offload IP traffic to the optical layer and reduce the number of IP router ports and transponders. We discuss the impact of the sliceable transponder on traffic grooming and propose several traffic-grooming schemes with Spectrum Engineering (TG-SE). Our results show that there is a tradeoff among different traffic-grooming policies, which should be adopted based on the network operator's objectives. The proposed schemes can reduce OPEX as well as increase spectrum efficiency by efficiently exploiting the bandwidth variability and capability of sliceable optical transponders.

  11. Reverse engineering highlights potential principles of large gene regulatory network design and learning.

    PubMed

    Carré, Clément; Mas, André; Krouk, Gabriel

    2017-01-01

    Inferring transcriptional gene regulatory networks from transcriptomic datasets is a key challenge of systems biology, with potential impacts ranging from medicine to agronomy. Several techniques are presently used to experimentally assay transcription factor-to-target relationships, defining important information about real gene regulatory network connections. These techniques include classical ChIP-seq, yeast one-hybrid, or, more recently, DAP-seq or target technologies. These techniques are usually used to validate algorithm predictions. Here, we developed a reverse-engineering approach based on mathematical and computer simulation to evaluate the impact that this prior knowledge of gene regulatory networks may have on training machine learning algorithms. First, we developed a gene regulatory network-simulating engine called FRANK (Fast Randomizing Algorithm for Network Knowledge) that is able to simulate large gene regulatory networks (containing 10^4 genes) with characteristics of gene regulatory networks observed in vivo. FRANK also generates stable or oscillatory gene expression directly produced by the simulated gene regulatory networks. The development of FRANK leads to important general conclusions concerning the design of large and stable gene regulatory networks harboring scale-free properties (built ex nihilo). In combination with a supervised (accepting prior knowledge) support vector machine algorithm, we (i) address biologically oriented questions concerning our capacity to accurately reconstruct gene regulatory networks, and in particular demonstrate that prior-knowledge structure is crucial for accurate learning, and (ii) draw conclusions to inform experimental design to perform learning able to solve gene regulatory networks in the future. By demonstrating that our predictions concerning the influence of the prior-knowledge structure on support vector machine learning capacity hold true on real data (Escherichia coli K14 network
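The networks described here are scale-free, and a standard way to grow such a topology is preferential attachment. Below is an illustrative generator of that kind (a sketch of the general technique, not FRANK's actual algorithm):

```python
import random

def preferential_attachment(n, m=2, seed=0):
    """Barabasi-Albert-style preferential attachment: each new node
    links to m existing nodes chosen with probability proportional to
    their degree, yielding a scale-free degree distribution."""
    rng = random.Random(seed)
    edges = []
    degree_list = []          # node ids repeated once per edge endpoint
    for new in range(m, n):
        if not degree_list:
            chosen = set(range(m))        # connect to the seed nodes
        else:
            chosen = set()
            while len(chosen) < m:        # degree-weighted sampling
                chosen.add(rng.choice(degree_list))
        for t in chosen:
            edges.append((new, t))
            degree_list.extend([new, t])
    return edges

# Grow a 1000-node network; early nodes become highly connected hubs.
edges = preferential_attachment(1000)
```

Sampling uniformly from `degree_list` is what makes attachment degree-proportional: a node appears in the list once for every edge it already has.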

  12. Catalog of earthquake hypocenters at Alaskan volcanoes: January 1 through December 31, 2005

    USGS Publications Warehouse

    Dixon, James P.; Stihler, Scott D.; Power, John A.; Tytgat, Guy; Estes, Steve; McNutt, Stephen R.

    2006-01-01

    The Alaska Volcano Observatory (AVO), a cooperative program of the U.S. Geological Survey, the Geophysical Institute of the University of Alaska Fairbanks, and the Alaska Division of Geological and Geophysical Surveys, has maintained seismic monitoring networks at historically active volcanoes in Alaska since 1988 (Figure 1). The primary objectives of the seismic program are the real-time seismic monitoring of active, potentially hazardous, Alaskan volcanoes and the investigation of seismic processes associated with active volcanism. This catalog presents calculated earthquake hypocenters and seismic phase arrival data, and details changes in the seismic monitoring program for the period January 1 through December 31, 2005. The AVO seismograph network was used to monitor the seismic activity at thirty-two volcanoes within Alaska in 2005 (Figure 1). The network was augmented by two new subnetworks to monitor the Semisopochnoi Island volcanoes and Little Sitkin Volcano. Seismicity at these volcanoes was still being studied at the end of 2005 and had not yet been added to the list of permanently monitored volcanoes in the AVO weekly update. Following an extended period of monitoring to determine the background seismicity at Mount Peulik, Ukinrek Maars, and Korovin Volcano, formal monitoring of these volcanoes began in 2005. AVO located 9,012 earthquakes in 2005. Monitoring highlights in 2005 include: (1) seismicity at Mount Spurr remaining above background, starting in February 2004, through the end of the year and into 2006; (2) an increase in seismicity at Augustine Volcano starting in May 2005 and continuing through the end of the year into 2006; (3) volcanic tremor and seismicity related to low-level strombolian activity at Mount Veniaminof in January to March and September; and (4) a seismic swarm at Tanaga Volcano in October and November. This catalog includes: (1) descriptions and locations of seismic instrumentation deployed in the field in 2005; (2) a

  13. New strong motion network in Georgia: basis for specifying seismic hazard

    NASA Astrophysics Data System (ADS)

    Kvavadze, N.; Tsereteli, N. S.

    2017-12-01

    Risk from hazardous natural events is closely tied to the sustainable development of society. Global observations have confirmed a tendency of growing losses from natural disasters, among the most dangerous and destructive of which are earthquakes. Georgia is located in a seismically active region, so it is imperative to evaluate probabilistic seismic hazard and seismic risk with proper accuracy. The national network of Georgia includes 35 stations, all of which are seismometers. There are significant gaps in strong-motion recordings, which are essential for seismic hazard assessment. To gather more accelerometer recordings, we have built a strong-motion network distributed across the territory of Georgia. The network currently includes 6 stations, each with a Basalt 4x datalogger and an Episensor ES-T strong-motion sensor. For each site, Vs30 and soil resonance frequencies have been measured. Since all but one station (Tabakhmela, near Tbilisi) are located far from power and internet lines, a special system was created for instrument operation: solar power supplies the electricity, and GSM/LTE modems provide internet access. A VPN tunnel was set up using a Raspberry Pi for two-way communication with the stations. The Tabakhmela station is located on the grounds of the Ionosphere Observatory, TSU, and is used as a hub for the network. This location also includes a broadband seismometer and a VLF electromagnetic-wave observation antenna for possible earthquake-precursor studies. On a server located in Tabakhmela, continuous data are collected from all the stations for later use. The recordings will later be used in different seismological and engineering problems, namely selecting and creating a GMPE model for the Caucasus and evaluating probabilistic seismic hazard and seismic risk. These stations are a start, and expansion of the strong-motion network is planned. Along with this, electromagnetic-wave observations will continue and additional antennas will be implemented

  14. From kinetic-structure analysis to engineering crystalline fiber networks in soft materials.

    PubMed

    Wang, Rong-Yao; Wang, Peng; Li, Jing-Liang; Yuan, Bing; Liu, Yu; Li, Li; Liu, Xiang-Yang

    2013-03-07

    Understanding the role of kinetics in fiber network microstructure formation is of considerable importance in engineering gel materials to achieve their optimized performances/functionalities. In this work, we present a new approach for kinetic-structure analysis for fibrous gel materials. In this method, kinetic data is acquired using a rheology technique and is analyzed in terms of an extended Dickinson model in which the scaling behaviors of dynamic rheological properties in the gelation process are taken into account. It enables us to extract the structural parameter, i.e. the fractal dimension, of a fibrous gel from the dynamic rheological measurement of the gelation process, and to establish the kinetic-structure relationship suitable for both dilute and concentrated gelling systems. In comparison to the fractal analysis method reported in a previous study, our method is advantageous due to its general validity for a wide range of fractal structures of fibrous gels, from a highly compact network of the spherulitic domains to an open fibrous network structure. With such a kinetic-structure analysis, we can gain a quantitative understanding of the role of kinetic control in engineering the microstructure of the fiber network in gel materials.

  15. USGS Tweet Earthquake Dispatch (@USGSted): Using Twitter for Earthquake Detection and Characterization

    NASA Astrophysics Data System (ADS)

    Liu, S. B.; Bouchard, B.; Bowden, D. C.; Guy, M.; Earle, P.

    2012-12-01

    The U.S. Geological Survey (USGS) is investigating how online social networking services like Twitter—a microblogging service for sending and reading public text-based messages of up to 140 characters—can augment USGS earthquake response products and the delivery of hazard information. The USGS Tweet Earthquake Dispatch (TED) system is using Twitter not only to broadcast seismically-verified earthquake alerts via the @USGSted and @USGSbigquakes Twitter accounts, but also to rapidly detect widely felt seismic events through a real-time detection system. The detector algorithm scans for significant increases in tweets containing the word "earthquake" or its equivalent in other languages and sends internal alerts with the detection time, tweet text, and the location of the city where most of the tweets originated. It has been running in real-time for 7 months and finds, on average, two or three felt events per day with a false detection rate of less than 10%. The detections have reasonable coverage of populated areas globally. The number of detections is small compared to the number of earthquakes detected seismically, and only a rough location and qualitative assessment of shaking can be determined based on Tweet data alone. However, the Twitter detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The main benefit of the tweet-based detections is speed, with most detections occurring between 19 seconds and 2 minutes from the origin time. This is considerably faster than seismic detections in poorly instrumented regions of the world. Going beyond the initial detection, the USGS is developing data mining techniques to continuously archive and analyze relevant tweets for additional details about the detected events. The information generated about an event is displayed on a web-based map designed using HTML5 for the mobile environment, which can be valuable when the user is not able to access a
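The detector described here scans for significant increases in the rate of "earthquake" tweets. An STA/LTA-style toy version conveys the idea (all parameter values and data below are illustrative assumptions, not the USGS TED configuration):

```python
from collections import deque

def tweet_spike_detector(counts, short=3, long=20, ratio=5.0, min_count=10):
    """Flag time bins where the short-term mean rate of keyword tweets
    exceeds `ratio` times the long-term mean rate: a toy stand-in for
    the TED detection algorithm. counts: tweets per time bin."""
    history = deque(maxlen=long)   # long-term window (excludes current bin)
    detections = []
    for i, c in enumerate(counts):
        if len(history) == long:   # wait until the baseline is full
            lta = sum(history) / long
            sta = sum(counts[max(0, i - short + 1):i + 1]) / short
            if c >= min_count and sta >= ratio * max(lta, 1e-9):
                detections.append(i)
        history.append(c)
    return detections

# Hypothetical per-minute counts: quiet baseline, then a felt event.
counts = [1] * 25 + [50, 60, 40] + [2] * 5
detections = tweet_spike_detector(counts)  # flags the burst bins
```

The `min_count` floor plays the same role as requiring an absolute number of tweets before alerting, which keeps a noisy but tiny baseline from triggering false detections.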

  16. A Survey of Statistical Models for Reverse Engineering Gene Regulatory Networks

    PubMed Central

    Huang, Yufei; Tienda-Luna, Isabel M.; Wang, Yufeng

    2009-01-01

    Statistical models for reverse engineering gene regulatory networks (GRNs) are surveyed in this article. To provide readers with a system-level view of the modeling issues in this research, a graphical modeling framework is proposed. This framework serves as the scaffolding on which the review of different models can be systematically assembled. Based on the framework, we review existing models for various aspects of gene regulation and discuss the pros and cons of each model. In addition, network inference algorithms are surveyed under the graphical modeling framework in two categories, point solutions and probabilistic solutions, and the connections and differences among the algorithms are discussed. This survey has the potential to elucidate the development and future of reverse engineering GRNs and bring statistical signal processing closer to the core of this research. PMID:20046885

  17. Source Process of the Mw 5.0 Au Sable Forks, New York, Earthquake Sequence from Local Aftershock Monitoring Network Data

    NASA Astrophysics Data System (ADS)

    Kim, W.; Seeber, L.; Armbruster, J. G.

    2002-12-01

    On April 20, 2002, a Mw 5 earthquake occurred near the town of Au Sable Forks, northeastern Adirondacks, New York. The quake caused moderate damage (MMI VII) around the epicentral area and was well recorded by over 50 broadband stations at distances of 70 to 2000 km across eastern North America. Regional broadband waveform data are used to determine the source mechanism and focal depth using a moment tensor inversion technique. The source mechanism indicates predominantly thrust faulting along a 45°-dipping fault plane striking due south. The mainshock was followed by at least three strong aftershocks with local magnitude (ML) greater than 3, and about 70 aftershocks were detected and located in the first three months by a 12-station portable seismographic network. The aftershock distribution clearly delineates the mainshock rupture on the westerly dipping fault plane at a depth of 11 to 12 km. Preliminary analysis of the aftershock waveform data indicates that the orientation of the P-axis rotated 90° from that of the mainshock, suggesting a complex source process for the earthquake sequence. We achieved an important milestone in monitoring earthquakes and evaluating their hazards through rapid cross-border (Canada-US) and cross-regional (Central US-Northeastern US) collaborative efforts. Staff at Instrument Software Technology, Inc., near the epicentral area joined Lamont-Doherty staff and deployed the first portable station in the epicentral area; CERI dispatched two of their technical staff to the epicentral area with four accelerometers and a broadband seismograph; the IRIS/PASSCAL facility shipped three digital seismographs and ancillary equipment within one day of the request; the POLARIS Consortium, Canada sent a field crew of three with a near real-time, satellite telemetry based earthquake monitoring system. The POLARIS station, KSVO, powered by a solar panel and batteries, was already transmitting data to the central Hub in London, Ontario, Canada within

  18. Broadband records of earthquakes in deep gold mines and a comparison with results from SAFOD, California

    USGS Publications Warehouse

    McGarr, Arthur F.; Boettcher, M.; Fletcher, Jon Peter B.; Sell, Russell; Johnston, Malcolm J.; Durrheim, R.; Spottiswoode, S.; Milev, A.

    2009-01-01

    For one week during September 2007, we deployed a temporary network of field recorders and accelerometers at four sites within two deep, seismically active mines. The ground-motion data, recorded at 200 samples/sec, are well suited to determining source and ground-motion parameters for the mining-induced earthquakes within and adjacent to our network. Four earthquakes with magnitudes close to 2 were recorded with high signal/noise at all four sites. Analysis of seismic moments and peak velocities, in conjunction with the results of laboratory stick-slip friction experiments, was used to estimate source processes that are key to understanding source physics and to assessing underground seismic hazard. The maximum displacements on the rupture surfaces can be estimated from the parameter Rv, where v is the peak ground velocity at a given recording site and R is the hypocentral distance. For each earthquake, the maximum slip and seismic moment can be combined with results from laboratory friction experiments to estimate the maximum slip rate within the rupture zone. Analysis of the four M 2 earthquakes recorded during our deployment and one of special interest recorded by the in-mine seismic network in 2004 revealed maximum slips ranging from 4 to 27 mm and maximum slip rates from 1.1 to 6.3 m/sec. Applying the same analyses to an M 2.1 earthquake within a cluster of repeating earthquakes near the San Andreas Fault Observatory at Depth site, California, yielded similar results for maximum slip and slip rate, 14 mm and 4.0 m/sec.

  19. Programmable cells: Interfacing natural and engineered gene networks

    NASA Astrophysics Data System (ADS)

    Kobayashi, Hideki; Kærn, Mads; Araki, Michihiro; Chung, Kristy; Gardner, Timothy S.; Cantor, Charles R.; Collins, James J.

    2004-06-01

    Novel cellular behaviors and characteristics can be obtained by coupling engineered gene networks to the cell's natural regulatory circuitry through appropriately designed input and output interfaces. Here, we demonstrate how an engineered genetic circuit can be used to construct cells that respond to biological signals in a predetermined and programmable fashion. We employ a modular design strategy to create Escherichia coli strains where a genetic toggle switch is interfaced with: (i) the SOS signaling pathway responding to DNA damage, and (ii) a transgenic quorum sensing signaling pathway from Vibrio fischeri. The genetic toggle switch endows these strains with binary response dynamics and an epigenetic inheritance that supports a persistent phenotypic alteration in response to transient signals. These features are exploited to engineer cells that form biofilms in response to DNA-damaging agents and cells that activate protein synthesis when the cell population reaches a critical density. Our work represents a step toward the development of "plug-and-play" genetic circuitry that can be used to create cells with programmable behaviors. heterologous gene expression | synthetic biology | Escherichia coli
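    The binary response dynamics and persistent memory of the genetic toggle switch can be illustrated with the standard two-repressor ODE model. This is a minimal deterministic sketch: the production rate alpha, cooperativity beta, and the forced-suppression pulse are illustrative assumptions, not parameters or protocols from the paper.

    ```python
    def toggle_step(u, v, alpha=10.0, beta=2.0, dt=0.01):
        """One Euler step of the mutual-repressor toggle switch.

        u, v are the concentrations of two repressors; each represses
        the other's production via a Hill function and decays linearly.
        """
        du = alpha / (1.0 + v**beta) - u
        dv = alpha / (1.0 + u**beta) - v
        return u + dt * du, v + dt * dv

    def settle(u, v, steps=20000):
        """Integrate until the system relaxes onto one of its stable states."""
        for _ in range(steps):
            u, v = toggle_step(u, v)
        return u, v

    # Bistability: different initial conditions reach different stable states.
    high_u = settle(5.0, 0.1)   # settles near (u high, v low)
    high_v = settle(0.1, 5.0)   # settles near (u low, v high)

    # Memory of a transient signal: start in the high-v state, transiently
    # suppress v (e.g. mimicking signal-induced degradation), then release.
    u, v = 0.1, 9.9
    for _ in range(2000):
        u, v = toggle_step(u, v)
        v = 0.0                 # pulse: hold repressor v suppressed
    flipped = settle(u, v)      # switch stays in the high-u state afterward
    ```

    The pulse permanently flips the switch even though the signal itself is transient, which is the epigenetic inheritance property the abstract describes.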

  20. History of Modern Earthquake Hazard Mapping and Assessment in California Using a Deterministic or Scenario Approach

    NASA Astrophysics Data System (ADS)

    Mualchin, Lalliana

    2011-03-01

    Modern earthquake ground motion hazard mapping in California began following the 1971 San Fernando earthquake in the Los Angeles metropolitan area of southern California. Earthquake hazard assessment followed a traditional approach, later called Deterministic Seismic Hazard Analysis (DSHA) to distinguish it from the newer Probabilistic Seismic Hazard Analysis (PSHA). In DSHA, seismic hazard is assessed for the Maximum Credible Earthquake (MCE) magnitude on each of the known seismogenic faults within and near the state. The likely occurrence of the MCE is assumed qualitatively from late Quaternary and younger faults that are presumed to be seismogenic, without specifying when, or within what time interval, the MCE may occur. The MCE is the largest, or upper-bound, potential earthquake in moment magnitude, and it supersedes and automatically accounts for all other possible earthquakes on that fault. That moment magnitude is used for estimating ground motions by applying it to empirical attenuation relationships, and for calculating ground motions as in neo-DSHA (Zuccolo et al., 2008). The first deterministic California earthquake hazard map was published in 1974 by the California Division of Mines and Geology (CDMG), called the California Geological Survey (CGS) since 2002, using the best fault information and ground motion attenuation relationships available at that time. The California Department of Transportation (Caltrans) later assumed responsibility for printing the refined and updated peak acceleration contour maps, which were heavily used by geologists, seismologists, and engineers for many years. Some engineers involved in siting large, important projects, for example dams and nuclear power plants, continued to challenge the map(s). The second edition of the map was completed in 1985, incorporating more faults, improving the MCE estimation method, and using new ground motion attenuation relationships from the latest published
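    The DSHA recipe described above (take each fault's MCE magnitude and closest distance to the site, evaluate an empirical attenuation relationship, and keep the controlling value) can be sketched as follows. The attenuation form and its coefficients are purely illustrative placeholders, not any published California relation.

    ```python
    import math

    def pga_g(magnitude, distance_km, a=-1.0, b=0.5, c=1.0, d=10.0):
        """Hypothetical attenuation relation: ln(PGA[g]) = a + b*M - c*ln(R + d).

        PGA grows with magnitude M and decays with hypocentral distance R;
        the coefficients a, b, c, d are illustrative, not fitted values.
        """
        return math.exp(a + b * magnitude - c * math.log(distance_km + d))

    def dsha_site_hazard(site_faults):
        """Deterministic hazard at a site.

        site_faults: list of (mce_magnitude, closest_distance_km), one entry
        per seismogenic fault considered. Returns the controlling (maximum)
        deterministic PGA over all faults, as DSHA prescribes.
        """
        return max(pga_g(m, r) for m, r in site_faults)
    ```

    Because each fault contributes only its single MCE scenario, the site value is simply the maximum over faults; no occurrence rates or return periods enter the calculation, which is the key contrast with PSHA.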