Sample records for earthquake engineering practice

  1. Reconfiguring practice: the interdependence of experimental procedure and computing infrastructure in distributed earthquake engineering.

    PubMed

    De La Flor, Grace; Ojaghi, Mobin; Martínez, Ignacio Lamata; Jirotka, Marina; Williams, Martin S; Blakeborough, Anthony

    2010-09-13

    When transitioning local laboratory practices into distributed environments, the interdependent relationship between experimental procedure and the technologies used to execute experiments becomes highly visible and a focal point for system requirements. We present an analysis of ways in which this reciprocal relationship is reconfiguring laboratory practices in earthquake engineering as a new computing infrastructure is embedded within three laboratories in order to facilitate the execution of shared experiments across geographically distributed sites. The system has been developed as part of the UK Network for Earthquake Engineering Simulation e-Research project, which links together three earthquake engineering laboratories at the universities of Bristol, Cambridge and Oxford. We consider the ways in which researchers have successfully adapted their local laboratory practices through the modification of experimental procedure so that they may meet the challenges of coordinating distributed earthquake experiments.

  2. Important Earthquake Engineering Resources

    Science.gov Websites

Pacific Earthquake Engineering Research Center (PEER) page listing important earthquake engineering resources, including the American Concrete Institute, the Consortium of Organizations for Strong-Motion Observation Systems (COSMOS), and the Consortium of Universities for Research in Earthquake Engineering.

  3. Assessment of Simulated Ground Motions in Earthquake Engineering Practice: A Case Study for Duzce (Turkey)

    NASA Astrophysics Data System (ADS)

    Karimzadeh, Shaghayegh; Askan, Aysegul; Yakut, Ahmet

    2017-09-01

Simulated ground motions can be used in structural and earthquake engineering practice as an alternative to, or to augment, real ground motion data sets. Common engineering applications of simulated motions are linear and nonlinear time history analyses of building structures, where full acceleration records are necessary. Before using simulated ground motions in such applications, it is important to assess them in terms of their frequency and amplitude content as well as their match with the corresponding real records. In this study, a framework is outlined for assessing simulated ground motions in terms of their use in structural engineering. Misfit criteria are determined for both ground motion parameters and structural response by comparing the simulated values against the corresponding real values. For this purpose, as a case study, the 12 November 1999 Duzce earthquake is simulated using a stochastic finite-fault methodology. Simulated records are employed for time history analyses of frame models of typical residential buildings. Next, the relationships between ground motion misfits and structural response misfits are studied. Results show that the seismological misfits around the fundamental period of the selected buildings determine the accuracy of the simulated responses in terms of their agreement with the observed responses.
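The misfit comparison this abstract describes can be sketched numerically. The exact criteria used in the Duzce study are not reproduced here; the log-ratio form below is only a common convention for comparing a simulated ground-motion parameter (such as PGA or spectral acceleration) against its observed counterpart, and all numbers are illustrative:

```python
import numpy as np

def log_misfit(observed, simulated):
    """Log-ratio misfit of a scalar ground-motion parameter (e.g. PGA).

    Zero means perfect agreement; the sign shows whether the simulation
    over- or under-predicts the observation.
    """
    return float(np.log(simulated / observed))

def spectral_misfit(sa_observed, sa_simulated):
    """Mean log residual of a response spectrum over a period band."""
    return float(np.mean(np.log(np.asarray(sa_simulated) /
                                np.asarray(sa_observed))))

# Illustrative numbers only (not from the Duzce study):
print(log_misfit(3.2, 2.6))   # negative: simulation under-predicts PGA
```

A structural-response misfit can be formed the same way from, say, observed versus simulated peak inter-story drifts, which is what allows the seismological and structural misfits to be correlated.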

  4. Research in seismology and earthquake engineering in Venezuela

    USGS Publications Warehouse

    Urbina, L.; Grases, J.

    1983-01-01

After the July 29, 1967, damaging earthquake (with a moderate magnitude of 6.3) caused widespread damage to the northern coastal area of Venezuela and to the Caracas Valley, the Venezuelan Government decided to establish a Presidential Earthquake Commission. This commission undertook the task of coordinating the efforts to study the after-effects of the earthquake. The July 1967 earthquake claimed numerous lives and caused extensive damage to the capital of Venezuela. In 1968, the U.S. Geological Survey conducted a seismological field study in the northern coastal area and in the Caracas Valley of Venezuela. The objective was to study the areas that sustained severe, moderate, and no damage to structures. A report entitled "Ground Amplification Studies in Earthquake Damage Areas: The Caracas Earthquake of 1967" documented, for the first time, short-period seismic wave ground-motion amplifications in the Caracas Valley. Figure 1 shows the area of severe damage in the Los Palos Grandes suburb and its correlation with depth of alluvium; the Arabic numerals denote the ground amplification factor at each site in the area. The Venezuelan Government initiated many programs to study the damage in detail and to investigate ongoing construction practices. These actions motivated professionals in the academic, private, and Government sectors to develop further capabilities and self-sufficiency in the fields of engineering and seismology. Funds were allocated to assist in training professionals and technicians and in developing new seismological stations and new national-level programs in earthquake engineering and seismology. A brief description of the ongoing programs in Venezuela is given below. These programs are being performed by FUNVISIS and by other national organizations listed at the end of this article.

  5. MCEER, from Earthquake Engineering to Extreme Events | Home Page

    Science.gov Websites

MCEER website page listing education and outreach programs, including the Bridge Engineering Guest Speaker Series, Connecte²d Teaching, the CSEE Graduate Student Poster Competition, Earthquake Engineering Education in Haiti, an earthquakes FAQ, the Engineering Seminar Series, K-12 STEM Education, National Engineers Week, and Research Experiences for …

  6. Real-time earthquake monitoring using a search engine method.

    PubMed

    Zhang, Jie; Zhang, Haijiang; Chen, Enhong; Zheng, Yi; Kuang, Wenhuan; Zhang, Xiong

    2014-12-04

    When an earthquake occurs, seismologists want to use recorded seismograms to infer its location, magnitude and source-focal mechanism as quickly as possible. If such information could be determined immediately, timely evacuations and emergency actions could be undertaken to mitigate earthquake damage. Current advanced methods can report the initial location and magnitude of an earthquake within a few seconds, but estimating the source-focal mechanism may require minutes to hours. Here we present an earthquake search engine, similar to a web search engine, that we developed by applying a computer fast search method to a large seismogram database to find waveforms that best fit the input data. Our method is several thousand times faster than an exact search. For an Mw 5.9 earthquake on 8 March 2012 in Xinjiang, China, the search engine can infer the earthquake's parameters in <1 s after receiving the long-period surface wave data.
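The matching criterion behind such a seismogram search engine can be illustrated with a brute-force sketch. The paper's actual fast-search indexing (which yields the reported thousands-fold speed-up over an exact scan) is not reproduced here; this hypothetical version only shows what "best-fitting waveform" means, using normalized zero-lag cross-correlation on made-up synthetic data:

```python
import numpy as np

def best_match(query, database):
    """Return (index, correlation) of the database waveform most
    similar to the query, by normalized zero-lag cross-correlation.

    This is an exact linear scan; a real search engine indexes the
    database so that lookup is far faster than visiting every record.
    """
    qz = query - query.mean()
    qz = qz / (np.linalg.norm(qz) + 1e-12)
    best_i, best_cc = -1, -np.inf
    for i, w in enumerate(database):
        wz = w - w.mean()
        wz = wz / (np.linalg.norm(wz) + 1e-12)
        cc = float(qz @ wz)          # correlation in [-1, 1]
        if cc > best_cc:
            best_i, best_cc = i, cc
    return best_i, best_cc

# Tiny synthetic "database" of waveforms (illustrative only):
t = np.linspace(0.0, 10.0, 500)
db = [np.sin(2 * np.pi * f * t) for f in (0.2, 0.5, 1.0)]
idx, cc = best_match(np.sin(2 * np.pi * 0.5 * t), db)
print(idx)   # → 1
```

In the real system each database entry carries pre-computed source parameters (location, magnitude, focal mechanism), so retrieving the best-fitting waveform immediately yields the new event's parameters.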

  7. Real-time earthquake monitoring using a search engine method

    PubMed Central

    Zhang, Jie; Zhang, Haijiang; Chen, Enhong; Zheng, Yi; Kuang, Wenhuan; Zhang, Xiong

    2014-01-01

    When an earthquake occurs, seismologists want to use recorded seismograms to infer its location, magnitude and source-focal mechanism as quickly as possible. If such information could be determined immediately, timely evacuations and emergency actions could be undertaken to mitigate earthquake damage. Current advanced methods can report the initial location and magnitude of an earthquake within a few seconds, but estimating the source-focal mechanism may require minutes to hours. Here we present an earthquake search engine, similar to a web search engine, that we developed by applying a computer fast search method to a large seismogram database to find waveforms that best fit the input data. Our method is several thousand times faster than an exact search. For an Mw 5.9 earthquake on 8 March 2012 in Xinjiang, China, the search engine can infer the earthquake’s parameters in <1 s after receiving the long-period surface wave data. PMID:25472861

  8. Reduction of earthquake risk in the united states: Bridging the gap between research and practice

    USGS Publications Warehouse

    Hays, W.W.

    1998-01-01

Continuing efforts under the auspices of the National Earthquake Hazards Reduction Program are under way to improve earthquake risk assessment and risk management in earthquake-prone regions of Alaska, California, Nevada, Washington, Oregon, Arizona, Utah, Wyoming, and Idaho, the New Madrid and Wabash Valley seismic zones in the central United States, the southeastern and northeastern United States, Puerto Rico, Virgin Islands, Guam, and Hawaii. Geologists, geophysicists, seismologists, architects, engineers, urban planners, emergency managers, health care specialists, and policymakers are having to work at the margins of their disciplines to bridge the gap between research and practice and to provide a social, technical, administrative, political, legal, and economic basis for changing public policies and professional practices in communities where the earthquake risk is unacceptable. © 1998 IEEE.

  9. Comparison of the sand liquefaction estimated based on codes and practical earthquake damage phenomena

    NASA Astrophysics Data System (ADS)

    Fang, Yi; Huang, Yahong

    2017-12-01

Estimating sand liquefaction potential according to design codes is an important part of geotechnical design. Sometimes, however, the results fail to conform to damage observed in actual earthquakes. Based on the damage from the Tangshan earthquake and on engineering geological conditions, three typical sites were chosen. The sand liquefaction probability was evaluated at the three sites using the method in the Code for Seismic Design of Buildings, and the results were compared with the sand liquefaction phenomena observed in the earthquake. The results show that the difference between code-based liquefaction estimates and actual earthquake damage is mainly attributable to two aspects. The primary reasons include the disparity between the seismic fortification intensity and the actual seismic shaking, changes in groundwater level, the thickness of the overlying non-liquefied soil layer, local site effects, and personal error. Meanwhile, although the judgment methods in the codes exhibit a certain universality, they are another source of the difference, owing to the limitations of the basic data and qualitative anomalies in the judgment formulas.
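Code-based screening of the kind this abstract evaluates typically compares a measured SPT blow count against a critical value that grows with depth below the water table and shrinks with clay content. The sketch below is a generic, illustrative form only: its coefficients are assumptions, not the calibrated factors of the Code for Seismic Design of Buildings, which also ties the base blow count to the seismic fortification intensity:

```python
def liquefaction_screen(n_measured, depth_m, water_table_m,
                        n0=10.0, clay_percent=3.0):
    """Generic SPT-based liquefaction screen (illustrative only).

    Flags a saturated sand layer as potentially liquefiable when the
    measured SPT blow count falls below a depth- and clay-dependent
    critical value. Coefficients here are assumed for illustration.
    """
    if depth_m <= water_table_m:      # above the water table: not saturated
        return False
    n_cr = (n0 * (0.9 + 0.1 * (depth_m - water_table_m))
               * (3.0 / max(clay_percent, 3.0)) ** 0.5)
    return n_measured < n_cr

# Loose saturated sand at 5 m with the water table at 1 m is flagged;
# a denser layer at the same depth is not:
print(liquefaction_screen(8, 5.0, 1.0), liquefaction_screen(20, 5.0, 1.0))
```

The abstract's point is that even a well-calibrated formula of this shape can disagree with field behavior when the design intensity, groundwater level, or overlying non-liquefied layer differs from the conditions at the time of the earthquake.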

  10. Designing an Earthquake-Proof Art Museum: An Arts- and Engineering-Integrated Science Lesson

    ERIC Educational Resources Information Center

    Carignan, Anastasia; Hussain, Mahjabeen

    2016-01-01

    In this practical arts-integrated science and engineering lesson, an inquiry-based approach was adopted to teach a class of fourth graders in a Midwest elementary school about the scientific concepts of plate tectonics and earthquakes. Lessons were prepared following the 5 E instructional model. Next Generation Science Standards (4-ESS3-2) and the…

  11. Revolutionising engineering education in the Middle East region to promote earthquake-disaster mitigation

    NASA Astrophysics Data System (ADS)

    Baytiyeh, Hoda; Naja, Mohamad K.

    2014-09-01

Due to the high market demand for professional engineers in the Arab oil-producing countries, the appetite of Middle Eastern students for high-paying jobs and challenging careers in engineering has sharply increased. As a result, engineering programmes are providing opportunities for more students to enrol on engineering courses through lenient admission policies that do not compromise academic standards. This strategy has generated an influx of students who must be carefully educated to enhance their professional knowledge and social capital to assist in future earthquake-disaster risk-reduction efforts. However, the majority of Middle Eastern engineering students are unaware of the value of their acquired engineering skills and knowledge in building the resilience of their communities to earthquake disasters. As the majority of the countries in the Middle East are exposed to seismic hazards and are vulnerable to destructive earthquakes, engineers have become indispensable assets and the first line of defence against earthquake threats. This article highlights the contributions of some engineering innovations in advancing technologies and techniques for effective disaster mitigation, and it calls for the incorporation of earthquake-disaster-mitigation education into academic engineering programmes in the Eastern Mediterranean region.

  12. 10 CFR Appendix S to Part 50 - Earthquake Engineering Criteria for Nuclear Power Plants

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 10 Energy 1 2012-01-01 2012-01-01 false Earthquake Engineering Criteria for Nuclear Power Plants S... FACILITIES Pt. 50, App. S Appendix S to Part 50—Earthquake Engineering Criteria for Nuclear Power Plants... applicant or holder whose construction permit was issued before January 10, 1997, the earthquake engineering...

  13. 10 CFR Appendix S to Part 50 - Earthquake Engineering Criteria for Nuclear Power Plants

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 10 Energy 1 2013-01-01 2013-01-01 false Earthquake Engineering Criteria for Nuclear Power Plants S... FACILITIES Pt. 50, App. S Appendix S to Part 50—Earthquake Engineering Criteria for Nuclear Power Plants... applicant or holder whose construction permit was issued before January 10, 1997, the earthquake engineering...

  14. 10 CFR Appendix S to Part 50 - Earthquake Engineering Criteria for Nuclear Power Plants

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 10 Energy 1 2011-01-01 2011-01-01 false Earthquake Engineering Criteria for Nuclear Power Plants S... FACILITIES Pt. 50, App. S Appendix S to Part 50—Earthquake Engineering Criteria for Nuclear Power Plants... applicant or holder whose construction permit was issued before January 10, 1997, the earthquake engineering...

  15. 10 CFR Appendix S to Part 50 - Earthquake Engineering Criteria for Nuclear Power Plants

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 10 Energy 1 2014-01-01 2014-01-01 false Earthquake Engineering Criteria for Nuclear Power Plants S... FACILITIES Pt. 50, App. S Appendix S to Part 50—Earthquake Engineering Criteria for Nuclear Power Plants... applicant or holder whose construction permit was issued before January 10, 1997, the earthquake engineering...

  16. Introduction: seismology and earthquake engineering in Mexico and Central and South America.

    USGS Publications Warehouse

    Espinosa, A.F.

    1982-01-01

The results from seismological studies that are used by the engineering community are just one of the benefits obtained from research aimed at mitigating the earthquake hazard. In this issue of the Earthquake Information Bulletin, current programs in seismology and earthquake engineering, seismic networks, future plans, and some of the cooperative programs with different international organizations are described by Latin American seismologists. The article describes the development of seismology in Latin America and the seismological interests of the OAS. - P.N. Chroston

  17. Welcome to Pacific Earthquake Engineering Research Center - PEER

    Science.gov Websites

Pacific Earthquake Engineering Research Center (PEER) home page snippet: research project highlights ("…Triggering and Effects at Silty Soil Sites," "Dissipative Base…") and upcoming events, including Geotechnical Earthquake Engineering and Soil Dynamics V, June 10-13, 2018.

  18. Discussion of New Approaches to Medium-Short-Term Earthquake Forecast in Practice of The Earthquake Prediction in Yunnan

    NASA Astrophysics Data System (ADS)

    Hong, F.

    2017-12-01

After years of practice of earthquake prediction in the Yunnan area, it is widely considered that fixed-point earthquake precursory anomalies mainly reflect field information. An increase in the amplitude and number of precursory anomalies can help determine the origin time of earthquakes; however, it is difficult to establish a spatial relationship between earthquakes and precursory anomalies, so precursory anomalies can hardly be used to predict the locations of earthquakes. Past practice has shown that seismic activity is superior to precursory anomalies for predicting earthquake locations, because increased seismicity was observed before 80% of M=6.0 earthquakes in the Yunnan area. Mobile geomagnetic anomalies have also proved helpful for predicting earthquake locations in recent years; for instance, the occurrence time and area forecast from the 1-year-scale geomagnetic anomalies before the M6.5 Ludian earthquake in 2014 were shorter and smaller than those derived from the seismicity enhancement region. Based on this past work, the author believes that the medium-short-term earthquake forecast level, as well as the objective understanding of seismogenic mechanisms, could be substantially improved by densely deploying observation arrays and capturing the dynamic process of physical property changes in the enhancement regions of medium to small earthquakes.

  19. GeoMO 2008--geotechnical earthquake engineering : site response.

    DOT National Transportation Integrated Search

    2008-10-01

The theme of GeoMO 2008 has recently become of greater interest to the Midwest civil engineering community due to the perceived earthquake risks and new code requirements. The constant seismic reminder of the New Madrid Seismic Zone and new USGS hazard ...

  20. The Lice, Turkey, earthquake of September 6, 1975; a preliminary engineering investigation

    USGS Publications Warehouse

    Yanev, P. I.

    1976-01-01

    The Fifth European Conference on Earthquake Engineering was held on September 22 through 25 in Istanbul, Turkey. The opening speech by the Honorable H. E. Nurettin Ok, Minister of Reconstruction and Resettlement of Turkey, introduced the several hundred delegates to the realities of earthquake hazards in Turkey:

  1. NGA West 2 | Pacific Earthquake Engineering Research Center

    Science.gov Websites

Description of the NGA-West2 project, a multi-year research program to improve Next Generation Attenuation (NGA) models for active tectonic regions. Topics include modeling of directivity and directionality in earthquake engineering; verification of NGA-West models; epistemic uncertainty; and evaluation of soil amplification factors in NGA models versus NEHRP site factors.

  2. EU H2020 SERA: Seismology and Earthquake Engineering Research Infrastructure Alliance for Europe

    NASA Astrophysics Data System (ADS)

    Giardini, Domenico; Saleh, Kauzar; SERA Consortium, the

    2017-04-01

SERA - Seismology and Earthquake Engineering Research Infrastructure Alliance for Europe - is a new infrastructure project awarded in the last Horizon 2020 call for Integrating Activities for Advanced Communities (INFRAIA-01-2016-2017). Building on precursor projects such as NERA, SHARE, NERIES, and SERIES, SERA is expected to contribute significantly to the access of data, services and research infrastructures, and to develop innovative solutions in seismology and earthquake engineering, with the overall objective of reducing exposure to risks associated with natural and anthropogenic earthquakes. For instance, SERA will revise the European seismic hazard reference model for input into the current revision of Eurocode 8 on the seismic design of buildings; we also foresee developing the first comprehensive framework for seismic risk modeling at the European scale and developing new standards for future experimental observations and instruments for earthquake engineering and seismology. To that aim, SERA engages 31 institutions across Europe with leading expertise in the operation of research facilities, monitoring infrastructures, data repositories and experimental facilities in the fields of seismology, anthropogenic hazards and earthquake engineering. SERA comprises 26 activities, including 5 Networking Activities (NA) to improve the availability and access of data through enhanced community coordination and pooling of resources, 6 Joint Research Activities (JRA) aimed at creating new European standards for the optimal use of the data collected by the European infrastructures, Virtual Access (VA) to the 5 main European services for seismology and engineering seismology, and Trans-national Access (TA) to 10 high-class experimental facilities for earthquake engineering and seismology in Europe. In fact, around 50% of the SERA resources will be dedicated to virtual and transnational access. SERA and EPOS (European Plate Observing System, a European Research

  3. Simulating and analyzing engineering parameters of Kyushu Earthquake, Japan, 1997, by empirical Green function method

    NASA Astrophysics Data System (ADS)

    Li, Zongchao; Chen, Xueliang; Gao, Mengtan; Jiang, Han; Li, Tiefei

    2017-03-01

Earthquake engineering parameters are very important in the engineering field, especially in anti-seismic design and earthquake disaster prevention. In this study, we focus on simulating earthquake engineering parameters by the empirical Green's function method. The simulated earthquake (MJMA 6.5) occurred in Kyushu, Japan, in 1997. Horizontal ground motion is separated into fault-parallel and fault-normal components in order to assess the characteristics of the two direction components. The broadband frequency range of the ground motion simulation is 0.1 to 20 Hz. By comparing observed parameters with synthetic parameters, we analyzed the distribution characteristics of the earthquake engineering parameters. The simulated waveforms show high similarity with the observed waveforms. We found the following. (1) Near-field PGA attenuates rapidly in all directions, with strip-like radiation patterns in the fault-parallel component, while the radiation pattern of the fault-normal component is circular; PGV agrees well between observed and synthetic records but has different distribution characteristics in the two components. (2) Rupture direction and terrain have a large influence on the 90% significant duration. (3) Arias intensity attenuates with increasing epicentral distance, and observed values agree closely with synthetic values. (4) The predominant period differs markedly across parts of Kyushu in the fault-normal component; it is affected greatly by site conditions. (5) Most parameters have good reference value where the hypocentral distance is less than 35 km. (6) The GOF values of all these parameters are generally higher than 45, which indicates a good result according to Olsen's classification criterion, although not all parameters fit well. Given these synthetic ground motion parameters, seismic hazard analysis and earthquake disaster analysis can be conducted for future urban planning.
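Several of the engineering parameters compared in this record have standard definitions that are easy to compute from an acceleration time series. The sketch below (a minimal illustration, not the study's code) computes PGA, Arias intensity Ia = π/(2g) ∫ a(t)² dt, and the 5-95% significant duration, i.e. the window containing the central 90% of the cumulative Arias intensity:

```python
import numpy as np

def engineering_parameters(acc, dt, g=9.81):
    """PGA, Arias intensity, and 5-95% significant duration of a record.

    acc : acceleration time series in m/s^2; dt : sample interval in s.
    """
    pga = float(np.max(np.abs(acc)))
    # Cumulative Arias intensity: Ia(t) = pi/(2g) * integral of a^2
    cum = np.cumsum(np.asarray(acc) ** 2) * dt * np.pi / (2.0 * g)
    ia = float(cum[-1])
    t = np.arange(len(acc)) * dt
    t5 = t[np.searchsorted(cum, 0.05 * ia)]   # 5% of Ia accumulated
    t95 = t[np.searchsorted(cum, 0.95 * ia)]  # 95% of Ia accumulated
    return pga, ia, t95 - t5

# Illustrative synthetic record: flat unit acceleration for 10 s.
pga, ia, dur = engineering_parameters(np.ones(1000), 0.01)
```

For a real record the significant duration concentrates around the strong-shaking phase, which is why rupture direction and terrain, as noted above, can shift it substantially.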

  4. PEER - National Information Service for Earthquake Engineering - NISEE

    Science.gov Websites

National Information Service for Earthquake Engineering (NISEE) page. The NISEE/PEER Library is an affiliated library located at the Field Station, five miles from the main Berkeley campus. Hours: Monday - Friday, 9:00 am - 1:00 pm; open to the public.

  5. Advancing Integrated STEM Learning through Engineering Design: Sixth-Grade Students' Design and Construction of Earthquake Resistant Buildings

    ERIC Educational Resources Information Center

    English, Lyn D.; King, Donna; Smeed, Joanna

    2017-01-01

    As part of a 3-year longitudinal study, 136 sixth-grade students completed an engineering-based problem on earthquakes involving integrated STEM learning. Students employed engineering design processes and STEM disciplinary knowledge to plan, sketch, then construct a building designed to withstand earthquake damage, taking into account a number of…

  6. Introduction: seismology and earthquake engineering in Central and South America.

    USGS Publications Warehouse

    Espinosa, A.F.

    1983-01-01

    Reports the state-of-the-art in seismology and earthquake engineering that is being advanced in Central and South America. Provides basic information on seismological station locations in Latin America and some of the programmes in strong-motion seismology, as well as some of the organizations involved in these activities.-from Author

  7. Teaching Engineering Practices

    NASA Astrophysics Data System (ADS)

    Cunningham, Christine M.; Carlsen, William S.

    2014-03-01

    Engineering is featured prominently in the Next Generation Science Standards (NGSS) and related reform documents, but how its nature and methods are described is problematic. This paper is a systematic review and critique of that representation, and proposes that the disciplinary core ideas of engineering (as described in the NGSS) can be disregarded safely if the practices of engineering are better articulated and modeled through student engagement in engineering projects. A clearer distinction between science and engineering practices is outlined, and prior research is described that suggests that precollege engineering design can strengthen children's understandings about scientific concepts. However, a piecemeal approach to teaching engineering practices is unlikely to result in students understanding engineering as a discipline. The implications for science teacher education are supplemented with lessons learned from a number of engineering education professional development projects.

  8. The Electronic Encyclopedia of Earthquakes

    NASA Astrophysics Data System (ADS)

    Benthien, M.; Marquis, J.; Jordan, T.

    2003-12-01

    The Electronic Encyclopedia of Earthquakes is a collaborative project of the Southern California Earthquake Center (SCEC), the Consortia of Universities for Research in Earthquake Engineering (CUREE) and the Incorporated Research Institutions for Seismology (IRIS). This digital library organizes earthquake information online as a partner with the NSF-funded National Science, Technology, Engineering and Mathematics (STEM) Digital Library (NSDL) and the Digital Library for Earth System Education (DLESE). When complete, information and resources for over 500 Earth science and engineering topics will be included, with connections to curricular materials useful for teaching Earth Science, engineering, physics and mathematics. Although conceived primarily as an educational resource, the Encyclopedia is also a valuable portal to anyone seeking up-to-date earthquake information and authoritative technical sources. "E3" is a unique collaboration among earthquake scientists and engineers to articulate and document a common knowledge base with a shared terminology and conceptual framework. It is a platform for cross-training scientists and engineers in these complementary fields and will provide a basis for sustained communication and resource-building between major education and outreach activities. For example, the E3 collaborating organizations have leadership roles in the two largest earthquake engineering and earth science projects ever sponsored by NSF: the George E. Brown Network for Earthquake Engineering Simulation (CUREE) and the EarthScope Project (IRIS and SCEC). The E3 vocabulary and definitions are also being connected to a formal ontology under development by the SCEC/ITR project for knowledge management within the SCEC Collaboratory. The E3 development system is now fully operational, 165 entries are in the pipeline, and the development teams are capable of producing 20 new, fully reviewed encyclopedia entries each month. Over the next two years teams will

  9. Performance-based seismic design of nonstructural building components: The next frontier of earthquake engineering

    NASA Astrophysics Data System (ADS)

    Filiatrault, Andre; Sullivan, Timothy

    2014-08-01

    With the development and implementation of performance-based earthquake engineering, harmonization of performance levels between structural and nonstructural components becomes vital. Even if the structural components of a building achieve a continuous or immediate occupancy performance level after a seismic event, failure of architectural, mechanical or electrical components can lower the performance level of the entire building system. This reduction in performance caused by the vulnerability of nonstructural components has been observed during recent earthquakes worldwide. Moreover, nonstructural damage has limited the functionality of critical facilities, such as hospitals, following major seismic events. The investment in nonstructural components and building contents is far greater than that of structural components and framing. Therefore, it is not surprising that in many past earthquakes, losses from damage to nonstructural components have exceeded losses from structural damage. Furthermore, the failure of nonstructural components can become a safety hazard or can hamper the safe movement of occupants evacuating buildings, or of rescue workers entering buildings. In comparison to structural components and systems, there is relatively limited information on the seismic design of nonstructural components. Basic research work in this area has been sparse, and the available codes and guidelines are usually, for the most part, based on past experiences, engineering judgment and intuition, rather than on objective experimental and analytical results. Often, design engineers are forced to start almost from square one after each earthquake event: to observe what went wrong and to try to prevent repetitions. This is a consequence of the empirical nature of current seismic regulations and guidelines for nonstructural components. This review paper summarizes current knowledge on the seismic design and analysis of nonstructural building components, identifying major

  10. Potential utilization of the NASA/George C. Marshall Space Flight Center in earthquake engineering research

    NASA Technical Reports Server (NTRS)

    Scholl, R. E. (Editor)

    1979-01-01

    Earthquake engineering research capabilities of the National Aeronautics and Space Administration (NASA) facilities at George C. Marshall Space Flight Center (MSFC), Alabama, were evaluated. The results indicate that the NASA/MSFC facilities and supporting capabilities offer unique opportunities for conducting earthquake engineering research. Specific features that are particularly attractive for large scale static and dynamic testing of natural and man-made structures include the following: large physical dimensions of buildings and test bays; high loading capacity; wide range and large number of test equipment and instrumentation devices; multichannel data acquisition and processing systems; technical expertise for conducting large-scale static and dynamic testing; sophisticated techniques for systems dynamics analysis, simulation, and control; and capability for managing large-size and technologically complex programs. Potential uses of the facilities for near and long term test programs to supplement current earthquake research activities are suggested.

  11. The HayWired earthquake scenario—Engineering implications

    USGS Publications Warehouse

    Detweiler, Shane T.; Wein, Anne M.

    2018-04-18

    The HayWired Earthquake Scenario—Engineering Implications is the second volume of U.S. Geological Survey (USGS) Scientific Investigations Report 2017–5013, which describes the HayWired scenario, developed by USGS and its partners. The scenario is a hypothetical yet scientifically realistic earthquake sequence that is being used to better understand hazards for the San Francisco Bay region during and after a magnitude-7 earthquake (mainshock) on the Hayward Fault and its aftershocks.Analyses in this volume suggest that (1) 800 deaths and 16,000 nonfatal injuries result from shaking alone, plus property and direct business interruption losses of more than $82 billion from shaking, liquefaction, and landslides; (2) the building code is designed to protect lives, but even if all buildings in the region complied with current building codes, 0.4 percent could collapse, 5 percent could be unsafe to occupy, and 19 percent could have restricted use; (3) people expect, prefer, and would be willing to pay for greater resilience of buildings; (4) more than 22,000 people could require extrication from stalled elevators, and more than 2,400 people could require rescue from collapsed buildings; (5) the average east-bay resident could lose water service for 6 weeks, some for as long as 6 months; (6) older steel-frame high-rise office buildings and new reinforced-concrete residential buildings in downtown San Francisco and Oakland could be unusable for as long as 10 months; (7) about 450 large fires could result in a loss of residential and commercial building floor area equivalent to more than 52,000 single-family homes and cause property (building and content) losses approaching $30 billion; and (8) combining earthquake early warning (ShakeAlert) with “drop, cover, and hold on” actions could prevent as many as 1,500 nonfatal injuries out of 18,000 total estimated nonfatal injuries from shaking and liquefaction hazards.

  12. Response of a 14-story Anchorage, Alaska, building in 2002 to two close earthquakes and two distant Denali fault earthquakes

    USGS Publications Warehouse

    Celebi, M.

    2004-01-01

The recorded responses of an Anchorage, Alaska, building during four significant earthquakes that occurred in 2002 are studied. Two earthquakes, including the 3 November 2002 M7.9 Denali fault earthquake, with epicenters approximately 275 km from the building, generated long trains of long-period (>1 s) surface waves. The other two smaller earthquakes occurred at subcrustal depths practically beneath Anchorage and produced higher frequency motions. These two pairs of earthquakes have different impacts on the response of the building. Higher modes are more pronounced in the building response during the smaller nearby events. The building responses indicate that the close-coupling of translational and torsional modes causes a significant beating effect. It is also possible that there is some resonance occurring due to the site frequency being close to the structural frequency. Identification of dynamic characteristics and behavior of buildings can provide important lessons for future earthquake-resistant designs and retrofit of existing buildings. © 2004, Earthquake Engineering Research Institute.

  13. Living with earthquakes - development and usage of earthquake-resistant construction methods in European and Asian Antiquity

    NASA Astrophysics Data System (ADS)

    Kázmér, Miklós; Major, Balázs; Hariyadi, Agus; Pramumijoyo, Subagyo; Ditto Haryana, Yohanes

    2010-05-01

Earthquakes are among the most feared events of nature due to their unexpected occurrence, against which no spiritual means offer protection. The only way of preserving life and property is applying earthquake-resistant construction methods. Ancient Greek architects of public buildings applied steel clamps embedded in lead casing to hold together columns and masonry walls during frequent earthquakes in the Aegean region. Elastic steel provided strength, while plastic lead casing absorbed minor shifts of blocks without fracturing the rigid stone. Romans invented concrete and built buildings of all sizes as single, inflexible units. The masonry surrounding and decorating the concrete core of a wall did not bear load. Concrete resisted minor shaking, yielding only to forces higher than its fracture limits. Roman building traditions survived the Dark Ages, and 12th-century Crusader castles erected in earthquake-prone Syria survive until today in reasonably good condition. Concrete and steel clamping persisted side by side in the Roman Empire. Concrete was used for cheap construction compared with masonry. Applying lead-encased steel increased costs and was avoided whenever possible. Columns of the various forums in Italian Pompeii mostly lack steel fittings despite being situated in a well-known earthquake-prone area. Whether the frequent recurrence of earthquakes in the Naples region was known to the inhabitants of Pompeii might be a matter of debate. Seemingly the shock of the AD 62 earthquake was not enough to prompt application of well-known protective engineering methods throughout the reconstruction of the city before the AD 79 volcanic catastrophe. An independent engineering tradition developed on the island of Java (Indonesia). The mortar-less construction technique of 8-9th-century Hindu masonry shrines around Yogyakarta would allow scattering of blocks during earthquakes. To prevent dilapidation, an intricate mortise-and-tenon system was carved into adjacent faces of blocks. Only the

  14. Practical Applications for Earthquake Scenarios Using ShakeMap

    NASA Astrophysics Data System (ADS)

    Wald, D. J.; Worden, B.; Quitoriano, V.; Goltz, J.

    2001-12-01

In planning and coordinating emergency response, utilities, local government, and other organizations are best served by conducting training exercises based on realistic earthquake situations, ones that they are most likely to face. Scenario earthquakes can fill this role; they can be generated for any geologically plausible earthquake or for actual historic earthquakes. ShakeMap Web pages now display selected earthquake scenarios (www.trinet.org/shake/archive/scenario/html), and more events will be added as they are requested and produced. We will discuss the methodology and provide practical examples where these scenarios are used directly for risk reduction. Given a selected event, we have developed tools that make it relatively easy to generate a ShakeMap earthquake scenario using the following steps: 1) assume a particular fault or fault segment will (or did) rupture over a certain length; 2) determine the magnitude of the earthquake based on the assumed rupture dimensions; 3) estimate the ground shaking at all locations in the chosen area around the fault; and 4) represent these motions visually by producing ShakeMaps and generating ground motion input for loss estimation modeling (e.g., FEMA's HAZUS). At present, ground motions are estimated using empirical attenuation relationships to estimate peak ground motions on rock conditions. We then correct the amplitude at each location based on the local site soil (NEHRP) conditions, as in the general ShakeMap interpolation scheme. Finiteness is included explicitly, but directivity enters only through the empirical relations. Although current ShakeMap earthquake scenarios are empirically based, substantial improvements in numerical ground motion modeling have been made in recent years. However, loss estimation tools, HAZUS for example, typically require relatively high frequency (3 Hz) input for predicting losses, above the range of frequencies successfully modeled to date. Achieving full-synthetic ground motion
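Steps 3 and 4 of the scenario workflow, estimating rock motion from an empirical attenuation relation and then applying a site-class correction, can be sketched as follows. The coefficients and NEHRP amplification factors below are illustrative placeholders, not values from ShakeMap or any published relation:

```python
import math

# Hypothetical attenuation coefficients (intercept, magnitude scaling,
# geometric spreading, pseudo-depth in km) -- for illustration only.
A, B, C, H = -1.0, 0.9, 1.3, 6.0

# Crude NEHRP site-class multipliers (assumed, not code values).
SITE_AMP = {"B": 1.0, "C": 1.3, "D": 1.6, "E": 2.0}

def scenario_pga(mag, r_km, nehrp="B"):
    """Toy peak-ground-acceleration estimate (arbitrary units):
    an empirical-style rock attenuation term, then a site correction."""
    ln_pga = A + B * mag - C * math.log(math.hypot(r_km, H))
    return math.exp(ln_pga) * SITE_AMP[nehrp]
```

The functional form (log-linear in magnitude, logarithmic decay with distance, multiplicative soil amplification) mirrors the structure of empirical relations; any real scenario would substitute a published ground motion prediction equation.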

  15. Earthquake impact scale

    USGS Publications Warehouse

    Wald, D.J.; Jaiswal, K.S.; Marano, K.D.; Bausch, D.

    2011-01-01

With the advent of the USGS prompt assessment of global earthquakes for response (PAGER) system, which rapidly assesses earthquake impacts, U.S. and international earthquake responders are reconsidering their automatic alert and activation levels and response procedures. To help facilitate rapid and appropriate earthquake response, an Earthquake Impact Scale (EIS) is proposed on the basis of two complementary criteria. One, based on the estimated cost of damage, is most suitable for domestic events; the other, based on estimated ranges of fatalities, is generally more appropriate for global events, particularly in developing countries. Simple thresholds, derived from systematic analysis of past earthquake impacts and associated response levels, are quite effective in communicating predicted impact and the response needed after an event through alerts of green (little or no impact), yellow (regional impact and response), orange (national-scale impact and response), and red (international response). Corresponding fatality thresholds for yellow, orange, and red alert levels are 1, 100, and 1,000, respectively. For damage impact, yellow, orange, and red thresholds are triggered by estimated losses reaching $1M, $100M, and $1B, respectively. The rationale for a dual approach to earthquake alerting stems from the recognition that relatively high fatalities, injuries, and homelessness predominate in countries where local building practices typically lend themselves to high collapse and casualty rates, and these impacts lead to prioritization of international response. In contrast, financial and overall societal impacts often trigger the level of response in regions or countries where prevalent earthquake-resistant construction practices greatly reduce building collapse and resulting fatalities. Any newly devised alert, whether economic- or casualty-based, should be intuitive and consistent with established lexicons and procedures. Useful alerts should
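The dual criteria above reduce to simple threshold checks; a minimal sketch, where the thresholds come from the abstract but the function name and interface are invented for illustration:

```python
def eis_alert(fatalities=None, losses_usd=None):
    """Alert color under the dual EIS criteria: 1/100/1,000 estimated
    fatalities, or $1M/$100M/$1B estimated losses, for yellow/orange/red.
    The more severe of the two criteria governs."""
    levels = ["green", "yellow", "orange", "red"]

    def level(value, thresholds):
        # Count how many thresholds the estimate meets or exceeds.
        return sum(value >= t for t in thresholds) if value is not None else 0

    fatality_level = level(fatalities, (1, 100, 1000))
    damage_level = level(losses_usd, (1e6, 1e8, 1e9))
    return levels[max(fatality_level, damage_level)]
```

Taking the maximum of the two levels reflects the abstract's rationale: fatality-driven alerts dominate where building collapse rates are high, while loss-driven alerts dominate where earthquake-resistant construction keeps casualties low.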

  16. Activist engineering: changing engineering practice by deploying praxis.

    PubMed

    Karwat, Darshan M A; Eagle, Walter E; Wooldridge, Margaret S; Princen, Thomas E

    2015-02-01

In this paper, we reflect on current notions of engineering practice by examining some of the motives for engineered solutions to the problem of climate change. We draw on fields such as science and technology studies, the philosophy of technology, and environmental ethics to highlight how dominant notions of apoliticism and ahistoricity are ingrained in contemporary engineering practice. We argue that a solely technological response to climate change does not question the social, political, and cultural tenet of infinite material growth, one of the root causes of climate change. In response to contemporary engineering practice, we define an activist engineer as someone who not only can provide specific engineered solutions, but who also steps back from their work and tackles the question, What is the real problem and does this problem "require" an engineering intervention? Solving complex problems like climate change requires radical cultural change, and a significant obstacle is educating engineers about how to conceive of and create "authentic alternatives," that is, solutions that differ from the paradigm of "technologically improving" our way out of problems. As a means to realize radically new solutions, we investigate how engineers might (re)deploy the concept of praxis, which raises awareness in engineers of the inherent politics of technological design. Praxis empowers engineers with a more comprehensive understanding of problems, and thus transforms technologies, when appropriate, into more socially just and ecologically sensitive interventions. Most importantly, praxis also raises a radical alternative rarely considered: not "engineering a solution." Activist engineering offers a contrasting method to contemporary engineering practice and leads toward social justice and ecological protection through problem solving by asking not, How will we technologize our way out of the problems we face? but instead, What really needs to be done?

  17. Principles for selecting earthquake motions in engineering design of large dams

    USGS Publications Warehouse

    Krinitzsky, E.L.; Marcuson, William F.

    1983-01-01

This report gives a synopsis of the various tools and techniques used in selecting earthquake ground motion parameters for large dams. It presents 18 charts giving newly developed relations for acceleration, velocity, and duration versus site earthquake intensity for near- and far-field hard and soft sites and earthquakes having magnitudes above and below 7. The material for this report is based on procedures developed at the Waterways Experiment Station. Although these procedures are suggested primarily for large dams, they may also be applicable to other facilities. Because no standard procedure exists for selecting earthquake motions in engineering design of large dams, a number of precautions are presented to guide users. The selection of earthquake motions depends on which of two types of engineering analyses is performed. A pseudostatic analysis uses a coefficient usually obtained from an appropriate contour map, whereas a dynamic analysis uses either accelerograms assigned to a site or specified response spectra. Each type of analysis requires significantly different input motions. All selections of design motions must allow for the lack of representative strong-motion records, especially near-field motions from earthquakes of magnitude 7 and greater, as well as an enormous spread in the available data. Limited data must be projected and their spread bracketed in order to fill in the gaps and to ensure that there will be no surprises. Because each site may have differing special characteristics in its geology, seismic history, attenuation, recurrence, interpreted maximum events, etc., an integrated approach gives best results. Each part of the site investigation requires a number of decisions. In some cases, a 'least work' approach may be suitable, simply assuming the worst of several possibilities and testing for it. Because there are no standard procedures to follow, multiple approaches are useful.
For example, peak motions at
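The pseudostatic analysis mentioned in the abstract represents the earthquake as a constant horizontal force (a seismic coefficient k times the weight) in a limit-equilibrium check. A minimal sketch for a planar sliding block, using the standard limit-equilibrium form with illustrative parameter values (this is a generic textbook formulation, not the report's procedure):

```python
import math

def pseudostatic_fs(weight, beta_deg, k, c=0.0, phi_deg=30.0, base_length=1.0):
    """Factor of safety of a block on a plane inclined at beta_deg,
    with cohesion c, friction angle phi_deg, and horizontal seismic
    coefficient k. The horizontal inertia force k*W reduces the normal
    force on the sliding plane and adds to the driving force."""
    b = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    normal = weight * math.cos(b) - k * weight * math.sin(b)
    driving = weight * math.sin(b) + k * weight * math.cos(b)
    resisting = c * base_length + normal * math.tan(phi)
    return resisting / driving
```

With k = 0 and no cohesion this reduces to the familiar tan(phi)/tan(beta); increasing k lowers the factor of safety, which is why the choice of coefficient (e.g. from a contour map) dominates the result of a pseudostatic analysis.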

  18. Economic consequences of earthquakes: bridging research and practice with HayWired

    NASA Astrophysics Data System (ADS)

    Wein, A. M.; Kroll, C.

    2016-12-01

    The U.S. Geological Survey partners with organizations and experts to develop multiple hazard scenarios. The HayWired earthquake scenario refers to a rupture of the Hayward fault in the Bay Area of California and addresses the potential chaos related to interconnectedness at many levels: the fault afterslip and aftershocks, interdependencies of lifelines, wired/wireless technology, communities at risk, and ripple effects throughout today's digital economy. The scenario is intended for diverse audiences. HayWired analyses translate earthquake hazards (surface rupture, ground shaking, liquefaction, landslides) into physical engineering and environmental health impacts, and into societal consequences. Damages to life and property and lifeline service disruptions are direct causes of business interruption. Economic models are used to estimate the economic impacts and resilience in the regional economy. The objective of the economic analysis is to inform policy discourse about economic resilience at all three levels of the economy: macro, meso, and micro. Stakeholders include businesses, economic development, and community leaders. Previous scenario analyses indicate the size of an event: large earthquakes and large winter storms are both "big ones" for California. They motivate actions to reduce the losses from fire following earthquake and water supply outages. They show the effect that resilience can have on reducing economic losses. Evaluators find that stakeholders learned the most about the economic consequences.

  19. Analog earthquakes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hofmann, R.B.

    1995-09-01

Analogs are used to understand complex or poorly understood phenomena for which little data may be available at the actual repository site. Earthquakes are complex phenomena, and they can have a large number of effects on the natural system, as well as on engineered structures. Instrumental data close to the source of large earthquakes are rarely obtained. The rare events for which measurements are available may be used, with modifications, as analogs for potential large earthquakes at sites where no earthquake data are available. In the following, several examples of nuclear reactor and liquefied natural gas facility siting are discussed. A potential use of analog earthquakes is proposed for a high-level nuclear waste (HLW) repository.

  20. Transportations Systems Modeling and Applications in Earthquake Engineering

    DTIC Science & Technology

    2010-07-01

Indexed excerpts reference a PGA map of a M7.7 earthquake on all three New Madrid fault segments (Figure 6) and fragility parameters for an MSC steel bridge (Table 1; Padgett 2007). The NMSZ, near Memphis, Tennessee, was responsible for the devastating 1811-1812 New Madrid earthquakes, among the largest earthquakes ever recorded in the region.

  1. Earthquakes, Cities, and Lifelines: lessons integrating tectonics, society, and engineering in middle school Earth Science

    NASA Astrophysics Data System (ADS)

    Toke, N.; Johnson, A.; Nelson, K.

    2010-12-01

Earthquakes are one of the most widely covered geologic processes in the media. As a result, students, even at the middle school level, arrive in the classroom with preconceptions about the importance and hazards posed by earthquakes. Earthquakes therefore represent not only an attractive topic for engaging students when introducing tectonics, but also a means to help students understand the relationships between geologic processes, society, and engineering solutions. Facilitating understanding of the fundamental connections between science and society is important for the preparation of future scientists and engineers as well as informed citizens. Here, we present a week-long lesson designed to be implemented in five one-hour sessions with classes of ~30 students. It consists of two inquiry-based mapping investigations, motivational presentations, and short readings that describe fundamental models of plate tectonics, faults, and earthquakes. The readings also provide examples of engineering solutions, such as the Alaskan oil pipeline, which withstood multi-meter surface offset in the 2002 Denali Earthquake. The first inquiry-based investigation is a lesson on tectonic plates. Working in small groups, each group receives a different world map plotting topography together with one of the following data sets: GPS plate motion vectors, the locations and types of volcanoes, or the locations and types of earthquakes. Using these maps and an accompanying explanation of the data, each group’s task is to map plate boundary locations. Each group then presents a ~10 minute summary of the type of data they used and their interpretation of the tectonic plates, with a poster and their mapping results. Finally, the instructor facilitates a class discussion about how the data types could be combined to understand more about plate boundaries. Using student interpretations of real data allows student misconceptions to become apparent. Throughout the exercise we record student preconceptions

  2. Engineering geological aspect of Gorkha Earthquake 2015, Nepal

    NASA Astrophysics Data System (ADS)

    Adhikari, Basanta Raj; Andermann, Christoff; Cook, Kristen

    2016-04-01

Strong shaking during earthquakes causes massive landsliding, with severe effects on infrastructure and human lives. The distribution of landslides and other hazards depends on the combination of earthquake and local characteristics that influence the dynamic response of hillslopes. The Himalayas are one of the most active mountain belts, with several kilometers of relief, and are very prone to catastrophic mass failure. Strong and shallow earthquakes are very common and cause widespread collapse of hillslopes, increasing the background landslide rate by several orders of magnitude. The Himalaya has experienced many small and large earthquakes in the past, e.g. the Bihar-Nepal earthquake of 1934 (Ms 8.2), the large Kangra earthquake of 1905 (Ms 7.8), and the Gorkha earthquake of 2015 (Mw 7.8). The Mw 7.9 Gorkha earthquake occurred on and around the Main Himalayan Thrust with a hypocentral depth of 15 km (GEER 2015) and was followed by a Mw 7.3 aftershock in Kodari, together causing 8700+ deaths and leaving hundreds of thousands homeless. Most of the 3000 aftershocks located by the National Seismological Center (NSC) within the first 45 days following the Gorkha earthquake are concentrated in a narrow 40 km-wide band at midcrustal to shallow depth along the strike of the southern slope of the high Himalaya (Adhikari et al. 2015), and the ground shaking was substantially lower in the short-period range than would be expected for an earthquake of this magnitude (Moss et al. 2015). The effects of this earthquake are distinctive in the affected areas, which show topographic effects, liquefaction, and land subsidence. More than 5000 landslides were triggered by this earthquake (Earthquake without Frontiers, 2015). Most of the landslides are shallow, occurred in weathered bedrock, and appear to have mobilized primarily as raveling failures, rock slides, and rock falls. The majority of landslides are limited to a zone that runs east-west, approximately parallel to the lesser and higher Himalaya. There are numerous cracks in

  3. Examining Science Teachers' Argumentation in a Teacher Workshop on Earthquake Engineering

    NASA Astrophysics Data System (ADS)

    Cavlazoglu, Baki; Stuessy, Carol

    2018-02-01

    The purpose of this study was to examine changes in the quality of science teachers' argumentation as a result of their engagement in a teacher workshop on earthquake engineering emphasizing distributed learning approaches, which included concept mapping, collaborative game playing, and group lesson planning. The participants were ten high school science teachers from US high schools who elected to attend the workshop. To begin and end the teacher workshop, teachers in small groups engaged in concept mapping exercises with other teachers. Researchers audio-recorded individual teachers' argumentative statements about the inclusion of earthquake engineering concepts in their concept maps, which were then analyzed to reveal the quality of teachers' argumentation. Toulmin's argumentation model formed the framework for designing a classification schema to analyze the quality of participants' argumentative statements. While the analysis of differences in pre- and post-workshop concept mapping exercises revealed that the number of argumentative statements did not change significantly, the quality of participants' argumentation did increase significantly. As these differences occurred concurrently with distributed learning approaches used throughout the workshop, these results provide evidence to support distributed learning approaches in professional development workshop activities to increase the quality of science teachers' argumentation. Additionally, these results support the use of concept mapping as a cognitive scaffold to organize participants' knowledge, facilitate the presentation of argumentation, and as a research tool for providing evidence of teachers' argumentation skills.

  4. Training Course in Geotechnical and Foundation Engineering. Geotechnical Earthquake Engineering: Reference Manual. Chapters 4, Ground Motion Characterization, and 8, Liquefaction and Seismic Settlement.

    DOT National Transportation Integrated Search

    1998-12-01

This manual was written to provide training on how to apply principles of geotechnical earthquake engineering to planning, design, and retrofit of highway facilities. Reproduced here are two chapters, 4 and 8, on ground motion characterization and on liquefaction and seismic settlement, respectively. These cha...

  5. Southern California Earthquake Center (SCEC) Communication, Education and Outreach Program

    NASA Astrophysics Data System (ADS)

    Benthien, M. L.

    2003-12-01

The SCEC Communication, Education, and Outreach Program (CEO) offers student research experiences, web-based education tools, classroom curricula, museum displays, public information brochures, online newsletters, and technical workshops and publications. This year, much progress has been made on the development of the Electronic Encyclopedia of Earthquakes (E3), a collaborative project with CUREE and IRIS. The E3 development system is now fully operational, and 165 entries are in the pipeline. When complete, information and resources for over 500 Earth science and engineering topics will be included, with connections to curricular materials useful for teaching Earth Science, engineering, physics and mathematics. To coordinate activities for the 10-year anniversary of the Northridge Earthquake in 2004 (and beyond), the "Earthquake Country Alliance" is being organized by SCEC CEO to present common messages, to share or promote existing resources, and to develop new activities and products jointly (such as a new version of Putting Down Roots in Earthquake Country). The group includes earthquake science and engineering researchers and practicing professionals, preparedness experts, response and recovery officials, news media representatives, and education specialists. A web portal, http://www.earthquakecountry.info, is being established with links to web pages and descriptions of other resources and services that the Alliance members provide. Another ongoing strength of SCEC is the Summer Intern program, which now has a year-round counterpart with students working on IT projects at USC. Since Fall 2002, over 32 students have participated in the program, including 7 students working with scientists throughout SCEC, 17 students involved in the USC "Earthquake Information Technology" intern program, and 7 students involved in CEO projects.
These and other activities of the SCEC CEO program will be presented, along with lessons learned during program design and

  6. Organizational changes at Earthquakes & Volcanoes

    USGS Publications Warehouse

    Gordon, David W.

    1992-01-01

Primary responsibility for the preparation of Earthquakes & Volcanoes within the Geological Survey has shifted from the Office of Scientific Publications to the Office of Earthquakes, Volcanoes, and Engineering (OEVE). As a consequence of this reorganization, Henry Spall has stepped down as Science Editor for Earthquakes & Volcanoes (E&V).

  7. POST Earthquake Debris Management - AN Overview

    NASA Astrophysics Data System (ADS)

    Sarkar, Raju

Every year natural disasters, such as fires, floods, earthquakes, hurricanes, landslides, tsunami, and tornadoes, challenge various communities of the world. Earthquakes strike with varying degrees of severity and pose both short- and long-term challenges to public service providers. Earthquakes generate shock waves and displace the ground along fault lines. These seismic forces can bring down buildings and bridges in a localized area and damage buildings and other structures in a far wider area. Secondary damage from fires, explosions, and localized flooding from broken water pipes can increase the amount of debris. Earthquake debris includes building materials, personal property, and sediment from landslides. The management of this debris, as well as the waste generated during the reconstruction works, can place significant challenges on national and local capacities. Debris removal is a major component of every post-earthquake recovery operation. Much of the debris generated by an earthquake is not hazardous. Soil, building material, and green waste, such as trees and shrubs, make up most of the volume of earthquake debris. These wastes not only create significant health problems and a very unpleasant living environment if not disposed of safely and appropriately, but can also impose economic burdens on the reconstruction phase. In practice, most of the debris may be either disposed of at landfill sites, reused as materials for construction, or recycled into useful commodities. Therefore, the debris clearance operation should focus on the geotechnical engineering approach as an important post-earthquake issue to control the quality of the incoming flow of potential soil materials. In this paper, the importance of an emergency management perspective in this geotechnical approach that takes into account the different criteria related to the operation execution is proposed by highlighting the key issues concerning the handling of the construction

  8. Chapter B. The Loma Prieta, California, Earthquake of October 17, 1989 - Highway Systems

    USGS Publications Warehouse

    Yashinsky, Mark

    1998-01-01

    This paper summarizes the impact of the Loma Prieta earthquake on highway systems. City streets, urban freeways, county roads, state routes, and the national highway system were all affected. There was damage to bridges, roads, tunnels, and other highway structures. The most serious damage occurred in the cities of San Francisco and Oakland, 60 miles from the fault rupture. The cost to repair and replace highways damaged by this earthquake was $2 billion. About half of this cost was to replace the Cypress Viaduct, a long, elevated double-deck expressway that had a devastating collapse which resulted in 42 deaths and 108 injuries. The earthquake also resulted in some positive changes for highway systems. Research on bridges and earthquakes began to be funded at a much higher level. Retrofit programs were started to upgrade the seismic performance of the nation's highways. The Loma Prieta earthquake changed earthquake policy and engineering practice for highway departments not only in California, but all over the world.

  9. Engineering-geological model of the landslide of Güevejar (S Spain) reactivated by historical earthquakes

    NASA Astrophysics Data System (ADS)

    Delgado, José; García-Tortosa, Francisco J.; Garrido, Jesús; Giner, José; Lenti, Luca; López-Casado, Carlos; Martino, Salvatore; Peláez, José A.; Sanz de Galdeano, Carlos; Soler, Juan L.

    2015-04-01

Landslides are a common ground effect induced by earthquakes of moderate to large magnitude. Most of them are first-time instabilities induced by the seismic event; the reactivation of pre-existing landslides is less frequent in practice. The landslide of Güevejar (Granada province, S Spain) is a case study of a landslide that was reactivated at least twice by far-field earthquakes: the Mw 8.7, 1755 Lisbon earthquake (estimated epicentral distance of 680 km) and the Mw 6.5, 1884 Andalucia event (estimated epicentral distance of 45 km), but not by near-field events of moderate magnitude (Mw < 6.0 and epicentral distances lower than 25 km). To study the seismic response of this landslide, a study has been conducted to elaborate an engineering-geological model. For this purpose, field work included the elaboration of a detailed geological map (1:1000) of the landslide and surrounding areas, drilling of deep boreholes (80 m deep), down-hole measurement of both P- and S-wave velocities in the boreholes drilled, piezometric control of the water table, MASW and ReMi profiles for determining the underlying structure of the sites tested (soil profile stratigraphy and the corresponding S-wave velocity of each soil level), and undisturbed sampling of the materials affected by the landslide. These samples were then tested in the laboratory according to standard procedures for determination of both static properties (among which soil density, soil classification and shear strength) and dynamic properties (degradation curves for shear modulus and damping ratio with shear strain) of the landslide-involved materials. The model proposed corresponds to a complex landslide that combines a rototranslational mechanism with an earth-flow at its toe and is characterized by a deep (> 50 m) sliding surface. The engineering-geological model constitutes the first step in ongoing research devoted to understanding how the landslide could be reactivated during far-field events. The

  10. Turkish Compulsory Earthquake Insurance (TCIP)

    NASA Astrophysics Data System (ADS)

    Erdik, M.; Durukal, E.; Sesetyan, K.

    2009-04-01

Through a World Bank project, the government-sponsored Turkish Catastrophe Insurance Pool (TCIP) was created in 2000 with the essential aim of transferring the government's financial burden of replacing earthquake-damaged housing to international reinsurance and capital markets. Providing coverage to about 2.9 million homeowners, TCIP is the largest insurance program in the country, with about 0.5 billion USD in its own reserves and about 2.3 billion USD in total claims-paying capacity. The total payment for earthquake damage since 2000 (226 mostly small earthquakes) amounts to about 13 million USD. The country-wide penetration rate is about 22%, highest in the Marmara region (30%) and lowest in south-east Turkey (9%). TCIP is the sole-source provider of earthquake loss coverage up to 90,000 USD per house. The annual premium, categorized on the basis of earthquake zone and type of structure, is about US$90 for a 100 square meter reinforced concrete building in the most hazardous zone, with a 2% deductible. The earthquake engineering related shortcomings of the TCIP are exemplified by the fact that the average rate of 0.13% (for reinforced concrete buildings) with only a 2% deductible is rather low compared to countries with similar earthquake exposure. From an earthquake engineering point of view, the risk underwriting of the TCIP (typification of the housing units to be insured, earthquake intensity zonation, and the sum insured) needs to be overhauled. Especially for large cities, models can be developed where the expected earthquake performance of a unit (and consequently its insurance premium) can be assessed on the basis of its location (microzoned earthquake hazard) and basic structural attributes (earthquake vulnerability relationships). With such an approach, the TCIP can in the future contribute to the control of construction through differentiation of premia on the basis of earthquake vulnerability.

  11. Chapter D. The Loma Prieta, California, Earthquake of October 17, 1989 - Earth Structures and Engineering Characterization of Ground Motion

    USGS Publications Warehouse

    Holzer, Thomas L.

    1998-01-01

    This chapter contains two papers that summarize the performance of engineered earth structures, dams and stabilized excavations in soil, and two papers that characterize for engineering purposes the attenuation of ground motion with distance during the Loma Prieta earthquake. Documenting the field performance of engineered structures and confirming empirically based predictions of ground motion are critical for safe and cost effective seismic design of future structures as well as the retrofitting of existing ones.

  12. A refined Frequency Domain Decomposition tool for structural modal monitoring in earthquake engineering

    NASA Astrophysics Data System (ADS)

    Pioldi, Fabio; Rizzi, Egidio

    2017-07-01

Output-only structural identification is developed by a refined Frequency Domain Decomposition (rFDD) approach, towards assessing the current modal properties of heavily-damped buildings (a challenging identification scenario) under strong ground motions. Structural responses from earthquake excitations are taken as input signals for the identification algorithm. A new dedicated computational procedure, based on coupled Chebyshev Type II bandpass filters, is outlined for the effective estimation of natural frequencies, mode shapes and modal damping ratios. The identification technique is also coupled with a Gabor Wavelet Transform, resulting in an effective and self-contained time-frequency analysis framework. Simulated response signals generated by shear-type frames (with variable structural features) are used as a necessary validation condition. In this context, use is made of a complete set of seismic records taken from the FEMA P695 database, i.e. all 44 "Far-Field" (22 NS, 22 WE) earthquake signals. The modal estimates are statistically compared to their target values, proving the accuracy of the developed algorithm in providing prompt and accurate estimates of the current modal parameters under strong ground motion. At this stage, the analysis tool may be conveniently employed in the realm of Earthquake Engineering, towards potential Structural Health Monitoring and damage detection purposes.
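The output-only, frequency-domain idea above can be illustrated with a toy, dependency-free sketch: estimate a structure's natural frequency by picking the dominant spectral peak of its measured response. This is only a peak-picking stand-in for the paper's refined FDD (which uses Chebyshev Type II band-pass filters and singular value decomposition of spectral matrices, not reproduced here); all signal parameters below are illustrative assumptions.

```python
import math, cmath

# Toy output-only identification: recover the natural frequency of a
# damped SDOF oscillator from its response signal by spectral peak-picking.
# This is NOT the paper's rFDD algorithm, only a minimal illustration of
# the frequency-domain identification idea; all parameters are assumed.
fs, n = 100.0, 1024                    # sampling rate (Hz), number of samples
f_true, zeta = 2.5, 0.05               # assumed natural frequency (Hz), damping ratio
wd = 2 * math.pi * f_true * math.sqrt(1 - zeta**2)   # damped circular frequency
x = [math.exp(-zeta * 2 * math.pi * f_true * t / fs) * math.sin(wd * t / fs)
     for t in range(n)]                # simulated free-vibration response

def dft_power(sig, k):
    """|X_k|^2 by direct DFT (slow but dependency-free)."""
    return abs(sum(s * cmath.exp(-2j * math.pi * k * i / len(sig))
                   for i, s in enumerate(sig)))**2

k_peak = max(range(1, n // 2), key=lambda k: dft_power(x, k))
f_est = k_peak * fs / n
print(f"estimated natural frequency: {f_est:.2f} Hz")  # close to the assumed 2.5 Hz
```

The estimate is limited by the spectral resolution fs/n (about 0.1 Hz here); the rFDD procedure in the paper refines such raw estimates considerably, especially for high damping.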

  13. Fostering Creative Engineers: A Key to Face the Complexity of Engineering Practice

    ERIC Educational Resources Information Center

    Zhou, Chunfang

    2012-01-01

    Recent studies have argued a shift of thinking about engineering practice from a linear conception to a system understanding. The complexity of engineering practice has been thought of as the root of challenges for engineers. Moreover, creativity has been emphasised as one key capability that engineering students should master. This paper aims to…

  14. POST Earthquake Debris Management — AN Overview

    NASA Astrophysics Data System (ADS)

    Sarkar, Raju

Every year natural disasters, such as fires, floods, earthquakes, hurricanes, landslides, tsunamis, and tornadoes, challenge various communities of the world. Earthquakes strike with varying degrees of severity and pose both short- and long-term challenges to public service providers. Earthquakes generate shock waves and displace the ground along fault lines. These seismic forces can bring down buildings and bridges in a localized area and damage buildings and other structures over a far wider area. Secondary damage from fires, explosions, and localized flooding from broken water pipes can increase the amount of debris. Earthquake debris includes building materials, personal property, and sediment from landslides. The management of this debris, as well as of the waste generated during reconstruction, can pose significant challenges to national and local capacities. Debris removal is a major component of every post-earthquake recovery operation. Much of the debris generated by an earthquake is not hazardous: soil, building material, and green waste, such as trees and shrubs, make up most of its volume. These wastes not only create significant health problems and a very unpleasant living environment if not disposed of safely and appropriately, but can also impose economic burdens on the reconstruction phase. In practice, most of the debris may be disposed of at landfill sites, reused as construction material, or recycled into useful commodities. The debris clearance operation should therefore focus on the geotechnical engineering approach, as an important post-earthquake issue, to control the quality of the incoming flow of potential soil materials. In this paper, the importance of an emergency management perspective in this geotechnical approach, taking into account the different criteria related to the operation's execution, is proposed by highlighting the key issues concerning the handling of the construction

  15. Chapter C. The Loma Prieta, California, Earthquake of October 17, 1989 - Building Structures

    USGS Publications Warehouse

    Çelebi, Mehmet

    1998-01-01

    Several approaches are used to assess the performance of the built environment following an earthquake -- preliminary damage surveys conducted by professionals, detailed studies of individual structures, and statistical analyses of groups of structures. Reports of damage that are issued by many organizations immediately following an earthquake play a key role in directing subsequent detailed investigations. Detailed studies of individual structures and statistical analyses of groups of structures may be motivated by particularly good or bad performance during an earthquake. Beyond this, practicing engineers typically perform stress analyses to assess the performance of a particular structure to vibrational levels experienced during an earthquake. The levels may be determined from recorded or estimated ground motions; actual levels usually differ from design levels. If a structure has seismic instrumentation to record response data, the estimated and recorded response and behavior of the structure can be compared.

  16. Living on an Active Earth: Perspectives on Earthquake Science

    NASA Astrophysics Data System (ADS)

    Lay, Thorne

    2004-02-01

    The annualized long-term loss due to earthquakes in the United States is now estimated at $4.4 billion per year. A repeat of the 1923 Kanto earthquake, near Tokyo, could cause direct losses of $2-3 trillion. With such grim numbers, which are guaranteed to make you take its work seriously, the NRC Committee on the Science of Earthquakes begins its overview of the emerging multidisciplinary field of earthquake science. An up-to-date and forward-looking survey of scientific investigation of earthquake phenomena and engineering response to associated hazards is presented at a suitable level for a general educated audience. Perspectives from the fields of seismology, geodesy, neo-tectonics, paleo-seismology, rock mechanics, earthquake engineering, and computer modeling of complex dynamic systems are integrated into a balanced definition of earthquake science that has never before been adequately articulated.

  17. Lessons Learned about Best Practices for Communicating Earthquake Forecasting and Early Warning to Non-Scientific Publics

    NASA Astrophysics Data System (ADS)

    Sellnow, D. D.; Sellnow, T. L.

    2017-12-01

Earthquake scientists are without doubt experts in understanding earthquake probabilities, magnitudes, and intensities, as well as their potential consequences for community infrastructure and inhabitants. One critical challenge these scientific experts face, however, rests with communicating what they know to the people they want to help. Helping scientists translate scientific information for non-scientists is something Drs. Tim and Deanna Sellnow have been committed to for decades. As such, they have compiled a host of data-driven best practices for communicating effectively with non-scientific publics about earthquake forecasting, probabilities, and warnings. In this session, they will summarize what they have learned as it may help earthquake scientists, emergency managers, and other key spokespersons share these important messages with disparate publics in ways that result in positive outcomes, the most important of which is saving lives.

  18. The 2008 Wenchuan Earthquake and the Rise and Fall of Earthquake Prediction in China

    NASA Astrophysics Data System (ADS)

    Chen, Q.; Wang, K.

    2009-12-01

    Regardless of the future potential of earthquake prediction, it is presently impractical to rely on it to mitigate earthquake disasters. The practical approach is to strengthen the resilience of our built environment to earthquakes based on hazard assessment. But this was not common understanding in China when the M 7.9 Wenchuan earthquake struck the Sichuan Province on 12 May 2008, claiming over 80,000 lives. In China, earthquake prediction is a government-sanctioned and law-regulated measure of disaster prevention. A sudden boom of the earthquake prediction program in 1966-1976 coincided with a succession of nine M > 7 damaging earthquakes in the densely populated region of the country and the political chaos of the Cultural Revolution. It climaxed with the prediction of the 1975 Haicheng earthquake, which was due mainly to an unusually pronounced foreshock sequence and the extraordinary readiness of some local officials to issue imminent warning and evacuation order. The Haicheng prediction was a success in practice and yielded useful lessons, but the experience cannot be applied to most other earthquakes and cultural environments. Since the disastrous Tangshan earthquake in 1976 that killed over 240,000 people, there have been two opposite trends in China: decreasing confidence in prediction and increasing emphasis on regulating construction design for earthquake resilience. In 1976, most of the seismic intensity XI areas of Tangshan were literally razed to the ground, but in 2008, many buildings in the intensity XI areas of Wenchuan did not collapse. Prediction did not save life in either of these events; the difference was made by construction standards. For regular buildings, there was no seismic design in Tangshan to resist any earthquake shaking in 1976, but limited seismic design was required for the Wenchuan area in 2008. Although the construction standards were later recognized to be too low, those buildings that met the standards suffered much less

  19. The Road to Total Earthquake Safety

    NASA Astrophysics Data System (ADS)

    Frohlich, Cliff

Cinna Lomnitz is possibly the most distinguished earthquake seismologist in all of Central and South America. Among many other credentials, Lomnitz has personally experienced the shaking and devastation that accompanied no fewer than five major earthquakes—Chile, 1939; Kern County, California, 1952; Chile, 1960; Caracas, Venezuela, 1967; and Mexico City, 1985. Thus he clearly has much to teach someone like myself, who has never even actually felt a real earthquake. What is this slim book? The Road to Total Earthquake Safety summarizes Lomnitz's May 1999 presentation at the Seventh Mallet-Milne Lecture, sponsored by the Society for Earthquake and Civil Engineering Dynamics. His arguments are motivated by the damage that occurred in three earthquakes—Mexico City, 1985; Loma Prieta, California, 1989; and Kobe, Japan, 1995. All three quakes occurred in regions where earthquakes are common. Yet in all three some of the worst damage occurred in structures located a significant distance from the epicenter and engineered specifically to resist earthquakes. Some of the damage also indicated that the structures failed because they had experienced considerable rotational or twisting motion. Clearly, Lomnitz argues, there must be fundamental flaws in the usually accepted models explaining how earthquakes generate strong motions, and how we should design resistant structures.

  20. Retrospective Perceptions and Views of Engineering Students about Physics and Engineering Practicals

    ERIC Educational Resources Information Center

    Bhathal, R.

    2011-01-01

    Hands-on practical work in physics and engineering has a long and well-established tradition in Australian universities. Recently, however, the question of whether hands-on physics and engineering practicals are useful for engineering students and whether they could be deleted or whether these could be replaced with computer simulations has been…

  1. Studying Science and Engineering Learning in Practice

    ERIC Educational Resources Information Center

    Penuel, William R.

    2016-01-01

    A key goal of science and engineering education is to provide opportunities for people to access, interpret, and make use of science and engineering to address practical human needs. Most education research, however, focuses on how best to prepare students in schools to participate in forms of science and engineering practices that resemble those…

  2. Reflections from the interface between seismological research and earthquake risk reduction

    NASA Astrophysics Data System (ADS)

    Sargeant, S.

    2012-04-01

Scientific understanding of earthquakes and their attendant hazards is vital for the development of effective earthquake risk reduction strategies. Within the global disaster reduction policy framework (the Hyogo Framework for Action, overseen by the UN International Strategy for Disaster Reduction), the anticipated role of science and scientists is clear, with respect to risk assessment, loss estimation, space-based observation, early warning and forecasting. The importance of information sharing and cooperation, cross-disciplinary networks and developing technical and institutional capacity for effective disaster management is also highlighted. In practice, the degree to which seismological information is successfully delivered to and applied by individuals, groups or organisations working to manage or reduce the risk from earthquakes is variable. The challenge for scientists is to provide fit-for-purpose information that can be integrated simply into decision-making and risk reduction activities at all levels of governance and at different geographic scales, often by a non-technical audience (i.e. people without any seismological/earthquake engineering training). The interface between seismological research and earthquake risk reduction (defined here in terms of both the relationship between the science and its application, and the scientist and other risk stakeholders) is complex. This complexity is a function of a range of issues relating to communication, multidisciplinary working, politics, organisational practices, inter-organisational collaboration, working practices, sectoral cultures, individual and organisational values, worldviews and expectations. These factors can present significant obstacles to scientific information being incorporated into the decision-making process. The purpose of this paper is to present some personal reflections on the nature of the interface between the worlds of seismological research and risk reduction, and the

  3. Epistemic Practices of Engineering for Education

    ERIC Educational Resources Information Center

    Cunningham, Christine M.; Kelly, Gregory J.

    2017-01-01

    Engineering offers new educational opportunities for students, yet also poses challenges about how to conceptualize the disciplinary core ideas, crosscutting concepts, and science and engineering practices of the disciplinary fields of engineering. In this paper, we draw from empirical studies of engineering in professional and school settings to…

  4. Post-Earthquake Assessment of Nevada Bridges Using ShakeMap/ShakeCast

    DOT National Transportation Integrated Search

    2016-01-01

    Post-earthquake capacity of Nevada highway bridges is examined through a combination of engineering study and scenario earthquake evaluation. The study was undertaken by the University of Nevada Reno Department of Civil and Environmental Engineering ...

  5. Future of Earthquake Early Warning: Quantifying Uncertainty and Making Fast Automated Decisions for Applications

    NASA Astrophysics Data System (ADS)

    Wu, Stephen

Earthquake early warning (EEW) systems have been rapidly developing over the past decade. The Japan Meteorological Agency (JMA) has an EEW system that was operating during the 2011 M9 Tohoku earthquake in Japan, and this increased the awareness of EEW systems around the world. While longer-time earthquake prediction still faces many challenges to becoming practical, the availability of shorter-time EEW opens a new door for earthquake loss mitigation. After an earthquake fault begins rupturing, an EEW system utilizes the first few seconds of recorded seismic waveform data to quickly predict the hypocenter location, magnitude, origin time and the expected shaking intensity level around the region. This early warning information is broadcast to different sites before the strong shaking arrives. The warning lead time of such a system is short, typically a few seconds to a minute or so, and the information is uncertain. These factors limit human intervention to activate mitigation actions, and this must be addressed for engineering applications of EEW. This study applies a Bayesian probabilistic approach, along with machine learning techniques and decision theories from economics, to improve different aspects of EEW operation, including extending it to engineering applications. Existing EEW systems are often based on a deterministic approach and typically assume that only a single event occurs within a short period of time, an assumption that led to many false alarms after the Tohoku earthquake in Japan. This study develops a probability-based EEW algorithm based on an existing deterministic model to extend the EEW system to the case of concurrent events, which are often observed during the aftershock sequence after a large earthquake. To overcome the challenge of uncertain information and the short lead time of EEW, this study also develops an earthquake probability-based automated decision-making (ePAD) framework to make robust decisions for EEW mitigation applications.
A cost-benefit model that
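The automated decision-making idea described above can be sketched as a simple expected-cost comparison: trigger a mitigation action when the expected cost of acting is lower than the expected cost of waiting. The function, parameter names, and numbers below are illustrative assumptions in the spirit of the ePAD framework, not the study's actual models.

```python
# Hedged sketch of an expected-cost decision rule for EEW-triggered
# mitigation, in the spirit of the ePAD framework described above.
# The actual ePAD cost-benefit models are not reproduced; all names and
# numbers here are illustrative ASSUMPTIONS.
def should_mitigate(p_strong_shaking, loss_if_unmitigated,
                    loss_if_mitigated, action_cost):
    """Act when the expected cost of acting beats the cost of waiting."""
    expected_cost_act = action_cost + p_strong_shaking * loss_if_mitigated
    expected_cost_wait = p_strong_shaking * loss_if_unmitigated
    return expected_cost_act < expected_cost_wait

# Example: halting a production line is cheap relative to the avoidable loss,
# so even a 30% shaking probability justifies acting.
print(should_mitigate(p_strong_shaking=0.3,
                      loss_if_unmitigated=1_000_000,
                      loss_if_mitigated=100_000,
                      action_cost=20_000))   # True
```

Because EEW magnitude and location estimates are uncertain, `p_strong_shaking` would itself come from a probabilistic (e.g. Bayesian) shaking forecast rather than a single deterministic prediction.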

  6. The 1906 earthquake and a century of progress in understanding earthquakes and their hazards

    USGS Publications Warehouse

    Zoback, M.L.

    2006-01-01

The 18 April 1906 San Francisco earthquake killed nearly 3000 people and left 225,000 residents homeless. Three days after the earthquake, an eight-person Earthquake Investigation Commission composed of 25 geologists, seismologists, geodesists, biologists and engineers, as well as some 300 others, started work under the supervision of Andrew Lawson to collect and document physical phenomena related to the quake. On 31 May 1906, the commission published a preliminary 17-page report titled "The Report of the State Earthquake Investigation Commission". The report included the bulk of the geological and morphological descriptions of the faulting, detailed reports on shaking intensity, as well as an impressive atlas of 40 oversized maps and folios. Nearly 100 years after its publication, the Commission Report remains a model for post-earthquake investigations. Because the diverse data sets were so complete and carefully documented, researchers continue to apply modern analysis techniques to learn from the 1906 earthquake. While the earthquake marked a seminal event in the history of California, it served as impetus for the birth of modern earthquake science in the United States.

  7. Engineers' professional learning: a practice-theory perspective

    NASA Astrophysics Data System (ADS)

    Reich, Ann; Rooney, Donna; Gardner, Anne; Willey, Keith; Boud, David; Fitzgerald, Terry

    2015-07-01

With the increasing challenges facing professional engineers working in more complex, global and interdisciplinary contexts, different approaches to understanding how engineers practice and learn are necessary. This paper draws on recent research in the social sciences from the field of workplace learning to suggest that a practice-theory perspective on engineers' professional learning is fruitful. It shifts the focus from the attributes of the individual learner (knowledge, skills and attitudes) to the attributes of the practice (interactions, materiality, opportunities and challenges). Learning is thus not merely the technical acquisition and transfer of knowledge, but a complex bundle of activities that are social, material, embodied and emergent. The paper is illustrated with examples from a research study of the learning of experienced engineers in the construction industry, demonstrating common practices - site walks and design review meetings - in which learning takes place.

  8. Post-earthquake building safety inspection: Lessons from the Canterbury, New Zealand, earthquakes

    USGS Publications Warehouse

    Marshall, J.; Jaiswal, Kishor; Gould, N.; Turner, F.; Lizundia, B.; Barnes, J.

    2013-01-01

    The authors discuss some of the unique aspects and lessons of the New Zealand post-earthquake building safety inspection program that was implemented following the Canterbury earthquake sequence of 2010–2011. The post-event safety assessment program was one of the largest and longest programs undertaken in recent times anywhere in the world. The effort engaged hundreds of engineering professionals throughout the country, and also sought expertise from outside, to perform post-earthquake structural safety inspections of more than 100,000 buildings in the city of Christchurch and the surrounding suburbs. While the building safety inspection procedure implemented was analogous to the ATC 20 program in the United States, many modifications were proposed and implemented in order to assess the large number of buildings that were subjected to strong and variable shaking during a period of two years. This note discusses some of the key aspects of the post-earthquake building safety inspection program and summarizes important lessons that can improve future earthquake response.

  9. 77 FR 64314 - Advisory Committee on Earthquake Hazards Reduction Meeting

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-19

    ... is to discuss engineering needs for existing buildings, to review the National Earthquake Hazards... Committee business. The final agenda will be posted on the NEHRP Web site at http://nehrp.gov/ . DATES: The... assesses: Trends and developments in the science and engineering of earthquake hazards reduction; The...

  10. Education for Professional Engineering Practice

    ERIC Educational Resources Information Center

    Bramhall, Mike D.; Short, Chris

    2014-01-01

    This paper reports on a funded collaborative large-scale curriculum innovation and enhancement project undertaken as part of a UK National Higher Education Science, Technology Engineering and Mathematics (STEM) programme. Its aim was to develop undergraduate curricula to teach appropriate skills for professional engineering practice more…

  11. Damaging earthquakes: A scientific laboratory

    USGS Publications Warehouse

    Hays, Walter W.; ,

    1996-01-01

    This paper reviews the principal lessons learned from multidisciplinary postearthquake investigations of damaging earthquakes throughout the world during the past 15 years. The unique laboratory provided by a damaging earthquake in culturally different but tectonically similar regions of the world has increased fundamental understanding of earthquake processes, added perishable scientific, technical, and socioeconomic data to the knowledge base, and led to changes in public policies and professional practices for earthquake loss reduction.

  12. Teaching through 10,000 Earthquakes: Constructive Practice for Instructors in a Post-Disaster Environment

    ERIC Educational Resources Information Center

    Wright, Sarah; Wordsworth, Russell

    2013-01-01

    The authors describe their experiences of teaching through a series of major earthquakes and the lessons learned regarding sustaining teaching and learning through an ongoing natural disaster. Student feedback data from across the university is analyzed to generate a model of constructive practice for instructors responding to a crisis. The…

  13. Engineers' Professional Learning: A Practice-Theory Perspective

    ERIC Educational Resources Information Center

    Reich, Ann; Rooney, Donna; Gardner, Anne; Willey, Keith; Boud, David; Fitzgerald, Terry

    2015-01-01

    With the increasing challenges facing professional engineers working in more complex, global and interdisciplinary contexts, different approaches to understanding how engineers practice and learn are necessary. This paper draws on recent research in the social sciences from the field of workplace learning, to suggest that a practice-theory…

  14. UNLV’s environmentally friendly Science and Engineering Building is monitored for earthquake shaking

    USGS Publications Warehouse

    Kalkan, Erol; Savage, Woody; Reza, Shahneam; Knight, Eric; Tian, Ying

    2013-01-01

    The University of Nevada Las Vegas’ (UNLV) Science and Engineering Building is at the cutting edge of environmentally friendly design. As the result of a recent effort by the U.S. Geological Survey’s National Strong Motion Project in cooperation with UNLV, the building is now also in the forefront of buildings installed with structural monitoring systems to measure response during earthquakes. This is particularly important because this is the first such building in Las Vegas. The seismic instrumentation will provide essential data to better understand the structural performance of buildings, especially in this seismically active region.

  15. Special Issue "Impact of Natural Hazards on Urban Areas and Infrastructure" in the Bulletin of Earthquake Engineering

    NASA Astrophysics Data System (ADS)

    Bostenaru Dan, M.

    2009-04-01

mitigation will be presented. The session includes contributions showing methodological and modelling approaches from scientists in geophysics/seismology, hydrology, remote sensing, civil engineering, insurance, and urbanism, amongst other fields, as well as presentations from practitioners working on specific case studies, covering both the analysis of recent events and their impact on cities and the re-evaluation of past events from the point of view of long-term recovery. The 2005 call stated: most strategies for both preparedness and emergency management in disaster mitigation are related to urban planning. While the natural, engineering and social sciences contribute to evaluating the impact of earthquakes and their secondary events (including tsunamis, earthquake-triggered landslides, or fire), floods, landslides, high winds, and volcanic eruptions on urban areas, it is the instruments of urban planning that are to be employed both for visualisation and for the development and implementation of strategy concepts for pre- and post-disaster intervention. The evolution of natural systems towards extreme conditions is taken into consideration insofar as it concerns the damaging impact on urban areas and infrastructure, and the impact on the natural environment of interventions intended to reduce that damage.

  16. Research on response spectrum of dam based on scenario earthquake

    NASA Astrophysics Data System (ADS)

    Zhang, Xiaoliang; Zhang, Yushan

    2017-10-01

Taking a large hydropower station as an example, the response spectrum based on a scenario earthquake is determined. Firstly, the potential seismic source zone contributing most to the site hazard is identified from the results of probabilistic seismic hazard analysis (PSHA). Secondly, the magnitude and epicentral distance of the scenario earthquake are calculated according to the main faults and historical earthquakes of that potential seismic source zone. Finally, the response spectrum of the scenario earthquake is calculated using the Next Generation Attenuation (NGA) relations. The scenario-earthquake response spectrum is lower than the probability-consistent response spectrum obtained by the PSHA method. The analysis shows that the scenario-earthquake response spectrum accounts for both the probability level and the structural factors, combining the advantages of the deterministic and probabilistic seismic hazard analysis methods; it is readily accepted in practice and provides a basis for the seismic design of hydraulic engineering structures.
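The final step of the workflow above, evaluating an attenuation relation for a scenario magnitude and distance, can be sketched with a generic functional form. The coefficients and the form itself are illustrative assumptions only; actual NGA models are far more elaborate (site terms, fault mechanism, period-dependent coefficients).

```python
import math

# Hedged sketch of evaluating a ground-motion attenuation relation for a
# scenario earthquake (magnitude M, distance R), as in the workflow above.
# The functional form and coefficients are ILLUSTRATIVE assumptions, not
# any actual NGA model.
def scenario_sa(magnitude, distance_km, a=-3.0, b=0.9, c=1.2, h=10.0):
    """ln SA = a + b*M - c*ln(sqrt(R^2 + h^2)); SA in g (toy units)."""
    return math.exp(a + b * magnitude - c * math.log(math.hypot(distance_km, h)))

# Scenario event, e.g. M 6.5 at 20 km epicentral distance:
print(f"SA ~ {scenario_sa(6.5, 20.0):.3f} g")
```

Whatever the model, spectral acceleration should increase with magnitude and decay with distance, which is a useful sanity check on any implementation.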

  17. National Earthquake Hazards Reduction Program; time to expand

    USGS Publications Warehouse

    Steinbrugge, K.V.

    1990-01-01

    All of us in earthquake engineering, seismology, and many related disciplines have been directly or indirectly affected by the National Earthquake Hazards Reduction Program (NEHRP). This program was the result of the Earthquake Hazards Reduction Act of 1977 (Public Law 95-124). With well over a decade of experience, should this expression of public policy now take a different or expanded role? 

  18. Initiatives to Reduce Earthquake Risk of Developing Countries

    NASA Astrophysics Data System (ADS)

    Tucker, B. E.

    2008-12-01

    The seventeen-year-and-counting history of the Palo Alto-based nonprofit organization GeoHazards International (GHI) is the story of many initiatives within a larger initiative to increase the societal impact of geophysics and civil engineering. GHI's mission is to reduce death and suffering due to earthquakes and other natural hazards in the world's most vulnerable communities through preparedness, mitigation and advocacy. GHI works by raising awareness in these communities about their risk and about affordable methods to manage it, identifying and strengthening institutions in these communities to manage their risk, and advocating improvement in natural disaster management. Some of GHI's successful initiatives include: (1) creating an earthquake scenario for Quito, Ecuador that describes in lay terms the consequences for that city of a probable earthquake; (2) improving the curricula of Pakistani university courses about seismic retrofitting; (3) training employees of the Public Works Department of Delhi, India on assessing the seismic vulnerability of critical facilities such as a school, a hospital, a police headquarters, and city hall; (4) assessing the vulnerability of the Library of Tibetan Works and Archives in Dharamsala, India; (5) developing a seismic hazard reduction plan for a nonprofit organization in Kathmandu, Nepal that works to manage Nepal's seismic risk; and (6) assisting in the formulation of a resolution by the Council of the Organization for Economic Cooperation and Development (OECD) to promote school earthquake safety among OECD member countries. GHI's most important resource, in addition to its staff and Board of Trustees, is its members and volunteer advisors, who include some of the world's leading earth scientists, earthquake engineers, urban planners and architects, from the academic, public, private and nonprofit sectors. GHI is planning several exciting initiatives in the near future. 
One would oversee the design and construction of

  19. Earthquakes in Arkansas and vicinity 1699-2010

    USGS Publications Warehouse

    Dart, Richard L.; Ausbrooks, Scott M.

    2011-01-01

    This map summarizes approximately 300 years of earthquake activity in Arkansas. It is one in a series of similar State earthquake history maps. Work on the Arkansas map was done in collaboration with the Arkansas Geological Survey. The earthquake data plotted on the map are from several sources: the Arkansas Geological Survey, the Center for Earthquake Research and Information, the National Center for Earthquake Engineering Research, and the Mississippi Department of Environmental Quality. In addition to earthquake locations, other materials presented include seismic hazard and isoseismal maps and related text. Earthquakes are a legitimate concern in Arkansas and parts of adjacent states. Arkansas has undergone a number of significant felt earthquakes since 1811. At least two of these events caused property damage: a magnitude 4.7 earthquake in 1931, and a magnitude 4.3 earthquake in 1967. The map shows all historical and instrumentally located earthquakes in Arkansas and vicinity between 1811 and 2010. The largest historic earthquake in the vicinity of the State was an intensity XI event, on December 16, 1811; the first earthquake in the New Madrid sequence. This violent event and the earthquakes that followed caused considerable damage to the then sparsely settled region.

  20. P-12 Engineering Education Research and Practice

    ERIC Educational Resources Information Center

    Moore, Tamara; Richards, Larry G.

    2012-01-01

    This special issue of "Advances in Engineering Education" explores recent developments in P-12 Engineering Education. It includes papers devoted to research and practice, and reports some of the most exciting work in the field today. In our Call for Papers, we solicited two types of papers: Research papers and Practice papers. The former…

  1. Earthquakes in Mississippi and vicinity 1811-2010

    USGS Publications Warehouse

    Dart, Richard L.; Bograd, Michael B.E.

    2011-01-01

    This map summarizes two centuries of earthquake activity in Mississippi. Work on the Mississippi map was done in collaboration with the Mississippi Department of Environmental Quality, Office of Geology. The earthquake data plotted on the map are from several sources: the Mississippi Department of Environmental Quality, the Center for Earthquake Research and Information, the National Center for Earthquake Engineering Research, and the Arkansas Geological Survey. In addition to earthquake locations, other materials include seismic hazard and isoseismal maps and related text. Earthquakes are a legitimate concern in Mississippi and parts of adjacent States. Mississippi has undergone a number of felt earthquakes since 1811. At least two of these events caused property damage: a magnitude 4.7 earthquake in 1931, and a magnitude 4.3 earthquake in 1967. The map shows all historical and instrumentally located earthquakes in Mississippi and vicinity between 1811 and 2010. The largest historic earthquake in the vicinity of the State was an intensity XI event, on December 16, 1811; the first earthquake in the New Madrid sequence. This violent event and the earthquakes that followed caused considerable damage to the then sparsely settled region.

  2. A before and after study on personality assessment in adolescents exposed to the 2009 earthquake in L'Aquila, Italy: influence of sports practice

    PubMed Central

    Vinciguerra, Maria Giulia; Masedu, Francesco; Tiberti, Sergio; Sconci, Vittorio

    2012-01-01

    Objective To assess and estimate the personality changes that occurred before and after the 2009 earthquake in L'Aquila and to model the ways that the earthquake affected adolescents according to gender and sports practice. The consequences of earthquakes on psychological health are long lasting for portions of the population, depending on age, gender, social conditions and individual experiences. Sports activities are considered a factor with which to test the overall earthquake impact on individual and social psychological changes in adolescents. Design Before and after design. Setting Population-based study conducted in L'Aquila, Italy, before and after the 2009 earthquake. Participants Before the earthquake, a random sample of 179 adolescent subjects who either practised or did not practise sports (71 vs 108, respectively). After the earthquake, 149 of the original 179 subjects were assessed a second time. Primary outcome measure Minnesota Multiphasic Personality Inventory—Adolescents (MMPI-A) questionnaire scores, in a supervised environment. Results An analysis based on an unbalanced split-plot design was carried out at a 0.05 significance level, using a linear mixed model with quake, sex and sports practice as predictive factors. Although the overall scores indicated no deviant behaviours in the adolescents tested, changes were detected in many individual content scale scores, including depression (A-dep score mean ± SEM: before quake = 47.54±0.73; after quake = 52.67±0.86) and social discomfort (A-sod score mean ± SEM: before quake = 49.91±0.65; after quake = 51.72±0.81). The MMPI-A profiles show different impacts of the earthquake on adolescents according to gender and sports practice. Conclusions The differences detected in MMPI-A scores raise issues about the social policies required to address psychological changes in adolescents. The current study supports the idea that sport should be considered part of a coping strategy to assist adolescents in dealing with the

  3. Historical earthquake research in Austria

    NASA Astrophysics Data System (ADS)

    Hammerl, Christa

    2017-12-01

    Austria has moderate seismicity: on average the population feels 40 earthquakes per year, or approximately three per month. A severe earthquake causing light building damage is expected roughly every 2 to 3 years in Austria. Severe damage to buildings (I0 > 8° EMS) occurs significantly less frequently; the average recurrence period is about 75 years. For this reason, historical earthquake research has been of special importance in Austria. The interest in historical earthquakes in the Austro-Hungarian Empire is outlined, beginning with an initiative of the Austrian Academy of Sciences and continuing with the development of historical earthquake research as an independent research field after the 1978 "Zwentendorf plebiscite" on whether the nuclear power plant would start up. The applied methods are introduced briefly along with the most important studies, and, as an example of a recently completed case study, one of the strongest past earthquakes in Austria, the earthquake of 17 July 1670, is presented. The research into historical earthquakes in Austria concentrates on seismic events of the pre-instrumental period. The investigations are not only of historical interest but also contribute to the completeness and correctness of the Austrian earthquake catalogue, which is the basis for seismic hazard analysis and as such benefits the public, communities, civil engineers, architects, civil protection, and many others.

  4. Leveling the Playing Field: Teacher Perception of Integrated STEM, Engineering, and Engineering Practices

    NASA Astrophysics Data System (ADS)

    Fincher, Bridgette Ann

    The purpose of this study was to describe the perceptions and approaches of 14 third-through-fifth grade Arkansan elementary teachers towards integrative engineering and engineering practices during 80 hours of integrated STEM professional development training in the summer and fall of 2014. This training was known as Project Flight. The purpose of the professional development was to learn integrated STEM content related to aviation and to write grade level curriculum units using Wiggins and McTighe's Understanding by Design curriculum framework. The current study builds upon the original research. Using a mixed method exploratory, embedded QUAL[quan] case study design and a non-experimental convenience sample derived from the original 20 participants of Project Flight, this research sought to answer the following question: Does professional development influence elementary teachers' perceptions of the curriculum and instruction of integrated STEM engineering and engineering practices in a 3-to-5 grade level setting? A series of six qualitative and one quantitative sub-questions informed the mixed-method research question. Hermeneutic content analysis was applied to archival and current qualitative data sets while descriptive statistics, independent t-tests, and repeated measures ANOVA tests were performed on the quantitative data. Broad themes in the teachers' perceptions and understanding of the nature of integrated engineering and engineering practices emerged through triangulation. After the professional development and the teaching of the integrated STEM units, all 14 teachers sustained higher perceptions of personal self-efficacy in their understanding of Next Generation Science Standards (NGSS). The teachers gained understanding of engineering and engineering practices, excluding engineering habits of mind, throughout the professional development training and unit teaching.
The research resulted in four major findings specific to elementary engineering

  5. Engineering Employment Characteristics. Engineering Education and Practice in the United States.

    ERIC Educational Resources Information Center

    National Academy of Sciences - National Research Council, Washington, DC. Commission on Engineering and Technical Systems.

    This panel report was prepared as part of the study of engineering education and practice conducted under the guidance of the National Research Council's Committee on the Education and Utilization of the Engineer. The panel's goal was to provide a data base that describes the engineering work force, its main activities, capabilities, and principal…

  6. Engineering and Language Discourse Collaboration: Practice Realities

    ERIC Educational Resources Information Center

    Harran, Marcelle

    2011-01-01

    This article describes a situated engineering project at a South African HE institution which is underpinned by collaboration between Applied Language Studies (DALS) and Mechanical Engineering. The collaboration requires language practitioners and engineering experts to negotiate and collaborate on academic literacies practices, discourse…

  7. The roles of engineering notebooks in shaping elementary engineering student discourse and practice

    NASA Astrophysics Data System (ADS)

    Hertel, Jonathan D.; Cunningham, Christine M.; Kelly, Gregory J.

    2017-06-01

    Engineering design challenges offer important opportunities for students to learn science and engineering knowledge and practices. This study examines how students' engineering notebooks across four units of the curriculum Engineering is Elementary (EiE) support student work during design challenges. Through educational ethnography and discourse analysis, transcripts of student talk and action were created and coded around the uses of notebooks in the accomplishment of engineering tasks. Our coding process identified two broad categories of roles of the notebooks: they scaffold student activity and support epistemic practices of engineering. The study showed the importance of prompts to engage students in effective uses of writing, the roles the notebook assumes in the students' small groups, and the ways design challenges motivate children to write and communicate.

  8. Relative Importance of Professional Practice and Engineering Management Competencies

    ERIC Educational Resources Information Center

    Pons, Dirk

    2016-01-01

    Problem: The professional practice of engineering always involves engineering management, but it is difficult to know what specifically to include in the undergraduate curriculum. Approach: The population of New Zealand practising engineers was surveyed to determine the importance they placed on specific professional practice and engineering…

  9. THE GREAT SOUTHERN CALIFORNIA SHAKEOUT: Earthquake Science for 22 Million People

    NASA Astrophysics Data System (ADS)

    Jones, L.; Cox, D.; Perry, S.; Hudnut, K.; Benthien, M.; Bwarie, J.; Vinci, M.; Buchanan, M.; Long, K.; Sinha, S.; Collins, L.

    2008-12-01

    Earthquake science is being communicated to and used by the 22 million residents of southern California to improve resiliency to future earthquakes through the Great Southern California ShakeOut. The ShakeOut began when the USGS partnered with the California Geological Survey, Southern California Earthquake Center and many other organizations to bring 300 scientists and engineers together to formulate a comprehensive description of a plausible major earthquake, released in May 2008, as the ShakeOut Scenario, a description of the impacts and consequences of a M7.8 earthquake on the Southern San Andreas Fault (USGS OFR2008-1150). The Great Southern California ShakeOut was a week of special events featuring the largest earthquake drill in United States history. The ShakeOut drill occurred in houses, businesses, and public spaces throughout southern California at 10AM on November 13, 2008, when southern Californians were asked to pretend that the M7.8 scenario earthquake had occurred and to practice actions that could reduce the impact on their lives. Residents, organizations, schools and businesses registered to participate in the drill through www.shakeout.org where they could get accessible information about the scenario earthquake and share ideas for better preparation. As of September 8, 2008, over 2.7 million confirmed participants had been registered. The primary message of the ShakeOut is that what we do now, before a big earthquake, will determine what our lives will be like after. The goal of the ShakeOut has been to change the culture of earthquake preparedness in southern California, making earthquakes a reality that is regularly discussed. This implements the sociological finding that 'milling,' discussing a problem with loved ones, is a prerequisite to taking action. ShakeOut milling is taking place at all levels from individuals and families, to corporations and governments. Actions taken as a result of the ShakeOut include the adoption of earthquake

  10. Sizing up earthquake damage: Differing points of view

    USGS Publications Warehouse

    Hough, S.; Bolen, A.

    2007-01-01

    When a catastrophic event strikes an urban area, many different professionals hit the ground running. Emergency responders respond, reporters report, and scientists and engineers collect and analyze data. Journalists and scientists may share interest in these events, but they have very different missions. To a journalist, earthquake damage is news. To a scientist or engineer, earthquake damage represents a valuable source of data that can help us understand how strongly the ground shook as well as how particular structures responded to the shaking.

  11. Earthquake: Game-based learning for 21st century STEM education

    NASA Astrophysics Data System (ADS)

    Perkins, Abigail Christine

    To play is to learn. A lack of empirical research within game-based learning literature, however, has hindered educational stakeholders from making informed decisions about game-based learning for 21st century STEM education. In this study, I modified a research and development (R&D) process to create a collaborative-competitive educational board game illuminating elements of earthquake engineering. I oriented instruction- and game-design principles around 21st century science education to adapt the R&D process to develop the educational game, Earthquake. As part of the R&D, I evaluated Earthquake for empirical evidence to support the claim that game-play results in student gains in critical thinking, scientific argumentation, metacognitive abilities, and earthquake engineering content knowledge. I developed Earthquake with the aid of eight focus groups with varying levels of expertise in science education research, teaching, administration, and game-design. After developing a functional prototype, I pilot-tested Earthquake with teacher-participants (n=14) who engaged in semi-structured interviews after their game-play. I analyzed teacher interviews with constant comparison methodology. I used teachers' comments and feedback from content knowledge experts to integrate game modifications, implementing results to improve Earthquake. I added player roles, simplified phrasing on cards, and produced an introductory video. I then administered the modified Earthquake game to two groups of high school student-participants (n = 6), who played twice. To seek evidence documenting support for my knowledge claim, I analyzed videotapes of students' game-play using a game-based learning checklist. My assessment of learning gains revealed increases in all categories of students' performance: critical thinking, metacognition, scientific argumentation, and earthquake engineering content knowledge acquisition. Players in both student-groups improved mostly in critical thinking, having

  12. The Roles of Engineering Notebooks in Shaping Elementary Engineering Student Discourse and Practice

    ERIC Educational Resources Information Center

    Hertel, Jonathan D.; Cunningham, Christine M.; Kelly, Gregory J.

    2017-01-01

    Engineering design challenges offer important opportunities for students to learn science and engineering knowledge and practices. This study examines how students' engineering notebooks across four units of the curriculum "Engineering is Elementary" (EiE) support student work during design challenges. Through educational ethnography and…

  13. Earthquake hazards: a national threat

    USGS Publications Warehouse

    ,

    2006-01-01

    Earthquakes are one of the most costly natural hazards faced by the Nation, posing a significant risk to 75 million Americans in 39 States. The risks that earthquakes pose to society, including death, injury, and economic loss, can be greatly reduced by (1) better planning, construction, and mitigation practices before earthquakes happen, and (2) providing critical and timely information to improve response after they occur. As part of the multi-agency National Earthquake Hazards Reduction Program, the U.S. Geological Survey (USGS) has the lead Federal responsibility to provide notification of earthquakes in order to enhance public safety and to reduce losses through effective forecasts based on the best possible scientific information.

  14. Practical guidelines to select and scale earthquake records for nonlinear response history analysis of structures

    USGS Publications Warehouse

    Kalkan, Erol; Chopra, Anil K.

    2010-01-01

    Earthquake engineering practice is increasingly using nonlinear response history analysis (RHA) to demonstrate performance of structures. This rigorous method of analysis requires selection and scaling of ground motions appropriate to design hazard levels. Presented herein is a modal-pushover-based scaling (MPS) method to scale ground motions for use in nonlinear RHA of buildings and bridges. In the MPS method, the ground motions are scaled to match (to a specified tolerance) a target value of the inelastic deformation of the first-'mode' inelastic single-degree-of-freedom (SDF) system whose properties are determined by first-'mode' pushover analysis. Appropriate for first-'mode' dominated structures, this approach is extended for structures with significant contributions of higher modes by considering elastic deformation of second-'mode' SDF system in selecting a subset of the scaled ground motions. Based on results presented for two bridges, covering single- and multi-span 'ordinary standard' bridge types, and six buildings, covering low-, mid-, and tall building types in California, the accuracy and efficiency of the MPS procedure are established and its superiority over the ASCE/SEI 7-05 scaling procedure is demonstrated.
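    The record-scaling step can be illustrated numerically. The sketch below is not the authors' MPS implementation; it is a minimal, hypothetical example: a unit-mass elastoplastic single-degree-of-freedom oscillator (placeholder `period`, `zeta`, and `fy` values, which in MPS would come from first-'mode' pushover analysis) is integrated with the explicit central-difference scheme, and a record's scale factor is found by bisection so that the peak inelastic deformation matches a target value, assuming the peak grows monotonically with the scale factor.

```python
import math

def elastoplastic_sdf_peak(accel, dt, period=1.0, zeta=0.05, fy=2.0):
    """Peak displacement of a unit-mass elastoplastic SDF oscillator
    under ground acceleration `accel` (m/s^2), integrated with the
    explicit central-difference scheme (needs dt well below period/pi)."""
    k = (2.0 * math.pi / period) ** 2      # stiffness for m = 1
    c = 2.0 * zeta * math.sqrt(k)          # viscous damping coefficient
    a = 1.0 / dt**2 + c / (2.0 * dt)       # coefficient of u_{i+1}
    b = 1.0 / dt**2 - c / (2.0 * dt)       # coefficient of u_{i-1}
    u_prev = u = fs = peak = 0.0
    for ag in accel:
        u_next = (-ag - fs + (2.0 / dt**2) * u - b * u_prev) / a
        # elastoplastic restoring force, capped at the yield strength fy
        fs = max(-fy, min(fy, fs + k * (u_next - u)))
        u_prev, u = u, u_next
        peak = max(peak, abs(u))
    return peak

def scale_to_target(accel, dt, target, lo=0.01, hi=100.0):
    """Bisect for the scale factor at which the scaled record drives the
    SDF system to the target peak deformation (assumes monotonicity)."""
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if elastoplastic_sdf_peak([mid * ag for ag in accel], dt) < target:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

    In the MPS method proper, the target deformation is tied to the design hazard level and a second-'mode' elastic check then filters the scaled records; the bisection above only illustrates the matching step.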

  15. Adaptive Systems Engineering: A Medical Paradigm for Practicing Systems Engineering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    R. Douglas Hamelin; Ron D. Klingler; Christopher Dieckmann

    2011-06-01

    From its inception in the defense and aerospace industries, SE has applied holistic, interdisciplinary tools and work processes to improve the design and management of 'large, complex engineering projects.' The traditional scope of engineering in general embraces the design, development, production, and operation of physical systems, and SE, as originally conceived, falls within that scope. While this 'traditional' view has expanded over the years to embrace wider, more holistic applications, much of the literature and training currently available is still directed almost entirely at addressing the large, complex, NASA and defense-sized systems wherein the 'ideal' practice of SE provides the cradle-to-grave foundation for system development and deployment. Under such scenarios, systems engineers are viewed as an integral part of the system and project life-cycle from conception to decommissioning. In far less 'ideal' applications, SE principles are equally applicable to a growing number of complex systems and projects that need to be 'rescued' from overwhelming challenges that threaten imminent failure. The medical profession provides a unique analogy for this latter concept and offers a useful paradigm for tailoring our 'practice' of SE to address the unexpected dynamics of applying SE in the real world. In short, we can be much more effective as systems engineers as we change some of the paradigms under which we teach and 'practice' SE.

  16. The effects of earthquake measurement concepts and magnitude anchoring on individuals' perceptions of earthquake risk

    USGS Publications Warehouse

    Celsi, R.; Wolfinbarger, M.; Wald, D.

    2005-01-01

    The purpose of this research is to explore earthquake risk perceptions in California. Specifically, we examine the risk beliefs, feelings, and experiences of lay, professional, and expert individuals to explore how risk is perceived and how risk perceptions are formed relative to earthquakes. Our results indicate that individuals tend to perceptually underestimate the degree to which earthquake (EQ) events may affect them. This occurs in large part because individuals' personal felt experience of EQ events is generally overestimated relative to experienced magnitudes. An important finding is that individuals engage in a process of "cognitive anchoring" of their felt EQ experience toward the reported earthquake magnitude. The anchoring effect is moderated by the degree to which individuals comprehend EQ magnitude measurement and EQ attenuation. Overall, the results of this research provide us with a deeper understanding of EQ risk perceptions, especially as they relate to individuals' understanding of EQ measurement and attenuation concepts. © 2005, Earthquake Engineering Research Institute.

  17. The next new Madrid earthquake

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Atkinson, W.

    1988-01-01

    Scientists who specialize in the study of Mississippi Valley earthquakes say that the region is overdue for a powerful tremor that will cause major damage and undoubtedly some casualties. The inevitability of a future quake and the lack of preparation by both individuals and communities provided the impetus for this book. It brings together applicable information from many disciplines: history, geology and seismology, engineering, zoology, politics and community planning, economics, environmental science, sociology, and psychology and mental health to provide a perspective of the myriad impacts of a major earthquake on the Mississippi Valley. The author addresses such basic questions as: What, actually, are earthquakes? How do they occur? Can they be predicted, perhaps even prevented? He also addresses those steps that individuals can take to improve their chances for survival both during and after an earthquake.

  18. Creating a Global Building Inventory for Earthquake Loss Assessment and Risk Management

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David J.

    2008-01-01

    contribution of building stock, its relative vulnerability, and distribution are vital components for determining the extent of casualties during an earthquake. It is evident from large deadly historical earthquakes that the distribution of vulnerable structures and their occupancy level during an earthquake control the severity of human losses. For example, though the number of strong earthquakes in California is comparable to that of Iran, the total earthquake-related casualties in California during the last 100 years are dramatically lower than the casualties from several individual Iranian earthquakes. The relatively low casualty count in California is attributed mainly to the fact that more than 90 percent of the building stock in California is made of wood and is designed to withstand moderate to large earthquakes (Kircher, Seligson and others, 2006). In contrast, the 80 percent adobe and/or non-engineered masonry building stock with poor lateral-load-resisting systems in Iran succumbs even at moderate levels of ground shaking. Consequently, the heavy death toll for the 2003 Bam, Iran earthquake, which claimed 31,828 lives (Ghafory-Ashtiany and Mousavi, 2005), is directly attributable to such poorly resistant construction, and future events will produce comparable losses unless practices change. Similarly, multistory, precast-concrete framed buildings caused heavy casualties in the 1988 Spitak, Armenia earthquake (Bertero, 1989); weaker masonry and reinforced-concrete framed construction designed for gravity loads with soft first stories dominated losses in the Bhuj, India earthquake of 2001 (Madabhushi and Haigh, 2005); and adobe and weak masonry dwellings in Peru controlled the death toll in the Peru earthquake of 2007 (Taucer, J. and others, 2007).
Spence (2007), after conducting a brief survey of the most lethal earthquakes since 1960, found that building collapse remains a major cause of earthquake mortality and unreinforced masonry buildings are one of the mos

  19. Earthquake casualty models within the USGS Prompt Assessment of Global Earthquakes for Response (PAGER) system

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David J.; Earle, Paul S.; Porter, Keith A.; Hearne, Mike

    2011-01-01

    Since the launch of the USGS’s Prompt Assessment of Global Earthquakes for Response (PAGER) system in fall of 2007, the time needed for the U.S. Geological Survey (USGS) to determine and comprehend the scope of any major earthquake disaster anywhere in the world has been dramatically reduced to less than 30 min. PAGER alerts consist of estimated shaking hazard from the ShakeMap system, estimates of population exposure at various shaking intensities, and a list of the most severely shaken cities in the epicentral area. These estimates help government, scientific, and relief agencies to guide their responses in the immediate aftermath of a significant earthquake. To account for wide variability and uncertainty associated with inventory, structural vulnerability and casualty data, PAGER employs three different global earthquake fatality/loss computation models. This article describes the development of the models and demonstrates the loss estimation capability for earthquakes that have occurred since 2007. The empirical model relies on country-specific earthquake loss data from past earthquakes and makes use of calibrated casualty rates for future prediction. The semi-empirical and analytical models are engineering-based and rely on complex datasets including building inventories, time-dependent population distributions within different occupancies, the vulnerability of regional building stocks, and casualty rates given structural collapse.
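    The structure of the empirical model can be sketched in a few lines. In the published approach, the fatality rate at a given shaking intensity is modeled as a two-parameter lognormal cumulative distribution function, with country-specific parameters; the parameter values below are illustrative placeholders, not calibrated coefficients.

```python
import math

def fatality_rate(mmi, theta, beta):
    """Fraction of the exposed population killed at shaking intensity
    `mmi`: a lognormal CDF where theta is the intensity at a 50% rate
    and beta controls the spread. Parameter values are illustrative."""
    return 0.5 * (1.0 + math.erf(math.log(mmi / theta) / (beta * math.sqrt(2.0))))

def expected_fatalities(exposure, theta, beta):
    """Aggregate losses as rate(MMI) times exposed population, summed
    over intensity bins (exposure maps MMI -> population)."""
    return sum(pop * fatality_rate(mmi, theta, beta)
               for mmi, pop in exposure.items())
```

    Higher intensities produce higher rates, so most of the expected loss typically comes from the population exposed in the strongest-shaking bins.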

  20. Earthquake Education in Prime Time

    NASA Astrophysics Data System (ADS)

    de Groot, R.; Abbott, P.; Benthien, M.

    2004-12-01

    Since 2001, the Southern California Earthquake Center (SCEC) has collaborated on several video production projects that feature important topics related to earthquake science, engineering, and preparedness. These projects have also fostered many fruitful and sustained partnerships with a variety of organizations that have a stake in hazard education and preparedness. The Seismic Sleuths educational video first appeared in spring 2001 on Discovery Channel's Assignment Discovery. Seismic Sleuths is based on a highly successful curriculum package developed jointly by the American Geophysical Union and The Department of Homeland Security Federal Emergency Management Agency. The California Earthquake Authority (CEA) and the Institute for Business and Home Safety supported the video project. Summer Productions, a company with a reputation for quality science programming, produced the Seismic Sleuths program in close partnership with scientists, engineers, and preparedness experts. The program has aired on the National Geographic Channel as recently as Fall 2004. Currently, SCEC is collaborating with Pat Abbott, a geology professor at San Diego State University (SDSU) on the video project Written In Stone: Earthquake Country - Los Angeles. Partners on this project include the California Seismic Safety Commission, SDSU, SCEC, CEA, and the Insurance Information Network of California. This video incorporates live-action demonstrations, vivid animations, and a compelling host (Abbott) to tell the story about earthquakes in the Los Angeles region. The Written in Stone team has also developed a comprehensive educator package that includes the video, maps, lesson plans, and other supporting materials. We will present the process that facilitates the creation of visually effective, factually accurate, and entertaining video programs.
We acknowledge the need to have a broad understanding of the literature related to communication, media studies, science education, and

  1. 47 CFR 73.508 - Standards of good engineering practice.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 4 2010-10-01 2010-10-01 false Standards of good engineering practice. 73.508 Section 73.508 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) BROADCAST RADIO SERVICES... engineering practice. (a) All noncommercial educational stations and LPFM stations operating with more than 10...

  2. 47 CFR 73.508 - Standards of good engineering practice.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 4 2014-10-01 2014-10-01 false Standards of good engineering practice. 73.508 Section 73.508 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) BROADCAST RADIO SERVICES... engineering practice. (a) All noncommercial educational stations and LPFM stations operating with more than 10...

  3. 47 CFR 73.508 - Standards of good engineering practice.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 4 2012-10-01 2012-10-01 false Standards of good engineering practice. 73.508 Section 73.508 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) BROADCAST RADIO SERVICES... engineering practice. (a) All noncommercial educational stations and LPFM stations operating with more than 10...

  4. 47 CFR 73.508 - Standards of good engineering practice.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 4 2013-10-01 2013-10-01 false Standards of good engineering practice. 73.508 Section 73.508 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) BROADCAST RADIO SERVICES... engineering practice. (a) All noncommercial educational stations and LPFM stations operating with more than 10...

  5. 47 CFR 73.508 - Standards of good engineering practice.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 4 2011-10-01 2011-10-01 false Standards of good engineering practice. 73.508 Section 73.508 Telecommunication FEDERAL COMMUNICATIONS COMMISSION (CONTINUED) BROADCAST RADIO SERVICES... engineering practice. (a) All noncommercial educational stations and LPFM stations operating with more than 10...

  6. Is there a basis for preferring characteristic earthquakes over a Gutenberg–Richter distribution in probabilistic earthquake forecasting?

    USGS Publications Warehouse

    Parsons, Thomas E.; Geist, Eric L.

    2009-01-01

    The idea that faults rupture in repeated, characteristic earthquakes is central to most probabilistic earthquake forecasts. The concept is elegant in its simplicity, and if the same event has repeated itself multiple times in the past, we might anticipate the next. In practice however, assembling a fault-segmented characteristic earthquake rupture model can grow into a complex task laden with unquantified uncertainty. We weigh the evidence that supports characteristic earthquakes against a potentially simpler model made from extrapolation of a Gutenberg–Richter magnitude-frequency law to individual fault zones. We find that the Gutenberg–Richter model satisfies key data constraints used for earthquake forecasting as well as a characteristic model does. Therefore, judicious use of instrumental and historical earthquake catalogs enables large-earthquake-rate calculations with quantifiable uncertainty that should get at least equal weighting in probabilistic forecasting.
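    The extrapolation the abstract describes is easy to state concretely. A minimal sketch with hypothetical a- and b-values (not fitted ones): the Gutenberg–Richter law gives the annual rate of events at or above magnitude M as N(M) = 10^(a − bM), and a Poisson assumption converts that rate into an occurrence probability over a forecast window.

```python
import math

def gr_annual_rate(a, b, m):
    """Annual rate of earthquakes with magnitude >= m under the
    Gutenberg-Richter relation log10 N(m) = a - b*m."""
    return 10.0 ** (a - b * m)

def poisson_probability(annual_rate, years):
    """Probability of at least one event in `years`, assuming events
    follow a Poisson process with the given annual rate."""
    return 1.0 - math.exp(-annual_rate * years)
```

    With a hypothetical catalog fit of a = 4.0 and b = 1.0, the rate of M >= 7 events is 10^-3 per year, which under the Poisson assumption gives roughly a 3% chance of at least one such event in a 30-year window.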

  7. 76 FR 11821 - Submission for OMB Review; Comment Request Survey of Principal Investigators on Earthquake...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-03

    ...: Survey of Principal Investigators on Earthquake Engineering Research Awards Made by the National Science... survey of Principal Investigators on NSF earthquake engineering research awards, including but not... NATIONAL SCIENCE FOUNDATION Submission for OMB Review; Comment Request Survey of Principal...

  8. Understanding Earthquake Hazard & Disaster in Himalaya - A Perspective on Earthquake Forecast in Himalayan Region of South Central Tibet

    NASA Astrophysics Data System (ADS)

    Shanker, D.; Paudyal; Singh, H.

    2010-12-01

    Vital to earthquake mitigation are not only a basic understanding of the phenomenon of the earthquake and the resistance offered by a designed structure, but also socio-economic factors, the engineering properties of indigenous materials, local skill, and technology-transfer models. It is important that the engineering aspects of mitigation be made part of public policy documents. Earthquakes are, and have long been thought of as, one of the worst enemies of mankind. Because of the very nature of the energy release, damage is evident; it will not, however, culminate in a disaster unless the earthquake strikes a populated area. Mitigation may be defined as the reduction in severity of something; earthquake disaster mitigation therefore implies measures that help reduce the severity of damage caused by earthquakes to life, property and the environment. “Earthquake disaster mitigation” usually refers primarily to interventions that strengthen the built environment, whereas “earthquake protection” is now considered to include the human, social and administrative aspects of reducing earthquake effects. It should, however, be noted that reduction of earthquake hazards through prediction is considered one of the effective measures, and much effort is spent on prediction strategies. Earthquake prediction does not guarantee safety, and even a correct prediction leaves damage to life and property on a scale that warrants the other aspects of mitigation; while prediction may be of some help, mitigation remains the main focus of attention of civil society. The present study suggests that anomalous seismic activity/earthquake swarms existed prior to medium-size earthquakes in the Nepal Himalaya. The mainshocks were preceded by a quiescence period, which is an indication of the occurrence of future seismic activity. In all the cases, the identified episodes of anomalous seismic activity were

  9. The Loma Prieta, California, earthquake of October 17, 1989 - Public response: Chapter B in The Loma Prieta, California, earthquake of October 17, 1989: Societal Response (Professional Paper 1553)

    USGS Publications Warehouse

    Bolton, Patricia A.

    1993-01-01

    Major earthquakes provide seismologists and engineers an opportunity to examine the performance of the Earth and the man-made structures in response to the forces of the quake. So, too, do they provide social scientists an opportunity to delve into human responses evoked by the ground shaking and its physical consequences. The findings from such research can serve to guide the development and application of programs and practices designed to reduce death, injury, property losses, and social disruption in subsequent earthquakes. This chapter contains findings from studies focused mainly on public response to the Loma Prieta earthquake; that is, on the behavior and perceptions of the general population rather than on the activities of specific organizations or on the impact on procedures or policies. A major feature of several of these studies is that the information was collected from the population throughout the Bay area, not just from persons in the most badly damaged communities or who had suffered the greatest losses. This wide range serves to provide comparisons of behavior for those most directly affected by the earthquake with others who were less directly affected by it but still had to consider it very “close to home.”

  10. Journal of the Chinese Institute of Engineers. Special Issue: Commemoration of Chi-Chi Earthquake (II)

    NASA Astrophysics Data System (ADS)

    2002-09-01

    Contents include the following: Deep Electromagnetic Images of Seismogenic Zone of the Chi-Chi (Taiwan) Earthquake; New Techniques for Stress-Forecasting Earthquakes; Aspects of Characteristics of Near-Fault Ground Motions of the 1999 Chi-Chi (Taiwan) Earthquake; Liquefaction Damage and Related Remediation in Wufeng after the Chi-Chi Earthquake; Fines Content Effects on Liquefaction Potential Evaluation for Sites Liquefied during Chi-Chi Earthquake 1999; Damage Investigation and Liquefaction Potential Analysis of Gravelly Soil; Dynamic Characteristics of Soils in Yuan-Lin Liquefaction Area; A Preliminary Study of Earthquake Building Damage and Life Loss Due to the Chi-Chi Earthquake; Statistical Analyses of Relation between Mortality and Building Type in the 1999 Chi-Chi Earthquake; Development of an After Earthquake Disaster Shelter Evaluation Model; Posttraumatic Stress Reactions in Children and Adolescents One Year after the 1999 Taiwan Chi-Chi Earthquake; Changes or Not is the Question: the Meaning of Posttraumatic Stress Reactions One Year after the Taiwan Chi-Chi Earthquake.

  11. Real-Time Earthquake Monitoring with Spatio-Temporal Fields

    NASA Astrophysics Data System (ADS)

    Whittier, J. C.; Nittel, S.; Subasinghe, I.

    2017-10-01

    With live streaming sensors and sensor networks, increasingly large numbers of individual sensors are deployed in physical space. Sensor data streams are a fundamentally novel mechanism to deliver observations to information systems. They enable us to represent spatio-temporal continuous phenomena such as radiation accidents, toxic plumes, or earthquakes almost as instantaneously as they happen in the real world. Sensor data streams discretely sample an earthquake, while the earthquake is continuous over space and time. Programmers attempting to integrate many streams to analyze earthquake activity and scope need to write tedious application code to integrate potentially very large sets of asynchronously sampled, concurrent streams. In previous work, we proposed the field stream data model (Liang et al., 2016) for data stream engines. Abstracting the stream of an individual sensor as a temporal field, the field represents the Earth's movement at the sensor position as continuous. This simplifies analysis across many sensors significantly. In this paper, we undertake a feasibility study of using the field stream model and the open source Data Stream Engine (DSE) Apache Spark (Apache Spark, 2017) to implement real-time earthquake event detection with a subset of the 250 GPS sensor data streams of the Southern California Integrated GPS Network (SCIGN). The field-based real-time stream queries compute maximum displacement values over the latest query window of each stream, and relate spatially neighboring streams to identify earthquake events and their extent. Further, we correlate the detected events with a USGS earthquake event feed. The query results are visualized in real-time.
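
    The core per-stream query (maximum displacement over the latest window) can be sketched without a distributed engine. This is a minimal single-process sketch, not the SCIGN/Spark implementation; the class name `DisplacementField`, the window length, and the `(time, displacement)` sample format are all illustrative assumptions:

    ```python
    from collections import defaultdict, deque

    class DisplacementField:
        """Per-sensor sliding time window of (time, displacement) samples,
        supporting a max-displacement query over the latest window."""

        def __init__(self, window_sec=10.0):
            self.window_sec = window_sec
            self.samples = defaultdict(deque)  # sensor_id -> deque of (t, disp)

        def ingest(self, sensor_id, t, disp):
            """Append a new sample and evict samples older than the window."""
            buf = self.samples[sensor_id]
            buf.append((t, disp))
            while buf and buf[0][0] < t - self.window_sec:
                buf.popleft()

        def max_displacement(self, sensor_id):
            """Maximum displacement in the sensor's current window, or None."""
            buf = self.samples[sensor_id]
            return max(d for _, d in buf) if buf else None
    ```

    Relating spatially neighboring streams, as the abstract describes, would then amount to comparing `max_displacement` values across nearby sensor ids within the same window.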

  12. GN&C Engineering Best Practices for Human-Rated Spacecraft Systems

    NASA Technical Reports Server (NTRS)

    Dennehy, Cornelius J.; Lebsock, Kenneth; West, John

    2007-01-01

    The NASA Engineering and Safety Center (NESC) recently completed an in-depth assessment to identify a comprehensive set of engineering considerations for the Design, Development, Test and Evaluation (DDT&E) of safe and reliable human-rated spacecraft systems. Reliability subject matter experts, discipline experts, and systems engineering experts were brought together to synthesize the current "best practices" both at the spacecraft system and subsystems levels. The objective of this paper is to summarize, for the larger Community of Practice, the initial set of Guidance, Navigation and Control (GN&C) engineering Best Practices as identified by this NESC assessment process.

  13. GN&C Engineering Best Practices for Human-Rated Spacecraft System

    NASA Technical Reports Server (NTRS)

    Dennehy, Cornelius J.; Lebsock, Kenneth; West, John

    2008-01-01

    The NASA Engineering and Safety Center (NESC) recently completed an in-depth assessment to identify a comprehensive set of engineering considerations for the Design, Development, Test and Evaluation (DDT&E) of safe and reliable human-rated spacecraft systems. Reliability subject matter experts, discipline experts, and systems engineering experts were brought together to synthesize the current "best practices" both at the spacecraft system and subsystems levels. The objective of this paper is to summarize, for the larger Community of Practice, the initial set of Guidance, Navigation and Control (GN&C) engineering Best Practices as identified by this NESC assessment process.

  15. The Alaska earthquake, March 27, 1964: lessons and conclusions

    USGS Publications Warehouse

    Eckel, Edwin B.

    1970-01-01

    One of the greatest earthquakes of all time struck south-central Alaska on March 27, 1964. Strong motion lasted longer than for most recorded earthquakes, and more land surface was dislocated, vertically and horizontally, than by any known previous temblor. Never before were so many effects on earth processes and on the works of man available for study by scientists and engineers over so great an area. The seismic vibrations, which directly or indirectly caused most of the damage, were but surface manifestations of a great geologic event: the dislocation of a huge segment of the crust along a deeply buried fault whose nature and even exact location are still subjects for speculation. Not only was the land surface tilted by the great tectonic event beneath it, with resultant seismic sea waves that traversed the entire Pacific, but an enormous mass of land and sea floor moved several tens of feet horizontally toward the Gulf of Alaska. Downslope mass movements of rock, earth, and snow were initiated. Subaqueous slides along lake shores and seacoasts, near-horizontal movements of mobilized soil (“landspreading”), and giant translatory slides in sensitive clay did the most damage and provided the most new knowledge as to the origin, mechanics, and possible means of control or avoidance of such movements. The slopes of most of the deltas that slid in 1964, and that produced destructive local waves, are still as steep or steeper than they were before the earthquake and hence would be unstable or metastable in the event of another great earthquake. Rockslide avalanches provided new evidence that such masses may travel on cushions of compressed air, but a widely held theory that glaciers surge after an earthquake has not been substantiated. Innumerable ground fissures, many of them marked by copious emissions of water, caused much damage in towns and along transportation routes. Vibration also consolidated loose granular materials. In some coastal areas, local

  16. History of Modern Earthquake Hazard Mapping and Assessment in California Using a Deterministic or Scenario Approach

    NASA Astrophysics Data System (ADS)

    Mualchin, Lalliana

    2011-03-01

    results at that time. CDMG eventually published the second edition map in 1992 following the Governor's Board of Inquiry on the 1989 Loma Prieta earthquake and at the demand of Caltrans. The third edition map was published by Caltrans in 1996, utilizing GIS technology to manage data that include a simplified three-dimensional geometry of faults and to facilitate efficient corrections and revisions of data and the map. The spatial relationship of fault hazards with highways, bridges, or any other attribute can now be efficiently managed and analyzed in GIS at Caltrans. There has been great confidence in using DSHA in bridge engineering and other applications in California, and it can be confidently applied in any other earthquake-prone region. Earthquake hazards defined by DSHA are: (1) transparent and stable, with robust MCE moment magnitudes; (2) flexible in their application to design considerations; (3) able to incorporate advances in ground motion simulations easily; and (4) economical. DSHA and neo-DSHA have the same approach and applicability. The accuracy of DSHA has proven to be quite reasonable for practical applications within engineering design, always exercised with professional judgment. In the final analysis, DSHA is a reality check for public safety and for PSHA results. Although PSHA has been acclaimed as a better approach for seismic hazard assessment, it is DSHA, not PSHA, that has actually been used in seismic hazard assessment for building and bridge engineering, particularly in California.

  17. Software engineering standards and practices

    NASA Technical Reports Server (NTRS)

    Durachka, R. W.

    1981-01-01

    Guidelines are presented for the preparation of a software development plan. The various phases of a software development project are discussed throughout its life cycle including a general description of the software engineering standards and practices to be followed during each phase.

  18. Investigating Engineering Practice Is Valuable for Mathematics Learning

    ERIC Educational Resources Information Center

    Goold, Eileen

    2015-01-01

    While engineering mathematics curricula often prescribe a fixed body of mathematical knowledge, this study takes a different approach; second-year engineering students are additionally required to investigate and document an aspect of mathematics used in engineering practice. A qualitative approach is used to evaluate the impact that students'…

  19. The Lisbon earthquake and its aftershocks in European Enlightenment thinking and planning practice

    NASA Astrophysics Data System (ADS)

    de Meyer, Dirk

    2010-05-01

    From the Old Testament to Hollywood B-movies, from Sodom and Gomorrah to Los Angeles, the city is the topos of cataclysm. The first city-wide catastrophe to have a major impact on modern European thinking was the earthquake and tsunami that, in 1755, reduced Lisbon to ruins and killed about one tenth of its population. This paper looks at the contemporary representations of that catastrophe and at its impact on European Enlightenment thinking and urban planning practices. For Voltaire and Kant, the Lisbon disaster had a major impact on the development of their philosophical and aesthetic concepts. It strengthened Rousseau in his anti-urban thinking. For many others it gave rise to the modern notion later coined by Paul Valéry in the opening sentence of La crise de l'esprit: "we civilizations now know that we are mortal." I will argue in more detail how, contrary to the rebuilding after the fire of London, where both proposed and realised plans hardly represented a radical new way of conceiving a city plan, the Lisbon reconstruction under the direction of the King's Prime Minister Pombal can be understood as the start of urban planning as a modern practice, as opposed to earlier, architect-directed Renaissance and Baroque planning. On a smaller scale, we will look at the implementation, in the aftermath of the earthquake, of new anti-seismic building techniques in Lisbon's new constructions.

  20. Validation of simulated earthquake ground motions based on evolution of intensity and frequency content

    USGS Publications Warehouse

    Rezaeian, Sanaz; Zhong, Peng; Hartzell, Stephen; Zareian, Farzin

    2015-01-01

    Simulated earthquake ground motions can be used in many recent engineering applications that require time series as input excitations. However, applicability and validation of simulations are subjects of debate in the seismological and engineering communities. We propose a validation methodology at the waveform level and directly based on characteristics that are expected to influence most structural and geotechnical response parameters. In particular, three time-dependent validation metrics are used to evaluate the evolving intensity, frequency, and bandwidth of a waveform. These validation metrics capture nonstationarities in intensity and frequency content of waveforms, making them ideal to address nonlinear response of structural systems. A two-component error vector is proposed to quantify the average and shape differences between these validation metrics for a simulated and recorded ground-motion pair. Because these metrics are directly related to the waveform characteristics, they provide easily interpretable feedback to seismologists for modifying their ground-motion simulation models. To further simplify the use and interpretation of these metrics for engineers, it is shown how six scalar key parameters, including duration, intensity, and predominant frequency, can be extracted from the validation metrics. The proposed validation methodology is a step forward in paving the road for utilization of simulated ground motions in engineering practice and is demonstrated using examples of recorded and simulated ground motions from the 1994 Northridge, California, earthquake.
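
    A common proxy for the evolving-intensity metric described above is the normalized cumulative squared acceleration (Husid curve). The sketch below uses that proxy and an illustrative two-component error (mean difference plus residual shape error); the paper's exact metric and error-vector definitions differ, so treat these formulas as stand-ins:

    ```python
    import numpy as np

    def normalized_arias(acc, dt):
        """Husid curve: cumulative squared acceleration normalized to 1.0,
        a proxy for a waveform's evolving-intensity metric."""
        ia = np.cumsum(np.asarray(acc, dtype=float) ** 2) * dt
        return ia / ia[-1]

    def error_vector(metric_sim, metric_rec):
        """Illustrative two-component error between a simulated and a
        recorded metric: (average difference, RMS of the mean-removed
        difference as a shape error)."""
        diff = np.asarray(metric_sim) - np.asarray(metric_rec)
        avg = float(np.mean(diff))
        shape = float(np.sqrt(np.mean((diff - avg) ** 2)))
        return avg, shape
    ```

    A simulated/recorded pair whose Husid curves grow at the same times yields a small error vector; a simulation whose energy arrives too early or too late shows up in the shape component.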

  1. Unraveling earthquake stresses: Insights from dynamically triggered and induced earthquakes

    NASA Astrophysics Data System (ADS)

    Velasco, A. A.; Alfaro-Diaz, R. A.

    2017-12-01

    Induced seismicity, earthquakes caused by anthropogenic activity, has more than doubled in the last several years, resulting from practices related to oil and gas production. Furthermore, large earthquakes have been shown to promote the triggering of other events within two fault lengths (static triggering), due to static stresses caused by physical movement along the fault, and also remotely from the passage of seismic waves (dynamic triggering). Thus, in order to understand the mechanisms for earthquake failure, we investigate regions where natural, induced, and dynamically triggered events occur, and specifically target Oklahoma. We first analyze data from EarthScope's USArray Transportable Array (TA) and local seismic networks, implementing an optimized short-term average / long-term average (STA/LTA) detector in order to develop local detection and earthquake catalogs. After identifying triggered events through statistical analysis, we perform a stress analysis to gain insight on the stress states leading to triggered earthquake failure. We use our observations to determine the role of different transient stresses in contributing to natural and induced seismicity by comparing these stresses to regional stress orientation. We also delineate critically stressed regions of triggered seismicity that may indicate areas susceptible to earthquake hazards associated with sustained fluid injection in provinces of induced seismicity. Anthropogenic injection and extraction activity can alter the stress state and fluid flow within production basins. By analyzing the stress release of these ancient faults caused by dynamic stresses, we may be able to determine if fluids are solely responsible for increased seismic activity in induced regions.
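
    The classic STA/LTA trigger behind such detectors compares a short trailing average of the rectified signal to a long one. A minimal sketch, assuming trailing windows measured in samples; the window lengths and threshold are illustrative, and the project's optimized detector will differ in detail:

    ```python
    import numpy as np

    def sta_lta(signal, sta_len, lta_len):
        """Trailing-window STA/LTA ratio on |signal|.
        sta_len and lta_len are window lengths in samples (sta_len < lta_len)."""
        x = np.abs(np.asarray(signal, dtype=float))
        csum = np.cumsum(np.insert(x, 0, 0.0))  # csum[i] = sum of x[:i]
        ratio = np.zeros(len(x))
        for i in range(lta_len, len(x)):
            sta = (csum[i + 1] - csum[i + 1 - sta_len]) / sta_len
            lta = (csum[i + 1] - csum[i + 1 - lta_len]) / lta_len
            ratio[i] = sta / lta if lta > 0 else 0.0
        return ratio

    def detect(signal, sta_len=5, lta_len=50, threshold=3.0):
        """Sample indices where the STA/LTA ratio exceeds the trigger threshold."""
        return np.flatnonzero(sta_lta(signal, sta_len, lta_len) > threshold)
    ```

    An emergent signal raises the short-term average well before the long-term average catches up, so the ratio spikes at event onsets while staying near 1 in steady noise.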

  2. Earthquake alarm; operating the seismograph station at the University of California, Berkeley.

    USGS Publications Warehouse

    Stump, B.

    1980-01-01

    At the University of California seismographic stations, the task of locating and determining magnitudes for both local and distant earthquakes is a continuous one. Teleseisms must be located rapidly so that events that occur in the Pacific can be identified and the Pacific Tsunami Warning System alerted. For great earthquakes anywhere, there is a responsibility to notify public agencies such as the California Office of Emergency Services, the Federal Disaster Assistance Administration, the Earthquake Engineering Research Institute, the California Seismic Safety Commission, and the American Red Cross. In the case of damaging local earthquakes, it is necessary also to alert the California Department of Water Resources, California Division of Mines and Geology, U.S. Army Corps of Engineers, Federal Bureau of Reclamation, and the Bay Area Rapid Transit. These days, any earthquakes that are felt in northern California cause immediate inquiries from the news media and an interested public. The series of earthquakes that jolted the Livermore area from January 24 to 26, 1980, is a good case in point.

  3. Reconnaissance engineering geology of Sitka and vicinity, Alaska, with emphasis on evaluation of earthquake and other geologic hazards

    USGS Publications Warehouse

    Yehle, Lynn A.

    1974-01-01

    A program to study the engineering geology of most of the larger Alaska coastal communities and to evaluate their earthquake and other geologic hazards was started following the 1964 Alaska earthquake; this report about Sitka and vicinity is a product of that program. Field-study methods were of a reconnaissance nature, and thus the interpretations in the report are subject to revision as further information becomes available. This report can provide broad geologic guidelines for planners and engineers during preparation of land-use plans. The use of this information should lead to minimizing future loss of life and property due to geologic hazards, especially during very large earthquakes. The landscape of Sitka and the surrounding area is characterized by numerous islands and a narrow strip of gently rolling ground adjacent to rugged mountains; steep valleys and some fiords cut sharply into the mountains. A few valley floors are wide and flat and grade into moderate-sized deltas. Glaciers throughout southeastern Alaska and elsewhere became vastly enlarged during the Pleistocene Epoch. The Sitka area presumably was covered by ice several times; glaciers deeply eroded some valleys and removed fractured bedrock along some faults. The last major deglaciation occurred sometime before 10,000 years ago. Crustal rebound believed to be related to glacial melting caused land emergence at Sitka of at least 35 feet (10.7 m) relative to present sea level. Bedrock at Sitka and vicinity is composed mostly of bedded, hard, dense graywacke and some argillite. Beds strike predominantly northwest and are vertical or steeply dipping. Locally, bedded rocks are cut by dikes of fine-grained igneous rock. Most bedrock is of Jurassic and Cretaceous age. Eight types of surficial deposits of Quaternary age were recognized. Below altitudes of 35 feet (10.7 m), the dominant deposits are those of modern and elevated shores and deltas; at higher altitudes, widespread muskeg overlies a mantle of

  4. Correlations between ground motion and building damage. Engineering intensity scale applied to the San Fernando earthquake of February 1971

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hafen, D.; Kintzer, F.C.

    1977-11-01

    The correlation between ground motion and building damage was investigated for the San Fernando earthquake of 1971. A series of iso-intensity maps was compiled to summarize the ground motion in terms of the Blume Engineering Intensity Scale (EIS). This involved the analysis of ground motion records from 62 stations in the Los Angeles area. Damage information for low-rise buildings was obtained in the form of records of loans granted by the Small Business Administration to repair earthquake damage. High-rise damage evaluations were based on direct inquiry and building inspection. Damage factors (ratio of damage repair cost to building value) were calculated and summarized on contour maps. A statistical study was then undertaken to determine relationships between ground motion and damage factor. Several parameters for ground motion were considered and evaluated by means of correlation coefficients.
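
    The two computations named in the abstract, the damage factor and the correlation coefficient, can be sketched directly; the study's exact statistical treatment is not given in the snippet, and Pearson correlation is assumed here as the standard choice:

    ```python
    def damage_factor(repair_cost, building_value):
        """Damage factor: ratio of damage repair cost to building value."""
        return repair_cost / building_value

    def pearson_r(x, y):
        """Pearson correlation coefficient between two equal-length sequences,
        e.g. a ground-motion parameter vs. damage factors across stations."""
        n = len(x)
        mx, my = sum(x) / n, sum(y) / n
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        sx = sum((a - mx) ** 2 for a in x) ** 0.5
        sy = sum((b - my) ** 2 for b in y) ** 0.5
        return cov / (sx * sy)
    ```

    A repair bill of $20,000 on a $100,000 building gives a damage factor of 0.2; computing `pearson_r` between each candidate ground-motion parameter and the station damage factors is the evaluation step the abstract describes.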

  5. Advancing the practice of systems engineering at JPL

    NASA Technical Reports Server (NTRS)

    Jansma, Patti A.; Jones, Ross M.

    2006-01-01

    In FY 2004, JPL launched an initiative to improve the way it practices systems engineering. The Lab's senior management formed the Systems Engineering Advancement (SEA) Project in order to "significantly advance the practice and organizational capabilities of systems engineering at JPL on flight projects and ground support tasks." The scope of the SEA Project includes the systems engineering work performed in all three dimensions of a program, project, or task: 1. the full life-cycle, i.e., concept through end of operations; 2. the full depth, i.e., Program, Project, System, Subsystem, Element (SE Levels 1 to 5); 3. the full technical scope, e.g., the flight, ground and launch systems, avionics, power, propulsion, telecommunications, thermal, etc. The initial focus of their efforts defined the following basic systems engineering functions at JPL: systems architecture, requirements management, interface definition, technical resource management, system design and analysis, system verification and validation, risk management, technical peer reviews, design process management, and systems engineering task management. They also developed a list of highly valued personal behaviors of systems engineers, and are working to inculcate those behaviors into members of their systems engineering community. The SEA Project is developing products, services, and training to support managers and practitioners throughout the entire system lifecycle. As these are developed, each one needs to be systematically deployed. Hence, the SEA Project developed a deployment process that includes four aspects: infrastructure and operations, communication and outreach, education and training, and consulting support. In addition, the SEA Project has taken a proactive approach to organizational change management and customer relationship management - both concepts and approaches not usually invoked in an engineering environment. This paper describes JPL's approach to advancing the practice of

  6. Problem Solving and Engineering Design, Introducing Bachelor Students to Engineering Practice at K. U. Leuven

    ERIC Educational Resources Information Center

    Heylen, Christel; Smet, Marc; Buelens, Hermans; Sloten, Jos Vander

    2007-01-01

    A present-day engineer has a large scientific knowledge; he is a team-player, eloquent communicator and life-long learner. At the Katholieke Universiteit Leuven, the course "Problem Solving and Engineering Design" introduces engineering students from the first semester onwards into real engineering practice and teamwork. Working in small…

  7. Proceedings of the Regional Seminar on Earthquake Engineering (13th) Held in Istanbul, Turkey on 14-24 September 1987.

    DTIC Science & Technology

    1987-09-01

    Geological Survey, MS977, Menlo Park, CA 94025, USA. TURKISH NATIONAL COMMITTEE FOR EARTHQUAKE ENGINEERING, THIRTEENTH REGIONAL SEMINAR ON EARTHQUAKE...this case the conditional probability P(E/F1) will also depend in general on t. A simple example of a case of this type was developed by the present...These studies took into consideration all the available data concerning the dynamic characteristics of different types of buildings. A first attempt was

  8. Expanded Guidance for NASA Systems Engineering. Volume 1: Systems Engineering Practices

    NASA Technical Reports Server (NTRS)

    Hirshorn, Steven R.

    2016-01-01

    This document is intended to provide general guidance and information on systems engineering that will be useful to the NASA community. It provides a generic description of Systems Engineering (SE) as it should be applied throughout NASA. A goal of the expanded guidance is to increase awareness and consistency across the Agency and advance the practice of SE. This guidance provides perspectives relevant to NASA and data particular to NASA. This expanded guidance should be used as a companion for implementing NPR 7123.1, Systems Engineering Processes and Requirements, the Rev 2 version of SP-6105, and the Center-specific handbooks and directives developed for implementing systems engineering at NASA. It provides a companion reference book for the various systems engineering-related training being offered under NASA's auspices.

  9. Earthquake Hazard and the Environmental Seismic Intensity (ESI) Scale

    NASA Astrophysics Data System (ADS)

    Serva, Leonello; Vittori, Eutizio; Comerci, Valerio; Esposito, Eliana; Guerrieri, Luca; Michetti, Alessandro Maria; Mohammadioun, Bagher; Mohammadioun, Georgianna C.; Porfido, Sabina; Tatevossian, Ruben E.

    2016-05-01

    The main objective of this paper was to introduce the Environmental Seismic Intensity scale (ESI), a new scale developed and tested by an interdisciplinary group of scientists (geologists, geophysicists and seismologists) in the frame of the International Union for Quaternary Research (INQUA) activities, to the widest community of earth scientists and engineers dealing with seismic hazard assessment. This scale defines earthquake intensity by taking into consideration the occurrence, size and areal distribution of earthquake environmental effects (EEE), including surface faulting, tectonic uplift and subsidence, landslides, rock falls, liquefaction, ground collapse and tsunami waves. Indeed, EEEs can significantly improve the evaluation of seismic intensity, which still remains a critical parameter for a realistic seismic hazard assessment, allowing comparison of historical and modern earthquakes. Moreover, as shown by recent moderate to large earthquakes, geological effects often cause severe damage; therefore, their consideration in the earthquake risk scenario is crucial for all stakeholders, especially urban planners, geotechnical and structural engineers, hazard analysts, civil protection agencies and insurance companies. The paper describes the background and construction principles of the scale and presents some case studies in different continents and tectonic settings to illustrate its relevant benefits. ESI is normally used together with traditional intensity scales, which, unfortunately, tend to saturate in the highest degrees. In such cases, and in unpopulated areas, ESI offers a unique way of assessing a reliable earthquake intensity. Finally, and importantly, the ESI scale also provides a very convenient guideline for the survey of EEEs in earthquake-stricken areas, ensuring they are catalogued in a complete and homogeneous manner.

  10. The TRIPOD e-learning Platform for the Training of Earthquake Safety Assessment

    NASA Astrophysics Data System (ADS)

    Coppari, S.; Di Pasquale, G.; Goretti, A.; Papa, F.; Papa, S.; Paoli, G.; Pizza, A. G.; Severino, M.

    2008-07-01

    The paper summarizes the results of the in-progress EU project titled TRIPOD (Training Civil Engineers on Post-Earthquake Safety Assessment of Damaged Buildings), funded under the Leonardo Da Vinci program. The main theme of the project is the development of a methodology and a learning platform for the training of technicians involved in post-earthquake building safety inspections. In the event of a catastrophic earthquake, emergency building inspections constitute a major undertaking with severe social impact. Given the inevitable chaotic conditions and the urgent need for a great number of specialized individuals to carry out inspections, past experience indicates that inspection teams are often formed in an ad hoc manner, under stressful conditions and at varying levels of technical expertise and experience, sometimes impairing the reliability and consistency of the inspection results. Furthermore, each country has its own building damage and safety assessment methodology, developed according to its experience, laws, building technology and seismicity. This holds also for the partners participating in the project (Greece, Italy, Turkey, Cyprus), who all come from seismically sensitive Mediterranean countries. The project aims at alleviating the above shortcomings by designing and developing a training methodology and e-platform, forming a complete training program targeted at inspection engineers, specialized personnel and civil protection agencies. The e-learning platform will provide flexible and friendly authoring mechanisms, self-teaching and assessment capabilities, course and trainee management, etc. Courses will also be made available as stand-alone multimedia applications on CD and in the form of a complete pocket handbook. Moreover, the project will offer the possibility of upgrading different experiences and practices: a first step towards the harmonization of methodologies and tools of different countries sharing similar problems. Finally, through wide

  11. Introducing survival ethics into engineering education and practice.

    PubMed

    Verharen, C; Tharakan, J; Middendorf, G; Castro-Sitiriche, M; Kadoda, G

    2013-06-01

    Given the possibilities of synthetic biology, weapons of mass destruction and global climate change, humans may achieve the capacity globally to alter life. This crisis calls for an ethics that furnishes effective motives to take global action necessary for survival. We propose a research program for understanding why ethical principles change across time and culture. We also propose provisional motives and methods for reaching global consensus on engineering field ethics. Current interdisciplinary research in ethics, psychology, neuroscience and evolutionary theory grounds these proposals. Experimental ethics, the application of scientific principles to ethical studies, provides a model for developing policies to advance solutions. A growing literature proposes evolutionary explanations for moral development. Connecting these approaches necessitates an experimental or scientific ethics that deliberately examines theories of morality for reliability. To illustrate how such an approach works, we cover three areas. The first section analyzes cross-cultural ethical systems in light of evolutionary theory. While such research is in its early stages, its assumptions entail consequences for engineering education. The second section discusses Howard University and University of Puerto Rico/Mayagüez (UPRM) courses that bring ethicists together with scientists and engineers to unite ethical theory and practice. We include a syllabus for engineering and STEM (Science, Technology, Engineering and Mathematics) ethics courses and a checklist model for translating educational theory and practice into community action. The model is based on aviation, medicine and engineering practice. The third and concluding section illustrates Howard University and UPRM efforts to translate engineering educational theory into community action. Multidisciplinary teams of engineering students and instructors take their expertise from the classroom to global communities to examine further the

  12. Earthquake Risk Mitigation in the Tokyo Metropolitan area

    NASA Astrophysics Data System (ADS)

    Hirata, N.; Sakai, S.; Kasahara, K.; Nakagawa, S.; Nanjo, K.; Panayotopoulos, Y.; Tsuruoka, H.

    2010-12-01

    Seismic disaster risk mitigation in urban areas constitutes a challenge requiring collaboration among the scientific, engineering, and social-science fields. Examples of collaborative efforts include research on detailed plate structure with identification of all significant faults, developing dense seismic networks; strong ground motion prediction, which uses information on near-surface seismic site effects and fault models; earthquake-resistant and earthquake-proof structures; and cross-discipline infrastructure for effective risk mitigation immediately after catastrophic events. A risk mitigation strategy for the next great earthquake caused by the Philippine Sea plate (PSP) subducting beneath the Tokyo metropolitan area is of major concern because this plate caused past mega-thrust earthquakes, such as the 1703 Genroku earthquake (magnitude M8.0) and the 1923 Kanto earthquake (M7.9), which had 105,000 fatalities. An M7 or greater (M7+) earthquake in this area at present has high potential to produce devastating loss of life and property, with even greater global economic repercussions. The Central Disaster Management Council of Japan estimates that such an M7+ earthquake would cause 11,000 fatalities and 112 trillion yen (about 1 trillion US$) in economic loss. The Earthquake Research Committee of Japan evaluates this earthquake as having a 70% probability of occurring within 30 years. In order to mitigate disaster for greater Tokyo, the Special Project for Earthquake Disaster Mitigation in the Tokyo Metropolitan Area (2007-2011) was launched in collaboration with scientists, engineers, and social scientists at nationwide institutions. The results obtained in the respective fields will be integrated by project termination to improve information for the strategy assessment for seismic risk mitigation in the Tokyo metropolitan area. In this talk, we give an outline of our project as an example of collaborative research on earthquake risk mitigation. Discussion is extended to our effort in progress and

  13. GPS Technologies as a Tool to Detect the Pre-Earthquake Signals Associated with Strong Earthquakes

    NASA Astrophysics Data System (ADS)

    Pulinets, S. A.; Krankowski, A.; Hernandez-Pajares, M.; Liu, J. Y. G.; Hattori, K.; Davidenko, D.; Ouzounov, D.

    2015-12-01

    The existence of ionospheric anomalies before earthquakes is now widely accepted. These phenomena have started to be considered by the GPS community as a way to mitigate GPS signal degradation over territories where an earthquake is in preparation. The question remains open whether they could be useful for seismology and for short-term earthquake forecasting. More than a decade of intensive studies has proved that ionospheric anomalies registered before earthquakes are initiated by processes in the atmospheric boundary layer over the earthquake preparation zone and are induced in the ionosphere by electromagnetic coupling through the Global Electric Circuit. A multiparameter approach based on the Lithosphere-Atmosphere-Ionosphere Coupling model demonstrated that earthquake forecasting is possible only if we consider the final stage of earthquake preparation in a multidimensional space where every dimension is one of many synergistically connected precursors in an ensemble. We demonstrate approaches developed in different countries (Russia, Taiwan, Japan, Spain, and Poland) within the framework of the ISSI and ESA projects to identify the ionospheric precursors. They are also useful for determining all three parameters necessary for an earthquake forecast: the impending earthquake's epicenter position, expectation time and magnitude. These parameters are calculated using different technologies of GPS signal processing: time series, correlation, spectral analysis, ionospheric tomography, wave propagation, etc. Results obtained by the different teams demonstrate a high level of statistical significance and physical justification, which gives us reason to suggest these methodologies for practical validation.

  14. Estimating Casualties for Large Earthquakes Worldwide Using an Empirical Approach

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David J.; Hearne, Mike

    2009-01-01

    We developed an empirical country- and region-specific earthquake vulnerability model to be used as a candidate for post-earthquake fatality estimation by the U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER) system. The earthquake fatality rate is based on past fatal earthquakes (earthquakes causing one or more deaths) in individual countries where at least four fatal earthquakes occurred during the catalog period (since 1973). Because only a few dozen countries have experienced four or more fatal earthquakes since 1973, we propose a new global regionalization scheme based on idealization of countries that are expected to have similar susceptibility to future earthquake losses given the existing building stock, its vulnerability, and other socioeconomic characteristics. The fatality estimates obtained using an empirical country- or region-specific model will be used along with other selected engineering risk-based loss models for generation of automated earthquake alerts. These alerts could potentially benefit the rapid-earthquake-response agencies and governments for better response to reduce earthquake fatalities. Fatality estimates are also useful to stimulate earthquake preparedness planning and disaster mitigation. The proposed model has several advantages as compared with other candidate methods, and the country- or region-specific fatality rates can be readily updated when new data become available.
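
    The rate-times-exposure idea behind such an empirical fatality model can be sketched in a few lines. This is a hypothetical illustration, not the PAGER implementation: a lognormal-CDF fatality-rate curve is a commonly used functional form for this kind of model, and the `theta`, `beta` and exposure values below are made up, not calibrated country parameters.

    ```python
    from math import erf, log, sqrt

    def lognormal_cdf(x, theta, beta):
        # Lognormal CDF: a common functional form for an empirical
        # fatality-rate curve; theta and beta would be country-specific.
        return 0.5 * (1.0 + erf(log(x / theta) / (beta * sqrt(2.0))))

    def expected_fatalities(exposure_by_mmi, theta, beta):
        # Sum (exposed population x fatality rate) over shaking-intensity bins.
        return sum(pop * lognormal_cdf(mmi, theta, beta)
                   for mmi, pop in exposure_by_mmi.items())

    # Illustrative (made-up) exposure by intensity bin and parameters:
    exposure = {6.0: 500_000, 7.0: 200_000, 8.0: 50_000}
    deaths = expected_fatalities(exposure, theta=12.0, beta=0.2)
    ```

    Updating the model for a country then amounts to refitting `theta` and `beta` against its catalog of fatal earthquakes, which is why the rates "can be readily updated when new data become available."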

  15. Practical Application of Sociology in Systems Engineering

    NASA Technical Reports Server (NTRS)

    Watson, Michael D.; Andrews, James G.; Eckley, Jeri Cassel; Culver, Michael L.

    2017-01-01

    Systems engineering involves both the integration of the system and the integration of the disciplines which develop and operate the system. Integrating the disciplines is a sociological effort to bring together different groups, who often have different terminology, to achieve a common goal: the system. The focus for the systems engineer is information flow through the organization, between the disciplines, to ensure the system is developed and operated with all relevant information informing system decisions. The practical application of sociology in systems engineering draws on various organizational development concepts, including the principle of planned renegotiation and the application of principles to address information barriers created by organizational culture. Concepts such as specification of ignorance, consistent terminology, opportunity structures, role-sets, and the reclama (reconsideration) process are all important sociological approaches that help address the organizational social structure (culture). In bringing the disciplines together, the systems engineer must also be wary of social ambivalence, social anomie, social dysfunction, and insider-outsider behavior. Unintended consequences can result when these social issues are present. These issues can occur when localized subcultures shift away from the overarching organizational culture, or when the organizational culture prevents achievement of system goals. These sociological principles give the systems engineer key approaches for managing information flow through the organization as the disciplines are integrated and share their information, and they identify key sociological barriers to that flow. This paper discusses the practical application of sociological principles to systems engineering.

  16. Scientific, Engineering, and Financial Factors of the 1989 Human-Triggered Newcastle Earthquake in Australia

    NASA Astrophysics Data System (ADS)

    Klose, C. D.

    2006-12-01

    This presentation emphasizes the dualism of natural resources exploitation and economic growth versus geomechanical pollution and the risk of human-triggered earthquakes. Large-scale geoengineering activities, e.g., mining, reservoir impoundment, oil/gas production, water exploitation or fluid injection, alter pre-existing lithostatic stress states in the earth's crust and are anticipated to trigger earthquakes. Such processes of in-situ stress alteration are termed geomechanical pollution. Moreover, since the 19th century more than 200 earthquakes with a seismic moment magnitude of 4.5 or greater have been documented worldwide, and the rate of documented human-triggered earthquakes has increased rapidly. An example of a human-triggered earthquake is the 1989 Newcastle event in Australia, the result of almost 200 years of coal mining and water over-exploitation. This earthquake, an Mw=5.6 event, caused more than 3.5 billion U.S. dollars in damage (1989 value) and was responsible for Australia's first and, to date, only earthquake fatalities. It is therefore argued that the Newcastle region has developed unsustainably when economic growth due to mining is compared with the financial losses from triggered earthquakes. A hazard assessment, based on a geomechanical crust model, shows that only four deep coal mines were responsible for triggering this severe earthquake. A small-scale economic risk assessment identifies that the financial loss due to earthquake damage has reduced mining profits that had been re-invested in the Newcastle region for over two centuries, beginning in 1801. Furthermore, a large-scale economic risk assessment reveals that the financial loss is equivalent to 26% of the Australian Gross Domestic Product (GDP) growth in 1988/89. These costs account for 13% of the total costs of all natural disasters (e.g., flooding, drought, wild fires) and 94% of the costs of all

  17. Prospective testing of Coulomb short-term earthquake forecasts

    NASA Astrophysics Data System (ADS)

    Jackson, D. D.; Kagan, Y. Y.; Schorlemmer, D.; Zechar, J. D.; Wang, Q.; Wong, K.

    2009-12-01

    Earthquake-induced Coulomb stresses, whether static or dynamic, suddenly change the probability of future earthquakes. Models to estimate stress and the resulting seismicity changes could help to illuminate earthquake physics and guide appropriate precautionary response. But do these models have improved forecasting power compared to empirical statistical models? The best answer lies in prospective testing, in which a fully specified model, with no subsequent parameter adjustments, is evaluated against future earthquakes. The Collaboratory for the Study of Earthquake Predictability (CSEP) facilitates such prospective testing of earthquake forecasts, including several short-term forecasts. Formulating Coulomb stress models for formal testing involves several practical problems, mostly shared with other short-term models. First, earthquake probabilities must be calculated after each “perpetrator” earthquake but before the triggered earthquakes, or “victims”. The time interval between a perpetrator and its victims may be very short, as characterized by the Omori law for aftershocks. CSEP evaluates short-term models daily, and allows daily updates of the models. However, much can happen in a day. An alternative is to test and update models on the occurrence of each earthquake over a certain magnitude. To make such updates rapidly enough, and to qualify as prospective, earthquake focal mechanisms, slip distributions, stress patterns, and earthquake probabilities would have to be computed without human intervention. This scheme would be more appropriate for evaluating scientific ideas, but it may be less useful for practical applications than daily updates. Second, triggered earthquakes are imperfectly recorded following larger events because their seismic waves are buried in the coda of the earlier event. To solve this problem, testing methods need to allow for “censoring” of early aftershock data, and a quantitative model for detection threshold as a function of
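
    The point that "much can happen in a day" can be made quantitative with the modified Omori law the abstract invokes. A minimal sketch follows; the parameter values `K`, `c`, `p` are arbitrary illustrations, not fitted values for any sequence.

    ```python
    def omori_rate(t, K=50.0, c=0.1, p=1.1):
        # Modified Omori (Omori-Utsu) aftershock rate: n(t) = K / (t + c)**p,
        # with t in days since the mainshock.
        return K / (t + c) ** p

    def expected_count(t0, t1, K=50.0, c=0.1, p=1.1, steps=10_000):
        # Midpoint-rule integral of the rate over [t0, t1]: the expected
        # number of aftershocks in that window.
        dt = (t1 - t0) / steps
        return sum(omori_rate(t0 + (i + 0.5) * dt, K, c, p) * dt
                   for i in range(steps))

    # Expected counts front-load heavily into the first day, which is why
    # a daily update cadence can miss much of a triggered sequence:
    day1 = expected_count(0.0, 1.0)
    week1 = expected_count(0.0, 7.0)
    ```

    With these illustrative parameters, the first day alone contributes more than half of the first week's expected aftershocks, so a forecast updated only once per day is stale for most of the triggered events it is trying to capture.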

  18. Does Curriculum Practical Training Affect Engineers' Workplace Outcomes? Evidence from an Engineer Survey in China

    ERIC Educational Resources Information Center

    Li, Jing; Zhang, Yu; Tsang, Mun; Li, Manli

    2015-01-01

    With the increasing attention to STEM (Science, Technology, Engineering, and Math), hands-on Curriculum Practical Training (CPT) has been expanding rapidly worldwide as a requirement of the undergraduate engineering education. In China, a typical CPT for undergraduate engineering students requires several weeks of hands-on training in the…

  19. Benefits of multidisciplinary collaboration for earthquake casualty estimation models: recent case studies

    NASA Astrophysics Data System (ADS)

    So, E.

    2010-12-01

    Earthquake casualty loss estimation, which depends primarily on building-specific casualty rates, has long suffered from a lack of cross-disciplinary collaboration in post-earthquake data gathering. Increasing our understanding of what contributes to casualties in earthquakes requires coordinated data-gathering efforts among disciplines; these are essential for improved global casualty estimation models. It is evident from examining past casualty loss models and reviewing field data collected from recent events that generalized casualty rates cannot be applied globally for different building types, even within individual countries. For a particular structure type, regional and topographic building design effects, combined with variable material and workmanship quality, all contribute to this multivariate outcome. In addition, social factors affect building-specific casualty rates, including social status, education levels, and human behaviors in general, in that they modify egress and survivability rates. Without considering these complex physical pathways, loss models based purely on historic casualty data, or worse, on rates derived from other countries, will be of very limited value. What’s more, as the world’s population, housing stock, and living and cultural environments change, methods of loss modeling must accommodate these variables, especially when considering casualties. To truly take advantage of observed earthquake losses, not only do damage surveys need better coordination of international and national reconnaissance teams, but these teams must integrate different areas of expertise, including engineering, public health and medicine. Research is needed to find methods to achieve consistent and practical ways of collecting and modeling casualties in earthquakes. International collaboration will also be necessary to transfer such expertise and resources to the communities that most need them. Coupling the theories and findings from

  20. Geoethics and decision science issues in Japan's disaster management system: case study in the 2011 Tohoku earthquake and tsunami

    NASA Astrophysics Data System (ADS)

    Sugimoto, Megumi

    2015-04-01

    The March 11, 2011 Tohoku earthquake and its tsunami killed 18,508 people, including the missing (National Police Agency report as of April 2014), and led to the Level 7 accident at TEPCO's Fukushima Dai-ichi nuclear power station in Japan. The problems revealed can be viewed as due to a combination of risk-management, risk-communication, and geoethics issues. Japan's preparations for earthquakes and tsunamis are based on the magnitude of the anticipated earthquake for each region. The government organization coordinating the estimation of anticipated earthquakes is the "Headquarters for Earthquake Research Promotion" (HERP), which is under the Ministry of Education, Culture, Sports, Science and Technology (MEXT). Japan's disaster mitigation system is depicted schematically as consisting of three layers: seismology, civil engineering, and disaster mitigation planning. This research argues that geoscience students should study geoethics as part of their education, drawing on the Tohoku earthquake and the Level 7 accident at TEPCO's Fukushima Dai-ichi nuclear power station. Once they become practicing professionals, they will be faced with real geoethical dilemmas. A crisis such as the 2011 earthquake, tsunami, and Fukushima Dai-ichi nuclear accident will force many geoscientists to suddenly confront previously unanticipated geoethics and risk-communication issues. One hopes that previous training will help them to make appropriate decisions under stress. We name this approach "decision science".

  1. Connecting slow earthquakes to huge earthquakes.

    PubMed

    Obara, Kazushige; Kato, Aitaro

    2016-07-15

    Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events, for which they serve as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes. Copyright © 2016, American Association for the Advancement of Science.

  2. Extending Engineering Practice Research with Shared Qualitative Data

    ERIC Educational Resources Information Center

    Trevelyan, James

    2016-01-01

    Research on engineering practice is scarce and sharing of qualitative research data can reduce the effort required for an aspiring researcher to obtain enough data from engineering workplaces to draw generalizable conclusions, both qualitative and quantitative. This paper describes how a large shareable qualitative data set on engineering…

  3. The California Earthquake Advisory Plan: A history

    USGS Publications Warehouse

    Roeloffs, Evelyn A.; Goltz, James D.

    2017-01-01

    Since 1985, the California Office of Emergency Services (Cal OES) has issued advisory statements to local jurisdictions and the public following seismic activity that scientists on the California Earthquake Prediction Evaluation Council view as indicating an elevated probability of a larger earthquake in the same area during the next several days. These advisory statements are motivated by statistical studies showing that about 5% of moderate earthquakes in California are followed by larger events within a 10-km, five-day space-time window (Jones, 1985; Agnew and Jones, 1991; Reasenberg and Jones, 1994). Cal OES issued four earthquake advisories from 1985 to 1989. In October 1990, the California Earthquake Advisory Plan formalized this practice, and six Cal OES Advisories have been issued since then. This article describes that protocol’s scientific basis and evolution.
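
    The 10-km, five-day space-time window from those statistical studies reduces to a simple predicate over event pairs. The sketch below is an illustration of that window test only, not Cal OES's actual advisory procedure; the `Quake` type and threshold parameters are hypothetical.

    ```python
    from dataclasses import dataclass
    from math import asin, cos, radians, sin, sqrt

    @dataclass
    class Quake:
        lat: float
        lon: float
        mag: float
        t_days: float  # origin time, in days on a common clock

    def km_between(a, b):
        # Haversine great-circle distance in kilometers.
        r = 6371.0
        dlat = radians(b.lat - a.lat)
        dlon = radians(b.lon - a.lon)
        h = (sin(dlat / 2) ** 2
             + cos(radians(a.lat)) * cos(radians(b.lat)) * sin(dlon / 2) ** 2)
        return 2 * r * asin(sqrt(h))

    def in_advisory_window(candidate, earlier, dist_km=10.0, days=5.0):
        # True if `candidate` falls inside the 10-km, five-day space-time
        # window opened by the `earlier` moderate earthquake.
        return (0.0 <= candidate.t_days - earlier.t_days <= days
                and km_between(candidate, earlier) <= dist_km)
    ```

    Screening a catalog with such a predicate is how the roughly 5% foreshock statistic cited above can be reproduced: count the moderate events whose window contains a larger event.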

  4. Defeating Earthquakes

    NASA Astrophysics Data System (ADS)

    Stein, R. S.

    2012-12-01

    The 2004 M=9.2 Sumatra earthquake claimed what seemed an unfathomable 228,000 lives, although because of its size, we could at least assure ourselves that it was an extremely rare event. But in the short space of 8 years, the Sumatra quake no longer looks like an anomaly, and it is no longer even the worst disaster of the century: 80,000 deaths in the 2005 M=7.6 Pakistan quake; 88,000 deaths in the 2008 M=7.9 Wenchuan, China quake; 316,000 deaths in the M=7.0 Haiti quake. In each case, poor design and construction were unable to withstand the ferocity of the shaken earth, and this was compounded by inadequate rescue, medical care, and shelter. How could the toll continue to mount despite the advances in our understanding of quake risk? The world's population is flowing into megacities, and many of these migration magnets lie astride the plate boundaries. Caught between these opposing demographic and seismic forces are 50 cities of at least 3 million people threatened by large earthquakes, the targets of chance. What we know for certain is that no one will take protective measures unless they are convinced they are at risk. Furnishing that knowledge is the animating principle of the Global Earthquake Model, launched in 2009. At the very least, everyone should be able to learn what his or her risk is. At the very least, our community owes the world an estimate of that risk. So, first and foremost, GEM seeks to raise quake risk awareness. We have no illusions that maps or models raise awareness; instead, earthquakes do. But when a quake strikes, people need a credible place to go to answer the questions: how vulnerable am I, and what can I do about it? The Global Earthquake Model is being built with GEM's new open-source engine, OpenQuake. 
GEM is also assembling the global data sets without which we will never improve our understanding of where, how large, and how frequently earthquakes will strike, what impacts they will have, and how those impacts can be lessened by

  5. The California Post-Earthquake Information Clearinghouse: A Plan to Learn From the Next Large California Earthquake

    NASA Astrophysics Data System (ADS)

    Loyd, R.; Walter, S.; Fenton, J.; Tubbesing, S.; Greene, M.

    2008-12-01

    In the rush to remove debris after a damaging earthquake, perishable data related to a wide range of impacts on the physical, built and social environments can be lost. The California Post-Earthquake Information Clearinghouse is intended to prevent this data loss by supporting the earth scientists, engineers, and social and policy researchers who will conduct fieldwork in the affected areas in the hours and days following the earthquake to study these effects. First called for by Governor Ronald Reagan following the destructive M6.5 San Fernando earthquake in 1971, the concept of the Clearinghouse has since been incorporated into the response plans of the National Earthquake Hazard Reduction Program (USGS Circular 1242). This presentation is intended to acquaint scientists with the purpose, functions, and services of the Clearinghouse. Typically, the Clearinghouse is set up in the vicinity of the earthquake within 24 hours of the mainshock and is maintained for several days to several weeks. It provides a location where field researchers can assemble to share and discuss their observations, plan and coordinate subsequent field work, and communicate significant findings directly to the emergency responders and to the public through press conferences. As the immediate response effort winds down, the Clearinghouse will ensure that collected data are archived and made available through "lessons learned" reports and publications that follow significant earthquakes. Participants in the quarterly meetings of the Clearinghouse include representatives from state and federal agencies, universities, NGOs and other private groups. Overall management of the Clearinghouse is delegated to the agencies represented by the authors above.

  6. Relative importance of professional practice and engineering management competencies

    NASA Astrophysics Data System (ADS)

    Pons, Dirk

    2016-09-01

    Problem: The professional practice of engineering always involves engineering management, but it is difficult to know what specifically to include in the undergraduate curriculum. Approach: The population of New Zealand practising engineers was surveyed to determine the importance they placed on specific professional practice and engineering management competencies. Findings: Results show that communication and project planning were the two most important topics, followed by others as identified. The context in which practitioners use communication skills was found to be primarily with project management, with secondary contexts identified. The necessity for engineers to develop the ability to use multiple soft skills in an integrative manner is strongly supported by the data. Originality: This paper is one of only a few large-scale surveys of practising engineers to have explored the soft skill attributes. It makes a didactic contribution of providing a ranked list of topics which can be used for designing the curriculum and prioritising teaching effort, which has not previously been achieved. It yields the new insight that combinations of topics are sometimes more important than individual topics.

  7. Constructing engineers through practice: Gendered features of learning and identity development

    NASA Astrophysics Data System (ADS)

    Tonso, Karen L.

    How do women and men student engineers develop an engineering identity (a sense of belonging, or not), while practicing "actual" engineering? What are the influences of gender, learning and knowledge, relations of power, and conceptions of equality on cultural identity development? I studied these issues in reform-minded engineering design classes, courses organized around teaching students communications, teamwork, and practical engineering. Engineering-student cultural identity categories revealed a status hierarchy, predicated on meeting "academic" criteria for excellence, and the almost total exclusion of women. While working as an engineering colleague on five student teams (three first-year and two senior) and attending their design classes, I documented how cultural identities were made evident and constructed in students' practical engineering. Design projects promoted linking academic knowledge with real-world situations, sharing responsibilities and trusting colleagues, communicating engineering knowledge to technical and non-technical members of business communities, and addressing gaps in students' knowledge. With a curriculum analysis and survey of students' perceptions of the differences between design and conventional courses, I embedded the design classes in the wider campus and found that: (1) Engineering education conferred prestige, power, and well-paying jobs on students who performed "academic" engineering, while failing to adequately encourage "actual" engineering practices. High-status student engineers were the least likely to perform "actual" engineering in design teams. (2) Engineering education advanced an ideology that encouraged its practitioners to consider men's privilege and women's invisibility normal. By making "acting like men act" the standards to which engineering students must conform, women learned to put up with oppressive treatment. Women's accepting their own mistreatment and hiding their womanhood became a condition of

  8. The Global Earthquake Model and Disaster Risk Reduction

    NASA Astrophysics Data System (ADS)

    Smolka, A. J.

    2015-12-01

    Advanced, reliable and transparent tools and data to assess earthquake risk are inaccessible to most, especially in less developed regions of the world, while few, if any, globally accepted standards currently allow a meaningful comparison of risk between places. The Global Earthquake Model (GEM) is a collaborative effort that aims to provide models, datasets and state-of-the-art tools for transparent assessment of earthquake hazard and risk. As part of this goal, GEM and its global network of collaborators have developed the OpenQuake engine (an open-source software for hazard and risk calculations), the OpenQuake platform (a web-based portal making GEM's resources and datasets freely available to all potential users), and a suite of tools to support modelers and other experts in the development of hazard, exposure and vulnerability models. These resources are being used extensively across the world in hazard and risk assessment, from individual practitioners to local and national institutions, and in regional projects to inform disaster risk reduction. Practical examples of how GEM is bridging the gap between science and disaster risk reduction are: - Several countries including Switzerland, Turkey, Italy, Ecuador, Papua New Guinea and Taiwan (with more to follow) are computing national seismic hazard using the OpenQuake engine. In some cases these results are used for the definition of actions in building codes. - Technical support, tools and data for the development of hazard, exposure, vulnerability and risk models for regional projects in South America and Sub-Saharan Africa. - Going beyond physical risk, GEM's scorecard approach evaluates local resilience by bringing together neighborhood/community leaders and the risk reduction community as a basis for designing risk reduction programs at various levels of geography. Actual case studies are Lalitpur in the Kathmandu Valley in Nepal and Quito, Ecuador. In agreement with GEM's collaborative approach, all

  9. Practical internal combustion engine laser spark plug development

    NASA Astrophysics Data System (ADS)

    Myers, Michael J.; Myers, John D.; Guo, Baoping; Yang, Chengxin; Hardy, Christopher R.

    2007-09-01

    Fundamental studies on laser ignition have been performed by the US Department of Energy under ARES (Advanced Reciprocating Engines Systems) and by the California Energy Commission under ARICE (Advanced Reciprocating Internal Combustion Engine). These and other works have reported considerable increases in fuel efficiency along with substantial reductions in greenhouse gas emissions when employing laser spark ignition. Practical commercial applications of this technology require low-cost, high-peak-power lasers. The lasers must be small, rugged and able to provide stable laser beam output under adverse mechanical and environmental conditions. New DPSS (Diode Pumped Solid State) lasers appear to meet these requirements. In this work we provide an evaluation of a HESP (High Efficiency Side Pumped) DPSS laser design and its performance with regard to its application as a practical laser spark plug for use in internal combustion engines.

  10. Stem cell applications and tissue engineering approaches in surgical practice.

    PubMed

    Khan, Wasim S; Malik, Atif A; Hardingham, Timothy E

    2009-04-01

    There has been an increasing interest in stem cell applications and tissue engineering approaches in surgical practice to deal with damaged or lost tissue. Although there have been developments in almost all surgical disciplines, the greatest advances are being made in orthopaedics, especially in bone repair. Significant hurdles however remain to be overcome before tissue engineering becomes more routinely used in surgical practice.

  11. Earthquakes trigger the loss of groundwater biodiversity

    NASA Astrophysics Data System (ADS)

    Galassi, Diana M. P.; Lombardo, Paola; Fiasca, Barbara; di Cioccio, Alessia; di Lorenzo, Tiziana; Petitta, Marco; di Carlo, Piero

    2014-09-01

    Earthquakes are among the most destructive natural events. The 6 April 2009, 6.3-Mw earthquake in L'Aquila (Italy) markedly altered the karstic Gran Sasso Aquifer (GSA) hydrogeology and geochemistry. The GSA groundwater invertebrate community is mainly comprised of small-bodied, colourless, blind microcrustaceans. We compared abiotic and biotic data from two pre-earthquake and one post-earthquake complete but non-contiguous hydrological years to investigate the effects of the 2009 earthquake on the dominant copepod component of the obligate groundwater fauna. Our results suggest that the massive earthquake-induced aquifer strain biotriggered a flushing of groundwater fauna, with a dramatic decrease in subterranean species abundance. Population turnover rates appeared to have crashed, no longer replenishing the long-standing communities from aquifer fractures, and the aquifer became almost totally deprived of animal life. Groundwater communities are notorious for their low resilience. Therefore, any major disturbance that negatively impacts survival or reproduction may lead to local extinction of species, most of them being the only survivors of phylogenetic lineages extinct at the Earth surface. Given the ecological key role played by the subterranean fauna as decomposers of organic matter and ``ecosystem engineers'', we urge more detailed, long-term studies on the effect of major disturbances to groundwater ecosystems.

  12. Earthquakes trigger the loss of groundwater biodiversity.

    PubMed

    Galassi, Diana M P; Lombardo, Paola; Fiasca, Barbara; Di Cioccio, Alessia; Di Lorenzo, Tiziana; Petitta, Marco; Di Carlo, Piero

    2014-09-03

    Earthquakes are among the most destructive natural events. The 6 April 2009, 6.3-Mw earthquake in L'Aquila (Italy) markedly altered the karstic Gran Sasso Aquifer (GSA) hydrogeology and geochemistry. The GSA groundwater invertebrate community is mainly comprised of small-bodied, colourless, blind microcrustaceans. We compared abiotic and biotic data from two pre-earthquake and one post-earthquake complete but non-contiguous hydrological years to investigate the effects of the 2009 earthquake on the dominant copepod component of the obligate groundwater fauna. Our results suggest that the massive earthquake-induced aquifer strain biotriggered a flushing of groundwater fauna, with a dramatic decrease in subterranean species abundance. Population turnover rates appeared to have crashed, no longer replenishing the long-standing communities from aquifer fractures, and the aquifer became almost totally deprived of animal life. Groundwater communities are notorious for their low resilience. Therefore, any major disturbance that negatively impacts survival or reproduction may lead to local extinction of species, most of them being the only survivors of phylogenetic lineages extinct at the Earth surface. Given the ecological key role played by the subterranean fauna as decomposers of organic matter and "ecosystem engineers", we urge more detailed, long-term studies on the effect of major disturbances to groundwater ecosystems.

  13. Automobile Engine Control Parameters Study : Volume 1. Summary and Status of Domestic Engine Control Practice

    DOT National Transportation Integrated Search

    1977-02-01

    This report contains the results of a study to evaluate automobile engine control parameters and their effects on vehicle fuel economy and emissions. Volume I presents detailed technical information on the engine control practices used by selected do...

  14. The Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology (SCEC/UseIT) Internship Program

    NASA Astrophysics Data System (ADS)

    Perry, S.; Jordan, T.

    2006-12-01

    Our undergraduate research program, SCEC/UseIT, an NSF Research Experience for Undergraduates site, provides software for earthquake researchers and educators, movies for outreach, and ways to strengthen the technical career pipeline. SCEC/UseIT motivates diverse undergraduates towards science and engineering careers through team-based research in the exciting field of earthquake information technology. UseIT provides the cross-training in computer science/information technology (CS/IT) and geoscience needed to make fundamental progress in earthquake system science. Our high and increasing participation of women and minority students is crucial given the nation's precipitous enrollment declines in CS/IT undergraduate degree programs, especially among women. UseIT also casts a "wider, farther" recruitment net that targets scholars interested in creative work but not traditionally attracted to summer science internships. Since 2002, SCEC/UseIT has challenged 79 students in three dozen majors from as many schools with difficult, real-world problems that require collaborative, interdisciplinary solutions. Interns design and engineer open-source software, creating increasingly sophisticated visualization tools (see "SCEC-VDO," session IN11), which are employed by SCEC researchers, in new curricula at the University of Southern California, and by outreach specialists who make animated movies for the public and the media. SCEC-VDO would be a valuable tool for research-oriented professional development programs.

  15. Road Damage Following Earthquake

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Ground shaking triggered liquefaction in a subsurface layer of water-saturated sand, producing differential lateral and vertical movement in an overlying carapace of unliquefied sand and silt, which moved from right to left towards the Pajaro River. This mode of ground failure, termed lateral spreading, is a principal cause of liquefaction-related earthquake damage caused by the Oct. 17, 1989, Loma Prieta earthquake. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions such as earthquakes or when powders are handled in industrial processes. Mechanics of Granular Materials (MGM) experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grain materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. Credit: S.D. Ellen, U.S. Geological Survey

  16. The HayWired Earthquake Scenario—Earthquake Hazards

    USGS Publications Warehouse

    Detweiler, Shane T.; Wein, Anne M.

    2017-04-24

    The HayWired scenario is a hypothetical earthquake sequence that is being used to better understand hazards for the San Francisco Bay region during and after an earthquake of magnitude 7 on the Hayward Fault. The 2014 Working Group on California Earthquake Probabilities calculated that there is a 33-percent likelihood of a large (magnitude 6.7 or greater) earthquake occurring on the Hayward Fault within three decades. A large Hayward Fault earthquake will produce strong ground shaking, permanent displacement of the Earth’s surface, landslides, liquefaction (soils becoming liquid-like during shaking), and subsequent fault slip, known as afterslip, and earthquakes, known as aftershocks. The most recent large earthquake on the Hayward Fault occurred on October 21, 1868, and it ruptured the southern part of the fault. The 1868 magnitude-6.8 earthquake occurred when the San Francisco Bay region had far fewer people, buildings, and infrastructure (roads, communication lines, and utilities) than it does today, yet the strong ground shaking from the earthquake still caused significant building damage and loss of life. The next large Hayward Fault earthquake is anticipated to affect thousands of structures and disrupt the lives of millions of people. Earthquake risk in the San Francisco Bay region has been greatly reduced as a result of previous concerted efforts; for example, tens of billions of dollars of investment in strengthening infrastructure was motivated in large part by the 1989 magnitude 6.9 Loma Prieta earthquake. To build on efforts to reduce earthquake risk in the San Francisco Bay region, the HayWired earthquake scenario comprehensively examines the earthquake hazards to help provide the crucial scientific information that the San Francisco Bay region can use to prepare for the next large earthquake. The HayWired Earthquake Scenario—Earthquake Hazards volume describes the strong ground shaking modeled in the scenario and the hazardous movements of
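    The 33-percent-in-three-decades probability quoted above can be converted to an equivalent annual rate under the usual Poisson assumption; a quick arithmetic sketch, not part of the Working Group's actual methodology:

```python
import math

def annual_rate_from_prob(prob: float, years: float) -> float:
    """Equivalent Poisson annual rate implied by a probability over `years`."""
    return -math.log(1.0 - prob) / years

rate = annual_rate_from_prob(0.33, 30.0)
print(f"annual rate: {rate:.4f}")               # events per year
print(f"mean recurrence: {1.0 / rate:.0f} yr")  # implied recurrence interval
```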

  17. Towards the Future "Earthquake" School in the Cloud: Near-real Time Earthquake Games Competition in Taiwan

    NASA Astrophysics Data System (ADS)

    Chen, K. H.; Liang, W. T.; Wu, Y. F.; Yen, E.

    2014-12-01

    To mitigate future threats from natural disasters, it is important to understand how past disasters happened, why lives were lost, and what lessons have been learned. In this way, society's attitude toward natural disasters can be transformed from training to learning. The citizen-seismologists-in-Taiwan project is designed to elevate the quality of earthquake science education by incorporating earthquake/tsunami stories and a near-real time earthquake games competition into the traditional school curricula. Through pilot courses and professional development workshops, we have worked closely with teachers from elementary, junior high, and senior high schools to design workable teaching plans built around the practical operation of seismic monitoring at home or school. We will introduce how 9-year-olds pick P- and S-wave arrivals and measure seismic intensity through an interactive learning platform, how scientists and school teachers work together, and how we create an environment that facilitates continuous learning (i.e., the near-real time earthquake games competition) to make earthquake science fun.

  18. Best management practices for soft engineering of shoreline

    USGS Publications Warehouse

    Caulk, Andrew D.; Gannon, John E.; Shaw, John R.; Hartig, John H.

    2000-01-01

    Historically, many river shorelines were stabilized and hardened with concrete and steel to protect developments from flooding and erosion, or to accommodate commercial navigation or industry. Typically shorelines were developed for a single purpose. Today, there is growing interest in developing shorelines for multiple purposes so that additional benefits can be accrued. Soft engineering is the use of ecological principles and practices to reduce erosion and achieve the stabilization and safety of shorelines, while enhancing habitat, improving aesthetics, and saving money. The purpose of this best management practices manual is to provide insights and technical advice to local governments, developers, planners, consultants, and industries on when, where, why, and how to incorporate soft engineering of shorelines into shoreline redevelopment projects and reap subsequent benefits. More specific technical advice and contact information can be found in the soft engineering case studies presented in this manual.

  19. Ten Engineers Reading: Disjunctions between Preference and Practice in Civil Engineering Faculty Responses

    ERIC Educational Resources Information Center

    Taylor, Summer Smith; Patton, Martha D.

    2006-01-01

    Previous research has indicated that engineering faculty do not follow best practices when commenting on students' technical writing. However, it is unclear whether the faculty prefer to comment in these ineffective ways, or whether they prefer more effective practices but simply do not enact them. This study adapts a well known study of response…

  20. Earthquake Preparedness: What Every Childcare Provider Should Know.

    ERIC Educational Resources Information Center

    California State Office of Emergency Services, Sacramento.

    This brochure provides information to help child care providers reduce or avoid damage, injuries, or loss of life during earthquakes. It first discusses steps to implement before an earthquake strikes, including securing household contents, and practicing with children how to duck and cover. Next, the brochure describes what to do during an…

  1. Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology (SCEC/UseIT): Towards the Next Generation of Internship

    NASA Astrophysics Data System (ADS)

    Perry, S.; Benthien, M.; Jordan, T. H.

    2005-12-01

    The SCEC/UseIT internship program is training the next generation of earthquake scientists, with methods that can be adapted to other disciplines. UseIT interns work collaboratively, in multi-disciplinary teams, conducting computer science research that is needed by earthquake scientists. Since 2002, the UseIT program has welcomed 64 students, in some two dozen majors, at all class levels, from schools around the nation. Each summer's work is posed as a "Grand Challenge." The students then organize themselves into project teams, decide how to proceed, and pool their diverse talents and backgrounds. They have traditional mentors, who provide advice and encouragement, but they also mentor one another, and this has proved to be a powerful relationship. Most begin with fear that their Grand Challenge is impossible, and end with excitement and pride about what they have accomplished. The 22 UseIT interns in summer 2005 were primarily computer science and engineering majors, with others in geology, mathematics, English, digital media design, physics, history, and cinema. The 2005 Grand Challenge was to "build an earthquake monitoring system" to aid scientists who must visualize rapidly evolving earthquake sequences and convey information to emergency personnel and the public. Most UseIT interns were engaged in software engineering, bringing new datasets and functionality to SCEC-VDO (Virtual Display of Objects), a 3D visualization software that was prototyped by interns the previous year, using Java3D and an extensible, plug-in architecture based on the Eclipse Integrated Development Environment. Other UseIT interns used SCEC-VDO to make animated movies, and experimented with imagery in order to communicate concepts and events in earthquake science. One movie-making project included the creation of an assessment to test the effectiveness of the movie's educational message. Finally, one intern created an interactive, multimedia presentation of the UseIT program.

  2. The TRIPOD e-learning Platform for the Training of Earthquake Safety Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coppari, S.; Di Pasquale, G.; Goretti, A.

    2008-07-08

    The paper summarizes the results of the in-progress EU project TRIPOD (Training Civil Engineers on Post-Earthquake Safety Assessment of Damaged Buildings), funded under the Leonardo da Vinci program. The main theme of the project is the development of a methodology and a learning platform for the training of technicians involved in post-earthquake building safety inspections. In the event of a catastrophic earthquake, emergency building inspections constitute a major undertaking with severe social impact. Given the inevitable chaotic conditions and the urgent need for a great number of specialized individuals to carry out inspections, past experience indicates that inspection teams are often formed in an ad hoc manner, under stressful conditions, at varying levels of technical expertise and experience, sometimes impairing the reliability and consistency of the inspection results. Furthermore, each country has its own building damage and safety assessment methodology, developed according to its experience, laws, building technology and seismicity. This also holds for the partners participating in the project (Greece, Italy, Turkey, Cyprus), which all come from seismically sensitive Mediterranean countries. The project aims at alleviating the above shortcomings by designing and developing a training methodology and e-platform, forming a complete training program targeted at inspection engineers, specialized personnel and civil protection agencies. The e-learning platform will provide flexible and friendly authoring mechanisms, self-teaching and assessment capabilities, course and trainee management, etc. Courses will also be made available as stand-alone multimedia applications on CD and in the form of a complete pocket handbook. Moreover, the project will offer the possibility of sharing different experiences and practices: a first step towards the harmonization of methodologies and tools of different countries sharing similar problems.
Finally, through

  3. 33 CFR 222.4 - Reporting earthquake effects.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., DEPARTMENT OF DEFENSE ENGINEERING AND DESIGN § 222.4 Reporting earthquake effects. (a) Purpose. This... structural integrity and operational adequacy of major Civil Works structures following the occurrence of...) Applicability. This regulation is applicable to all field operating agencies having Civil Works responsibilities...

  4. 2001 Bhuj, India, earthquake engineering seismoscope recordings and Eastern North America ground-motion attenuation relations

    USGS Publications Warehouse

    Cramer, C.H.; Kumar, A.

    2003-01-01

    Engineering seismoscope data collected at distances less than 300 km for the M 7.7 Bhuj, India, mainshock are compatible with ground-motion attenuation in eastern North America (ENA). The mainshock ground-motion data have been corrected to a common geological site condition using the factors of Joyner and Boore (2000) and a classification scheme of Quaternary or Tertiary sediments or rock. We then compare these data to ENA ground-motion attenuation relations. Despite uncertainties in recording method, geological site corrections, common tectonic setting, and the amount of regional seismic attenuation, the corrected Bhuj dataset agrees with the collective predictions by ENA ground-motion attenuation relations within a factor of 2. This level of agreement is within the dataset uncertainties and the normal variance for recorded earthquake ground motions.
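    The "within a factor of 2" agreement above is conventionally judged multiplicatively (i.e., symmetrically in log space); a small sketch with hypothetical observed/predicted pairs, not the actual Bhuj data:

```python
def agreement_factor(observed: float, predicted: float) -> float:
    """Multiplicative misfit between observed and predicted ground motion
    (always >= 1.0); a value <= 2.0 means agreement within a factor of 2,
    whether the model over- or under-predicts."""
    ratio = observed / predicted
    return max(ratio, 1.0 / ratio)

# Hypothetical (observed, predicted) PGA pairs in g at three stations
pairs = [(0.12, 0.08), (0.05, 0.09), (0.30, 0.22)]
print(all(agreement_factor(o, p) <= 2.0 for o, p in pairs))  # -> True
```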

  5. Earthquakes

    MedlinePlus

    An earthquake is the sudden, rapid shaking of the earth, caused by the breaking and shifting of underground rock. Earthquakes can cause buildings to collapse and cause heavy ...

  6. Development of an Earthquake Impact Scale

    NASA Astrophysics Data System (ADS)

    Wald, D. J.; Marano, K. D.; Jaiswal, K. S.

    2009-12-01

    With the advent of the USGS Prompt Assessment of Global Earthquakes for Response (PAGER) system, domestic (U.S.) and international earthquake responders are reconsidering their automatic alert and activation levels as well as their response procedures. To help facilitate rapid and proportionate earthquake response, we propose and describe an Earthquake Impact Scale (EIS) founded on two alerting criteria. One, based on the estimated cost of damage, is most suitable for domestic events; the other, based on estimated ranges of fatalities, is more appropriate for most global events. Simple thresholds, derived from the systematic analysis of past earthquake impact and response levels, turn out to be quite effective in communicating predicted impact and response level of an event, characterized by alerts of green (little or no impact), yellow (regional impact and response), orange (national-scale impact and response), and red (major disaster, necessitating international response). Corresponding fatality thresholds for yellow, orange, and red alert levels are 1, 100, and 1000, respectively. For damage impact, yellow, orange, and red thresholds are triggered by estimated losses exceeding $1M, $10M, and $1B, respectively. The rationale for a dual approach to earthquake alerting stems from the recognition that relatively high fatalities, injuries, and homelessness dominate in countries where vernacular building practices typically lend themselves to high collapse and casualty rates, and it is these impacts that set prioritization for international response. In contrast, it is often financial and overall societal impacts that trigger the level of response in regions or countries where prevalent earthquake resistant construction practices greatly reduce building collapse and associated fatalities. Any newly devised alert protocols, whether financial or casualty based, must be intuitive and consistent with established lexicons and procedures. 
In this analysis, we make an attempt
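    The dual thresholds described above are simple enough to capture directly. The threshold values below are taken from the abstract; the function itself is only an illustrative sketch (an event takes the higher of its two alert levels):

```python
def impact_alert(fatalities: int = 0, losses_usd: float = 0.0) -> str:
    """Map estimated fatalities and economic losses to an EIS-style alert
    colour, using the abstract's thresholds: 1/100/1000 fatalities and
    $1M/$10M/$1B losses for yellow/orange/red."""
    def level(value, thresholds):
        return sum(value >= t for t in thresholds)  # 0 (green) .. 3 (red)
    worst = max(level(fatalities, (1, 100, 1000)),
                level(losses_usd, (1e6, 1e7, 1e9)))
    return ("green", "yellow", "orange", "red")[worst]

print(impact_alert(losses_usd=5e6))   # -> yellow (financial impact dominates)
print(impact_alert(fatalities=250))   # -> orange (casualty impact dominates)
```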

  7. A new reference global instrumental earthquake catalogue (1900-2009)

    NASA Astrophysics Data System (ADS)

    Di Giacomo, D.; Engdahl, B.; Bondar, I.; Storchak, D. A.; Villasenor, A.; Bormann, P.; Lee, W.; Dando, B.; Harris, J.

    2011-12-01

    For seismic hazard studies on a global and/or regional scale, accurate knowledge of the spatial distribution of seismicity, the magnitude-frequency relation and the maximum magnitudes is of fundamental importance. However, such information is normally not homogeneous (or not available) for the various seismically active regions of the Earth. To achieve the GEM objectives (www.globalquakemodel.org) of calculating and communicating earthquake risk worldwide, an improved reference global instrumental catalogue for large earthquakes spanning the entire 100+ year period of instrumental seismology is an absolute necessity. To accomplish this task, we apply the most up-to-date techniques and standard observatory practices for computing the earthquake location and magnitude. In particular, the re-location procedure benefits both from depth determination according to Engdahl and Villaseñor (2002) and from the advanced technique recently implemented at the ISC (Bondár and Storchak, 2011) to account for correlated error structure. With regard to magnitude, starting from the re-located hypocenters, the classical surface and body-wave magnitudes are determined following the new IASPEI standards and by using amplitude-period data of phases collected from historical station bulletins (up to 1970), which were not available in digital format before the beginning of this work. Finally, the catalogue will provide moment magnitude values (including uncertainty) for each seismic event via seismic moment, via surface wave magnitude or via other magnitude types using empirical relationships. References Engdahl, E.R., and A. Villaseñor (2002). Global seismicity: 1900-1999. In: International Handbook of Earthquake and Engineering Seismology, eds. W.H.K. Lee, H. Kanamori, J.C. Jennings, and C. Kisslinger, Part A, 665-690, Academic Press, San Diego. Bondár, I., and D. Storchak (2011). Improved location procedures at the International Seismological Centre, Geophys. J. Int., doi:10.1111/j
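    The final step above (proxy moment magnitudes via empirical relationships) typically uses regressions of Mw on Ms or mb. The sketch below uses the widely cited Scordilis (2006) global Ms-to-Mw regressions; the coefficients should be treated as illustrative, not as the catalogue's actual relations:

```python
def mw_from_ms(ms: float) -> float:
    """Proxy moment magnitude from surface-wave magnitude, using the
    Scordilis (2006) global regressions (illustrative coefficients)."""
    if 3.0 <= ms <= 6.1:
        return 0.67 * ms + 2.07
    if 6.2 <= ms <= 8.2:
        return 0.99 * ms + 0.08
    raise ValueError("Ms outside the calibrated range")

print(round(mw_from_ms(5.0), 2))  # small-event branch
print(round(mw_from_ms(7.5), 2))  # large-event branch
```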

  8. Scientific and non-scientific challenges for Operational Earthquake Forecasting

    NASA Astrophysics Data System (ADS)

    Marzocchi, W.

    2015-12-01

    Tracking the time evolution of seismic hazard in time windows shorter than the usual 50 years of long-term hazard models may offer additional opportunities to reduce seismic risk. This is the target of operational earthquake forecasting (OEF). During the development of OEF in Italy we identified several challenges that range from pure science to the more practical interface of science with society. From a scientific point of view, although earthquake clustering is the clearest empirical evidence about earthquake occurrence, and OEF clustering models are the most (successfully) tested hazard models in seismology, we note that some seismologists are still reluctant to accept their scientific reliability. After exploring the motivations behind these scientific doubts, we also look into an issue that is often overlooked in this discussion: in any kind of hazard analysis, we do not use a model because it is the true one, but because it is better than anything else we can think of. The non-scientific aspects are mostly related to the fact that OEF usually provides weekly probabilities of large earthquakes smaller than 1%. These probabilities are considered by some seismologists too small to be of interest or use. However, in a recent collaboration with engineers we show that such earthquake probabilities may lead to intolerable individual risk of death. Interestingly, this debate calls for a better definition of the still fuzzy boundaries among the different kinds of expertise required for the whole risk mitigation process. The last and probably most pressing challenge is related to communication with the public. In fact, a wrong message can be useless or even counterproductive. Here we show some progress that we have made in this field working with communication experts in Italy.
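    The point that sub-1% weekly probabilities can still imply non-trivial individual risk is easy to illustrate with back-of-the-envelope numbers (both probabilities below are hypothetical, not values from the Italian OEF system):

```python
p_event_week = 0.005        # weekly probability of a large earthquake (hypothetical)
p_death_given_event = 1e-4  # individual probability of death given the event (hypothetical)

p_death_week = p_event_week * p_death_given_event
p_death_year = p_death_week * 52  # crude annualization for comparison

# Regulatory benchmarks often place tolerable annual individual risk of
# death around 1e-6 to 1e-4, so even "small" event probabilities can matter.
print(f"weekly individual risk: {p_death_week:.1e}")
print(f"annualized:             {p_death_year:.1e}")
```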

  9. Housing Damage Following Earthquake

    NASA Technical Reports Server (NTRS)

    1989-01-01

    An automobile lies crushed under the third story of this apartment building in the Marina District after the Oct. 17, 1989, Loma Prieta earthquake. The ground levels are no longer visible because of structural failure and sinking due to liquefaction. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions such as earthquakes or when powders are handled in industrial processes. Mechanics of Granular Materials (MGM) experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grain materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. Credit: J.K. Nakata, U.S. Geological Survey.

  10. The shear-lag effect of thin-walled box girder under vertical earthquake excitation

    NASA Astrophysics Data System (ADS)

    Zhai, Zhipeng; Li, Yaozhuang; Guo, Wei

    2017-03-01

    The variation method based on the energy variation principle has proved accurate and valid for analyzing the shear lag effect of box girders under static and dynamic load, while dynamic problems have gradually become key factors in engineering practice. Therefore, a method for calculating the shear lag effect in thin-walled box girders under vertical seismic excitation is proposed in this paper by applying Hamilton's principle. Timoshenko shear deformation is taken into account, and a new definition of the shear lag ratio for box girders is given. Conclusions are then drawn from a numerical example. The results show that even small-amplitude earthquake ground motion can generate high stress and pronounced shear lag, especially in the region of resonance, and that the influence of rotary inertia cannot be ignored when analyzing the shear lag effect. As the span-to-width ratio increases, the shear lag effect becomes smaller and smaller. These conclusions will be useful for engineering practice and enrich the theoretical study of box girders.

  11. Practical Education Support to Foster Engineers at Manufacturing and Engineering Design Center in Muroran Institute of Technology

    NASA Astrophysics Data System (ADS)

    Kazama, Toshiharu; Hanajima, Naohiko; Shimizu, Kazumichi; Satoh, Kohki

    To foster engineers with creative power, Muroran Institute of Technology established the Manufacturing and Engineering Design Center (MEDeC), which concentrates on Monozukuri. MEDeC consists of three project groups: i) the Education Support Group provides educational support for practical training classes on and off campus and PDCA (plan-do-check-action)-conscious engineering design education related to Monozukuri; ii) the Fundamental Manufacturing Research Group carries out research into fundamental and innovative machining and manufacturing technology; and iii) the Regional Cooperation Group coordinates activities in cooperation with government bureaus, schools and industries in and around Muroran City. MEDeC has a fully integrated collection of machine tools and hand tools for manufacturing, an atelier, a tatara workplace, and implements for measurement and related equipment designed for practical teaching of state-of-the-practice manufacturing methods.

  12. 8 March 2010 Elazığ-Kovancilar (Turkey) Earthquake: observations on ground motions and building damage

    USGS Publications Warehouse

    Akkar, Sinan; Aldemir, A.; Askan, A.; Bakir, S.; Canbay, E.; Demirel, I.O.; Erberik, M.A.; Gulerce, Z.; Gulkan, Polat; Kalkan, Erol; Prakash, S.; Sandikkaya, M.A.; Sevilgen, V.; Ugurhan, B.; Yenier, E.

    2011-01-01

    An earthquake of MW = 6.1 occurred in the Elazığ region of eastern Turkey on 8 March 2010 at 02:32:34 UTC. The United States Geological Survey (USGS) reported the epicenter of the earthquake as 38.873°N-39.981°E with a focal depth of 12 km. Forty-two people lost their lives and 137 were injured during the event. The earthquake was reported to be on the left-lateral strike-slip east Anatolian fault (EAF), which is one of the two major active fault systems in Turkey. Teams from the Earthquake Engineering Research Center of the Middle East Technical University (EERC-METU) visited the earthquake area in the aftermath of the mainshock. Their reconnaissance observations were combined with interpretations of recorded ground motions for completeness. This article summarizes observations on building and ground damage in the area and provides a discussion of the recorded motions. No significant observations in terms of geotechnical engineering were made.

  13. Earthquakes.

    ERIC Educational Resources Information Center

    Walter, Edward J.

    1977-01-01

    Presents an analysis of the causes of earthquakes. Topics discussed include (1) geological and seismological factors that determine the effect of a particular earthquake on a given structure; (2) description of some large earthquakes such as the San Francisco quake; and (3) prediction of earthquakes. (HM)

  14. Reconnaissance engineering geology of the Metlakatla area, Annette Island, Alaska, with emphasis on evaluation of earthquakes and other geologic hazards

    USGS Publications Warehouse

    Yehle, Lynn A.

    1977-01-01

    A program to study the engineering geology of most larger Alaska coastal communities and to evaluate their earthquake and other geologic hazards was started following the 1964 Alaska earthquake; this report about the Metlakatla area, Annette Island, is a product of that program. Field-study methods were of a reconnaissance nature, and thus the interpretations in the report are tentative. The landscape of the Metlakatla Peninsula, on which the city of Metlakatla is located, is characterized by a muskeg-covered terrane of very low relief. In contrast, most of the rest of Annette Island is composed of mountainous terrane with steep valleys and numerous lakes. During the Pleistocene Epoch the Metlakatla area was presumably covered by ice several times; glaciers smoothed the present Metlakatla Peninsula and deeply eroded valleys on the rest of Annette Island. The last major deglaciation was completed probably before 10,000 years ago. Rebound of the earth's crust, believed to be related to glacial melting, has caused land emergence at Metlakatla of at least 50 ft (15 m) and probably more than 200 ft (61 m) relative to present sea level. Bedrock in the Metlakatla area is composed chiefly of hard metamorphic rocks: greenschist and greenstone with minor hornfels and schist. Strike and dip of beds are generally variable and minor offsets are common. Bedrock is of late Paleozoic to early Mesozoic age. Six types of surficial geologic materials of Quaternary age were recognized: firm diamicton; emerged shore deposits; modern shore and delta deposits; alluvial deposits; very soft muskeg and other organic deposits; and firm to soft artificial fill. A combination map unit is composed of bedrock or diamicton. Geologic structure in southeastern Alaska is complex because, since at least early Paleozoic time, there have been several cycles of tectonic deformation that affected different parts of the region. Southeastern Alaska is transected by numerous faults and possible faults that attest to major

  15. The HayWired Earthquake Scenario

    USGS Publications Warehouse

    Detweiler, Shane T.; Wein, Anne M.

    2017-04-24

    Foreword. The 1906 Great San Francisco earthquake (magnitude 7.8) and the 1989 Loma Prieta earthquake (magnitude 6.9) each motivated residents of the San Francisco Bay region to build countermeasures to earthquakes into the fabric of the region. Since Loma Prieta, bay-region communities, governments, and utilities have invested tens of billions of dollars in seismic upgrades and retrofits and replacements of older buildings and infrastructure. Innovation and state-of-the-art engineering, informed by science, including novel seismic-hazard assessments, have been applied to the challenge of increasing seismic resilience throughout the bay region. However, as long as people live and work in seismically vulnerable buildings or rely on seismically vulnerable transportation and utilities, more work remains to be done. With that in mind, the U.S. Geological Survey (USGS) and its partners developed the HayWired scenario as a tool to enable further actions that can change the outcome when the next major earthquake strikes. By illuminating the likely impacts to the present-day built environment, well-constructed scenarios can and have spurred officials and citizens to take steps that change the outcomes the scenario describes, whether used to guide more realistic response and recovery exercises or to launch mitigation measures that will reduce future risk. The HayWired scenario is the latest in a series of like-minded efforts to bring a special focus onto the impacts that could occur when the Hayward Fault again ruptures through the east side of the San Francisco Bay region as it last did in 1868. Cities in the east bay along the Richmond, Oakland, and Fremont corridor would be hit hardest by earthquake ground shaking, surface fault rupture, aftershocks, and fault afterslip, but the impacts would reach throughout the bay region and far beyond. The HayWired scenario name reflects our increased reliance on the Internet and telecommunications and also alludes to the

  16. Earthquakes.

    ERIC Educational Resources Information Center

    Pakiser, Louis C.

    One of a series of general interest publications on science topics, the booklet provides those interested in earthquakes with an introduction to the subject. Following a section presenting an historical look at the world's major earthquakes, the booklet discusses earthquake-prone geographic areas, the nature and workings of earthquakes, earthquake…

  17. How an Integrative STEM Curriculum Can Benefit Students in Engineering Design Practices

    ERIC Educational Resources Information Center

    Fan, Szu-Chun; Yu, Kuang-Chao

    2017-01-01

    STEM-oriented engineering design practice has become recognized increasingly by technology education professionals in Taiwan. This study sought to examine the effectiveness of the application of an integrative STEM approach within engineering design practices in high school technology education in Taiwan. A quasi-experimental study was conducted…

  18. Engineering applications of strong ground motion simulation

    NASA Astrophysics Data System (ADS)

    Somerville, Paul

    1993-02-01

    The formulation, validation and application of a procedure for simulating strong ground motions for use in engineering practice are described. The procedure uses empirical source functions (derived from near-source strong motion recordings of small earthquakes) to provide a realistic representation of effects such as source radiation that are difficult to model at high frequencies due to their partly stochastic behavior. Wave propagation effects are modeled using simplified Green's functions that are designed to transfer empirical source functions from their recording sites to those required for use in simulations at a specific site. The procedure has been validated against strong motion recordings of both crustal and subduction earthquakes. For the validation process we choose earthquakes whose source models (including a spatially heterogeneous distribution of the slip of the fault) are independently known and which have abundant strong motion recordings. A quantitative measurement of the fit between the simulated and recorded motion in this validation process is used to estimate the modeling and random uncertainty associated with the simulation procedure. This modeling and random uncertainty is one part of the overall uncertainty in estimates of ground motions of future earthquakes at a specific site derived using the simulation procedure. The other contribution to uncertainty is that due to uncertainty in the source parameters of future earthquakes that affect the site, which is estimated from a suite of simulations generated by varying the source parameters over their ranges of uncertainty. In this paper, we describe the validation of the simulation procedure for crustal earthquakes against strong motion recordings of the 1989 Loma Prieta, California, earthquake, and for subduction earthquakes against the 1985 Michoacán, Mexico, and Valparaiso, Chile, earthquakes. We then show examples of the application of the simulation procedure to the estimation of the
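
    The quantitative fit measure described above can be sketched as log-residual statistics across stations. This is an illustrative reduction, not the paper's actual procedure; the station amplitudes below are invented.

```python
import math

def modeling_bias_and_sigma(recorded, simulated):
    """Natural-log residuals between recorded and simulated amplitudes.

    Returns (bias, sigma): the mean residual (systematic model bias) and
    the standard deviation of residuals (modeling/random uncertainty).
    """
    residuals = [math.log(r / s) for r, s in zip(recorded, simulated)]
    n = len(residuals)
    bias = sum(residuals) / n
    var = sum((x - bias) ** 2 for x in residuals) / n
    return bias, math.sqrt(var)

# Hypothetical spectral amplitudes (cm/s) at four stations:
rec = [12.0, 8.5, 20.1, 5.3]
sim = [10.0, 9.0, 18.0, 6.0]
bias, sigma = modeling_bias_and_sigma(rec, sim)
```

    A near-zero bias with a small sigma indicates the simulation reproduces the recordings without systematic over- or under-prediction.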

  19. Engineering uses of physics-based ground motion simulations

    USGS Publications Warehouse

    Baker, Jack W.; Luco, Nicolas; Abrahamson, Norman A.; Graves, Robert W.; Maechling, Phillip J.; Olsen, Kim B.

    2014-01-01

    This paper summarizes validation methodologies focused on enabling ground motion simulations to be used with confidence in engineering applications such as seismic hazard analysis and dynamic analysis of structural and geotechnical systems. Numerical simulation of ground motion from large earthquakes, utilizing physics-based models of earthquake rupture and wave propagation, is an area of active research in the earth science community. Refinement and validation of these models require collaboration between earthquake scientists and engineering users, and testing/rating methodologies for simulated ground motions to be used with confidence in engineering applications. This paper provides an introduction to this field and an overview of current research activities being coordinated by the Southern California Earthquake Center (SCEC). These activities are related both to advancing the science and computational infrastructure needed to produce ground motion simulations and to engineering validation procedures. Current research areas and anticipated future achievements are also discussed.

  20. Modeling as an Engineering Habit of Mind and Practice

    ERIC Educational Resources Information Center

    Lammi, Matthew D.; Denson, Cameron D.

    2017-01-01

    In this paper we examine a case study of a pedagogical strategy that focuses on the teaching of modeling as a habit of mind and practice for novice designers engaged in engineering design challenges. In an engineering design course, pre-service teachers created modeling artifacts in the form of conceptual models, graphical models, mathematical…

  1. Geotechnical effects of the 2015 magnitude 7.8 Gorkha, Nepal, earthquake and aftershocks

    USGS Publications Warehouse

    Moss, Robb E. S.; Thompson, Eric M.; Kieffer, D Scott; Tiwari, Binod; Hashash, Youssef M A; Acharya, Indra; Adhikari, Basanta; Asimaki, Domniki; Clahan, Kevin B.; Collins, Brian D.; Dahal, Sachindra; Jibson, Randall W.; Khadka, Diwakar; Macdonald, Amy; Madugo, Chris L M; Mason, H Benjamin; Pehlivan, Menzer; Rayamajhi, Deepak; Uprety, Sital

    2015-01-01

    This article summarizes the geotechnical effects of the 25 April 2015 M 7.8 Gorkha, Nepal, earthquake and aftershocks, as documented by a reconnaissance team that undertook a broad engineering and scientific assessment of the damage and collected perishable data for future analysis. Brief descriptions are provided of ground shaking, surface fault rupture, landsliding, soil failure, and infrastructure performance. The goal of this reconnaissance effort, led by Geotechnical Extreme Events Reconnaissance, is to learn from earthquakes and mitigate hazards in future earthquakes.

  2. Sharing Research Models: Using Software Engineering Practices for Facilitation

    PubMed Central

    Bryant, Stephanie P.; Solano, Eric; Cantor, Susanna; Cooley, Philip C.; Wagener, Diane K.

    2011-01-01

    Increasingly, researchers are turning to computational models to understand the interplay of important variables on systems’ behaviors. Although researchers may develop models that meet the needs of their investigation, application limitations—such as nonintuitive user interface features and data input specifications—may limit the sharing of these tools with other research groups. By removing these barriers, other research groups that perform related work can leverage these work products to expedite their own investigations. The use of software engineering practices can enable managed application production and shared research artifacts among multiple research groups by promoting consistent models, reducing redundant effort, encouraging rigorous peer review, and facilitating research collaborations that are supported by a common toolset. This report discusses three established software engineering practices—the iterative software development process, object-oriented methodology, and Unified Modeling Language—and the applicability of these practices to computational model development. Our efforts to modify the MIDAS TranStat application to make it more user-friendly are presented as an example of how computational models that are based on research and developed using software engineering practices can benefit a broader audience of researchers. PMID:21687780

  3. Deviant Earthquakes: Data-driven Constraints on the Variability in Earthquake Source Properties and Seismic Hazard

    NASA Astrophysics Data System (ADS)

    Trugman, Daniel Taylor

    The complexity of the earthquake rupture process makes earthquakes inherently unpredictable. Seismic hazard forecasts often presume that the rate of earthquake occurrence can be adequately modeled as a space-time homogeneous or stationary Poisson process and that the dynamical source properties of small and large earthquakes obey self-similar scaling relations. While these simplified models provide useful approximations and encapsulate the first-order statistical features of the historical seismic record, they are inconsistent with the complexity underlying earthquake occurrence and can lead to misleading assessments of seismic hazard when applied in practice. The six principal chapters of this thesis explore the extent to which the behavior of real earthquakes deviates from these simplified models, and the implications that the observed deviations have for our understanding of earthquake rupture processes and seismic hazard. Chapter 1 provides a brief thematic overview and introduction to the scope of this thesis. Chapter 2 examines the complexity of the 2010 M7.2 El Mayor-Cucapah earthquake, focusing on the relation between its unexpected and unprecedented occurrence and anthropogenic stresses from the nearby Cerro Prieto Geothermal Field. Chapter 3 compares long-term changes in seismicity within California's three largest geothermal fields in an effort to characterize the relative influence of natural and anthropogenic stress transients on local seismic hazard. Chapter 4 describes a hybrid, hierarchical clustering algorithm that can be used to relocate earthquakes using waveform cross-correlation, and applies the new algorithm to study the spatiotemporal evolution of two recent seismic swarms in western Nevada. Chapter 5 describes a new spectral decomposition technique that can be used to analyze the dynamic source properties of large datasets of earthquakes, and applies this approach to revisit the question of self-similar scaling of
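
    The stationary Poisson assumption the thesis interrogates reduces, in practice, to a one-line exceedance formula; a minimal sketch follows (the rate and time horizon are illustrative values, not from the thesis).

```python
import math

def poisson_exceedance_prob(annual_rate, years):
    """P(at least one event within `years`) under a stationary Poisson model."""
    return 1.0 - math.exp(-annual_rate * years)

# e.g. a fault producing one M>=7 event per 150 years on average,
# evaluated over a 50-year exposure window:
p50 = poisson_exceedance_prob(1.0 / 150.0, 50.0)
```

    Deviations from stationarity (swarms, induced sequences, aftershock clustering) are precisely the cases where this simple formula misstates the hazard.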

  4. Engineering models for catastrophe risk and their application to insurance

    NASA Astrophysics Data System (ADS)

    Dong, Weimin

    2002-06-01

    Internationally, earthquake insurance, like all other insurance (fire, auto), adopted an actuarial approach in the past; that is, insurance rates were determined from historical loss experience. Because an earthquake is a rare event with severe consequences, irrational premium rates and a poor understanding of the scale of potential loss left many insurance companies insolvent after the Northridge earthquake in 1994. Along with recent advances in earth science, computer science and engineering, computerized loss estimation methodologies based on first principles have been developed to the point that losses from destructive earthquakes can be quantified with reasonable accuracy using scientific modeling techniques. This paper introduces how engineering models can help quantify earthquake risk and how the insurance industry can use this information to manage its risk in the United States and abroad.
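
    Loss-estimation methodologies of this kind ultimately combine modeled event rates with modeled losses. A schematic expected-annual-loss sum over a hypothetical stochastic event set (the rates and losses below are invented, not from the paper):

```python
def expected_annual_loss(event_set):
    """Sum rate_i * loss_i over a stochastic event set.

    event_set: list of (annual_rate, loss) pairs, one per scenario earthquake.
    """
    return sum(rate * loss for rate, loss in event_set)

# Hypothetical three-event set: (events/yr, insured loss in $M)
events = [(0.01, 500.0), (0.002, 3000.0), (0.0005, 12000.0)]
eal = expected_annual_loss(events)
```

    An insurer can compare this expected annual loss against collected premiums, and examine the tail of the event set to judge solvency under rare, severe scenarios.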

  5. Surface Rupture Effects on Earthquake Moment-Area Scaling Relations

    NASA Astrophysics Data System (ADS)

    Luo, Yingdi; Ampuero, Jean-Paul; Miyakoshi, Ken; Irikura, Kojiro

    2017-09-01

    Empirical earthquake scaling relations play a central role in fundamental studies of earthquake physics and in current practice of earthquake hazard assessment, and are being refined by advances in earthquake source analysis. A scaling relation between seismic moment (M0) and rupture area (A) currently in use for ground motion prediction in Japan features a transition regime of the form M0 ∝ A^2, between the well-recognized small (self-similar) and very large (W-model) earthquake regimes, which has counter-intuitive attributes and uncertain theoretical underpinnings. Here, we investigate the mechanical origin of this transition regime via earthquake cycle simulations, analytical dislocation models and numerical crack models on strike-slip faults. We find that, even if stress drop is assumed constant, the properties of the transition regime are controlled by surface rupture effects, comprising an effective rupture elongation along-dip due to a mirror effect and systematic changes of the shape factor relating slip to stress drop. Based on this physical insight, we propose a simplified formula to account for these effects in M0-A scaling relations for strike-slip earthquakes.
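
    The self-similar end-member from which the transition regime departs can be sketched with the standard Eshelby circular-crack relation. This illustrates constant-stress-drop scaling only, not the paper's proposed formula; the rupture dimensions and stress drop are invented.

```python
import math

def moment_self_similar(area_m2, stress_drop_pa):
    """Self-similar M0 for a circular crack with constant stress drop
    (Eshelby): M0 = (16/7) * dsigma * r^3, with A = pi * r^2."""
    r = math.sqrt(area_m2 / math.pi)
    return (16.0 / 7.0) * stress_drop_pa * r ** 3

# A 20 km x 20 km rupture with a 3 MPa stress drop:
M0 = moment_self_similar(20e3 * 20e3, 3e6)
Mw = (2.0 / 3.0) * (math.log10(M0) - 9.1)  # Hanks-Kanamori moment magnitude
```

    Under this regime M0 grows as A^(3/2); the transition regime discussed above instead exhibits the steeper M0 ∝ A^2 dependence.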

  6. Insight into the Earthquake Risk Information Seeking Behavior of the Victims: Evidence from Songyuan, China

    PubMed Central

    Li, Shasha; Zhai, Guofang; Zhou, Shutian; Fan, Chenjing; Wu, Yunqing; Ren, Chongqiang

    2017-01-01

    Efficient risk communication is a vital way to reduce the vulnerability of individuals when facing emergency risks, especially regarding earthquakes. Efficient risk communication aims at improving the supply of risk information and fulfilling the need for risk information by individuals. Therefore, an investigation into individual-level information seeking behavior within earthquake risk contexts is very important for improved earthquake risk communication. However, at present there are very few studies that have explored the behavior of individuals seeking earthquake risk information. Under the guidance of the Risk Information Seeking and Processing model as well as relevant practical findings using the structural equation model, this study attempts to explore the main determinants of an individual’s earthquake risk information seeking behavior, and to validate the mediator effect of information need during the seeking process. A questionnaire-based survey of 918 valid respondents in Songyuan, China, who had been hit by a small earthquake swarm, was used to provide practical evidence for this study. Results indicated that information need played a noteworthy role in the earthquake risk information seeking process, and was detected both as an immediate predictor and as a mediator. Informational subjective norms drive the seeking behavior on earthquake risk information through both direct and indirect approaches. Perceived information gathering capacity, negative affective responses and risk perception have an indirect effect on earthquake risk information seeking behavior via information need. The implications for theory and practice regarding risk communication are discussed and concluded. PMID:28272359

  7. Insight into the Earthquake Risk Information Seeking Behavior of the Victims: Evidence from Songyuan, China.

    PubMed

    Li, Shasha; Zhai, Guofang; Zhou, Shutian; Fan, Chenjing; Wu, Yunqing; Ren, Chongqiang

    2017-03-07

    Efficient risk communication is a vital way to reduce the vulnerability of individuals when facing emergency risks, especially regarding earthquakes. Efficient risk communication aims at improving the supply of risk information and fulfilling the need for risk information by individuals. Therefore, an investigation into individual-level information seeking behavior within earthquake risk contexts is very important for improved earthquake risk communication. However, at present there are very few studies that have explored the behavior of individuals seeking earthquake risk information. Under the guidance of the Risk Information Seeking and Processing model as well as relevant practical findings using the structural equation model, this study attempts to explore the main determinants of an individual's earthquake risk information seeking behavior, and to validate the mediator effect of information need during the seeking process. A questionnaire-based survey of 918 valid respondents in Songyuan, China, who had been hit by a small earthquake swarm, was used to provide practical evidence for this study. Results indicated that information need played a noteworthy role in the earthquake risk information seeking process, and was detected both as an immediate predictor and as a mediator. Informational subjective norms drive the seeking behavior on earthquake risk information through both direct and indirect approaches. Perceived information gathering capacity, negative affective responses and risk perception have an indirect effect on earthquake risk information seeking behavior via information need. The implications for theory and practice regarding risk communication are discussed and concluded.

  8. Algorithm Engineering: Concepts and Practice

    NASA Astrophysics Data System (ADS)

    Chimani, Markus; Klein, Karsten

    Over the last years the term algorithm engineering has become a widespread synonym for experimental evaluation in the context of algorithm development. Yet it implies even more. We discuss the major weaknesses of traditional "pen and paper" algorithmics and the ever-growing gap between theory and practice in the context of modern computer hardware and real-world problem instances. We present the key ideas and concepts of the central algorithm engineering cycle that is based on a full feedback loop: It starts with the design of the algorithm, followed by the analysis, implementation, and experimental evaluation. The results of the latter can then be reused for modifications to the algorithmic design, stronger or input-specific theoretic performance guarantees, etc. We describe the individual steps of the cycle, explaining the rationale behind them and giving examples of how to conduct these steps thoughtfully. Thereby we give an introduction to current algorithmic key issues like I/O-efficient or parallel algorithms, succinct data structures, hardware-aware implementations, and others. We conclude with two especially insightful success stories—shortest path problems and text search—where the application of algorithm engineering techniques led to tremendous performance improvements compared with previous state-of-the-art approaches.

  9. The 2004 Parkfield, CA Earthquake: A Teachable Moment for Exploring Earthquake Processes, Probability, and Earthquake Prediction

    NASA Astrophysics Data System (ADS)

    Kafka, A.; Barnett, M.; Ebel, J.; Bellegarde, H.; Campbell, L.

    2004-12-01

    The occurrence of the 2004 Parkfield earthquake provided a unique "teachable moment" for students in our science course for teacher education majors. The course uses seismology as a medium for teaching a wide variety of science topics appropriate for future teachers. The 2004 Parkfield earthquake occurred just 15 minutes after our students completed a lab on earthquake processes and earthquake prediction. That lab included a discussion of the Parkfield Earthquake Prediction Experiment as a motivation for the exercises they were working on that day. Furthermore, this earthquake was recorded on an AS1 seismograph right in their lab, just minutes after the students left. About an hour after we recorded the earthquake, the students were able to see their own seismogram of the event in the lecture part of the course, which provided an excellent teachable moment for a lecture/discussion on how the occurrence of the 2004 Parkfield earthquake might affect seismologists' ideas about earthquake prediction. The specific lab exercise that the students were working on just before we recorded this earthquake was a "sliding block" experiment that simulates earthquakes in the classroom. The experimental apparatus includes a flat board on top of which are blocks of wood attached to a bungee cord and a string wrapped around a hand crank. Plate motion is modeled by slowly turning the crank, and earthquakes are modeled as events in which the block slips ("blockquakes"). We scaled the earthquake data and the blockquake data (using how much the string moved as a proxy for time) so that we could compare blockquakes and earthquakes. This provided an opportunity to use interevent-time histograms to teach about earthquake processes, probability, and earthquake prediction, and to compare earthquake sequences with blockquake sequences. We were able to show the students, using data obtained directly from their own lab, how global earthquake data fit a Poisson exponential distribution better
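
    The interevent-time comparison described above can be sketched in a few lines; the event times below are made-up stand-ins for blockquake (or catalog) data, not the class's actual measurements.

```python
import math

def interevent_times(event_times):
    """Sorted catalog times -> interevent intervals."""
    ts = sorted(event_times)
    return [b - a for a, b in zip(ts, ts[1:])]

def exponential_survival(dt, mean_dt):
    """P(interevent time > dt) for a Poisson process with mean interval mean_dt."""
    return math.exp(-dt / mean_dt)

# Hypothetical event times (arbitrary units, e.g. crank turns as a time proxy):
times = [0.0, 1.2, 1.9, 4.0, 4.4, 7.1]
gaps = interevent_times(times)
mean_gap = sum(gaps) / len(gaps)

# Empirical fraction of intervals longer than the mean, to compare with the
# Poisson prediction exp(-1) ~ 0.368:
frac_longer = sum(g > mean_gap for g in gaps) / len(gaps)
```

    Binning the intervals into a histogram and overlaying the exponential density makes the same comparison visually, which is the form of the classroom exercise.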

  10. A global building inventory for earthquake loss estimation and risk management

    USGS Publications Warehouse

    Jaiswal, K.; Wald, D.; Porter, K.

    2010-01-01

    We develop a global database of building inventories using a taxonomy of global building types for use in near-real-time post-earthquake loss estimation and pre-earthquake risk analysis, for the U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER) program. The database is available for public use, subject to peer review, scrutiny, and open enhancement. On a country-by-country level, it contains estimates of the distribution of building types categorized by material, lateral force resisting system, and occupancy type (residential or nonresidential, urban or rural). The database draws on and harmonizes numerous sources: (1) UN statistics, (2) UN Habitat's demographic and health survey (DHS) database, (3) national housing censuses, (4) the World Housing Encyclopedia and (5) other literature. © 2010, Earthquake Engineering Research Institute.
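
    A minimal sketch of the kind of per-country distribution such a database might store; the country code, category names and fractions below are hypothetical illustrations, not PAGER's actual schema or data.

```python
# Hypothetical per-country inventory: fraction of building stock by
# (material, lateral-force-resisting system) within an occupancy class.
inventory = {
    "XX": {  # placeholder country code
        "residential_urban": {
            ("unreinforced_masonry", "bearing_wall"): 0.35,
            ("reinforced_concrete", "frame"): 0.40,
            ("wood", "light_frame"): 0.25,
        },
    },
}

def dominant_type(country, occupancy):
    """Return the most common building type for a country/occupancy pair."""
    dist = inventory[country][occupancy]
    return max(dist, key=dist.get)

top = dominant_type("XX", "residential_urban")
```

    Loss estimation then weights per-type fragility by these fractions, so the quality of the harmonized distribution directly controls the quality of the loss estimate.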

  11. Jumping over the hurdles to effectively communicate the Operational Earthquake Forecast

    NASA Astrophysics Data System (ADS)

    McBride, S.; Wein, A. M.; Becker, J.; Potter, S.; Tilley, E. N.; Gerstenberger, M.; Orchiston, C.; Johnston, D. M.

    2016-12-01

    Probabilities, uncertainties, statistics, science, and threats are notoriously difficult topics to communicate to members of the public. The Operational Earthquake Forecast (OEF) is designed to provide an understanding of potential numbers and sizes of earthquakes, and its communication must address all of those challenges. Furthermore, there are other barriers to effective communication of the OEF. These barriers include the erosion of trust in scientists and experts, oversaturation of messages, fear and threat messages magnified by media sensationalisation, fractured media environments and online echo chambers. Given the complexities and challenges of the OEF, how can we overcome barriers to effective communication? Crisis and risk communication research, applied to the opportunities and challenges of practice, can inform the development of communication strategies to increase the public understanding and use of the OEF. We explore ongoing research regarding how the OEF can be more effectively communicated, including the channels, tools and message composition to engage with a variety of publics. We also draw on past experience and a study of OEF communication during the Canterbury Earthquake Sequence (CES). We demonstrate how research and experience have guided OEF communications during subsequent events in New Zealand, including the M5.7 Valentine's Day earthquake in 2016 (CES), the M6.0 Wilberforce earthquake in 2015, and the Cook Strait/Lake Grassmere earthquakes in 2013. We identify successes and lessons learned from the practical communication of the OEF. Finally, we present future projects and directions in the communication of the OEF, informed by both practice and research.

  12. Charles Darwin's earthquake reports

    NASA Astrophysics Data System (ADS)

    Galiev, Shamil

    2010-05-01

    problems which have begun to be discussed only recently. Earthquakes often precede volcanic eruptions. According to Darwin, the earthquake-induced shock may be a common mechanism of the simultaneous eruptions of volcanoes separated by long distances. In particular, Darwin wrote that ‘… the elevation of many hundred square miles of territory near Concepcion is part of the same phenomenon, with that splashing up, if I may so call it, of volcanic matter through the orifices in the Cordillera at the moment of the shock;…'. According to Darwin, the crust is a system in which fractured zones and zones of seismic and volcanic activity interact. Darwin formulated the task of considering together the processes now studied as seismology and volcanology. However, the difficulties are such that the study of interactions between earthquakes and volcanoes began only recently, and his works on this had relatively little impact on the development of the geosciences. In this report, we discuss how the latest data on seismic and volcanic events support Darwin's observations and ideas about the 1835 Chilean earthquake. The material from researchspace.auckland.ac.nz/handle/2292/4474 is used. We show how modern mechanical tests from impact engineering and simple experiments with weakly-cohesive materials also support his observations and ideas. On the other hand, we have developed a mathematical theory of earthquake-induced catastrophic wave phenomena. This theory allows us to explain the most important aspects of Darwin's earthquake reports. This is achieved through the simplification of the fundamental governing equations of the problems considered to strongly-nonlinear wave equations. Solutions of these equations are constructed with the help of analytic and numerical techniques. The solutions can model different strongly-nonlinear wave phenomena that arise in a variety of physical contexts. A comparison with relevant experimental observations is also presented.

  13. Increased Earthquake Rates in the Central and Eastern US Portend Higher Earthquake Hazards

    NASA Astrophysics Data System (ADS)

    Llenos, A. L.; Rubinstein, J. L.; Ellsworth, W. L.; Mueller, C. S.; Michael, A. J.; McGarr, A.; Petersen, M. D.; Weingarten, M.; Holland, A. A.

    2014-12-01

    Since 2009 the central and eastern United States has experienced an unprecedented increase in the rate of M≥3 earthquakes that is unlikely to be due to natural variation. Where the rates have increased so has the seismic hazard, making it important to understand these changes. Areas with significant seismicity increases are limited to areas where oil and gas production takes place. By far the largest contributor to the seismicity increase is Oklahoma, where recent studies suggest that these rate changes may be due to fluid injection (e.g., Keranen et al., Geology, 2013; Science, 2014). Moreover, the area of increased seismicity in northern Oklahoma that began in 2013 coincides with the Mississippi Lime play, where well completions greatly increased the year before the seismicity increase. This suggests a link to oil and gas production either directly or through the disposal of significant amounts of produced water within the play. For the purpose of assessing the hazard due to these earthquakes, should they be treated differently from natural earthquakes? Previous studies suggest that induced seismicity may differ from natural seismicity in clustering characteristics or frequency-magnitude distributions (e.g., Bachmann et al., GJI, 2011; Llenos and Michael, BSSA, 2013). These differences could affect time-independent hazard computations, which typically assume that clustering and size distribution remain constant. In Oklahoma, as well as other areas of suspected induced seismicity, we find that earthquakes since 2009 tend to be considerably more clustered in space and time than before 2009. However, differences between various regional and national catalogs leave it unclear whether there are significant changes in magnitude distribution. Whether they are due to natural or industrial causes, the increased earthquake rates in these areas could increase the hazard in ways that are not accounted for in current hazard assessment practice. Clearly the possibility of induced
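
    One standard way to compare the frequency-magnitude distributions mentioned above is the Gutenberg-Richter b-value; a minimal sketch using the Aki (1965) maximum-likelihood estimator (the catalog magnitudes below are made up for illustration).

```python
import math

def b_value_mle(mags, m_min):
    """Aki (1965) maximum-likelihood b-value for magnitudes >= m_min."""
    sel = [m for m in mags if m >= m_min]
    mean_m = sum(sel) / len(sel)
    return math.log10(math.e) / (mean_m - m_min)

# Hypothetical catalog magnitudes, completeness magnitude M3.0:
catalog = [3.0, 3.1, 3.4, 3.2, 3.8, 3.0, 4.2, 3.5, 3.3, 3.1]
b = b_value_mle(catalog, 3.0)
```

    Comparing b-values estimated before and after 2009, or between suspected induced and natural sequences, is one test for a change in magnitude distribution, though catalog differences of the kind noted above limit its resolution.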

  14. Earthquakes

    MedlinePlus

    An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a populated area, it may cause ...

  15. Transient Seepage Analyses in Levee Engineering Practice

    DTIC Science & Technology

    2016-07-01

    and contractors in conventional engineering practice has outpaced the development of guidance documents and design recommendations. The major...The final solution is obtained by first solving for ht from Equation B.5.

  16. Exploration and practice for engineering innovative talents training based on project-driven

    NASA Astrophysics Data System (ADS)

    Xu, Yishen; Lv, Qingsong; Ye, Yan; Wu, Maocheng; Gu, Jihua

    2017-08-01

    As one of the Ministry of Education's "excellent engineer education program" majors and one of the characteristic majors of Jiangsu Province, the major of optoelectronic information science and engineering at Soochow University has a long history and distinctive features. In recent years, aiming at the talent-training objective of "broad foundation, practice-oriented, creative", education and teaching reforms have been carried out that emphasize a foundation of theoretical teaching, practical training as a carrier, promotion of projects and discussion, and development of the second classroom. By optimizing the teaching contents and course system of the theoretical courses, an engineering innovative-talent training mode based on project-driven learning has been implemented, with practical training playing a carrier role and second-classroom teaching managed as a whole to cultivate students' innovative spirit and practical ability. Meanwhile, the evaluation mechanism for students' comprehensive performance, previously based mainly on theory-test scores, is gradually being changed, and activities such as scientific research, discipline competitions, and social practice are playing an increasingly important role in students' comprehensive assessment. The achievements produced show that the proposed project-driven training model can stimulate students' enthusiasm and initiative to participate in research activities and promote the training of students' engineering-practice ability and consciousness of innovation.

  17. Practical pulse engineering: Gradient ascent without matrix exponentiation

    NASA Astrophysics Data System (ADS)

    Bhole, Gaurav; Jones, Jonathan A.

    2018-06-01

    Since 2005, there has been a huge growth in the use of engineered control pulses to perform desired quantum operations in systems such as nuclear magnetic resonance quantum information processors. These approaches, which build on the original gradient ascent pulse engineering algorithm, remain computationally intensive because of the need to calculate matrix exponentials for each time step in the control pulse. In this study, we discuss how the propagators for each time step can be approximated using the Trotter-Suzuki formula, and a further speedup achieved by avoiding unnecessary operations. The resulting procedure can provide substantial speed gain with negligible costs in the propagator error, providing a more practical approach to pulse engineering.
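
    The splitting described above can be illustrated in a few lines. The sketch below is illustrative only (a two-level system with hypothetical control amplitudes, not the authors' NMR setting): it replaces exp(-i(A+B)dt) with exp(-iA dt)exp(-iB dt), where each single-Pauli factor has a closed form, so no matrix exponential is needed.

```python
import numpy as np

# Pauli operators for a toy two-level control Hamiltonian H = ux*X + uz*Z
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def propagator_exact(ux, uz, dt):
    """Exact one-step propagator exp(-i*H*dt) via eigendecomposition
    (stands in for the expensive matrix exponential)."""
    w, V = np.linalg.eigh(ux * X + uz * Z)
    return (V * np.exp(-1j * w * dt)) @ V.conj().T

def propagator_trotter(ux, uz, dt):
    """First-order Trotter-Suzuki split exp(-i*ux*X*dt) exp(-i*uz*Z*dt).
    Uses exp(-i*theta*P) = cos(theta)*I - i*sin(theta)*P for any Pauli P."""
    I = np.eye(2, dtype=complex)
    Ux = np.cos(ux * dt) * I - 1j * np.sin(ux * dt) * X
    Uz = np.cos(uz * dt) * I - 1j * np.sin(uz * dt) * Z
    return Ux @ Uz

U_exact = propagator_exact(0.3, 0.7, 0.01)
U_trot = propagator_trotter(0.3, 0.7, 0.01)
print(np.linalg.norm(U_exact - U_trot))  # O(dt^2) splitting error per step
```

    The per-step error scales with the commutator of the split terms times dt^2, which is the "negligible cost in propagator error" the abstract refers to.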

  18. The Global Earthquake Model - Past, Present, Future

    NASA Astrophysics Data System (ADS)

    Smolka, Anselm; Schneider, John; Stein, Ross

    2014-05-01

    The Global Earthquake Model (GEM) is a unique collaborative effort that aims to provide organizations and individuals with tools and resources for transparent assessment of earthquake risk anywhere in the world. By pooling data, knowledge and people, GEM acts as an international forum for collaboration and exchange. Sharing data, risk information, best practices, and approaches across the globe is key to assessing risk more effectively. Through consortium-driven global projects, open-source IT development and collaborations with more than 10 regions, leading experts are developing unique global datasets, best practice, open tools and models for seismic hazard and risk assessment. The year 2013 saw the completion of ten global data sets or components addressing various aspects of earthquake hazard and risk, as well as two GEM-related, but independently managed, regional projects, SHARE and EMME. Notably, the International Seismological Centre (ISC) led the development of a new ISC-GEM global instrumental earthquake catalogue, which was made publicly available in early 2013. It has set a new standard for global earthquake catalogues and has found widespread acceptance and application in the global earthquake community. By the end of 2014, GEM's OpenQuake computational platform will provide the OpenQuake hazard/risk assessment software and integrate all GEM data and information products. The public release of OpenQuake is planned for the end of 2014 and will comprise the following datasets and models:
    • ISC-GEM Instrumental Earthquake Catalogue (released January 2013)
    • Global Earthquake History Catalogue [1000-1903]
    • Global Geodetic Strain Rate Database and Model
    • Global Active Fault Database
    • Tectonic Regionalisation Model
    • Global Exposure Database
    • Buildings and Population Database
    • Earthquake Consequences Database
    • Physical Vulnerabilities Database
    • Socio-Economic Vulnerability and Resilience Indicators
    • Seismic

  19. 40 CFR 86.004-40 - Heavy-duty engine rebuilding practices.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Provisions for Emission Regulations for 1977 and Later Model Year New Light-Duty Vehicles, Light-Duty Trucks and Heavy-Duty Engines, and for 1985 and Later Model Year New Gasoline Fueled, Natural Gas-Fueled... rebuilding practices. The provisions of this section are applicable to heavy-duty engines subject to model...

  20. First-Grade Engineers

    ERIC Educational Resources Information Center

    Bautista, Nazan Uludag; Peters, Kari Nichole

    2010-01-01

    Can students build a house that is cost effective and strong enough to survive strong winds, heavy rains, and earthquakes? First graders in Ms. Peter's classroom worked like engineers to answer this question. They participated in a design challenge that required them to plan like engineers and build strong and cost-effective houses that would fit…

  1. Second-Guessing Scientists and Engineers: Post Hoc Criticism and the Reform of Practice in Green Chemistry and Engineering.

    PubMed

    Lynch, William T

    2015-10-01

    The article examines and extends work bringing together engineering ethics and Science and Technology Studies, which had built upon Diane Vaughan's analysis of the Challenger shuttle accident as a test case. Reconsidering the use of her term "normalization of deviance," the article argues for a middle path between moralizing against and excusing away engineering practices contributing to engineering disaster. To explore an illustrative pedagogical case and to suggest avenues for constructive research developing this middle path, it examines the emergence of green chemistry and green engineering. Green chemistry began when Paul Anastas and John Warner developed a set of new rules for chemical synthesis that sought to learn from missed opportunities to avoid environmental damage in the twentieth century, an approach that was soon extended to engineering as well. Examination of tacit assumptions about historical counterfactuals in recent, interdisciplinary discussions of green chemistry illuminates competing views about the field's prospects. An integrated perspective is sought, addressing how both technical practice within chemistry and engineering and the influence of a wider "social movement" can play a role in remedying environmental problems.

  2. Stability assessment of structures under earthquake hazard through GRID technology

    NASA Astrophysics Data System (ADS)

    Prieto Castrillo, F.; Boton Fernandez, M.

    2009-04-01

    This work presents a GRID framework to estimate the vulnerability of structures under earthquake hazard. The tool has been designed to cover the needs of a typical earthquake engineering stability analysis: preparation of input data (pre-processing), response computation, and stability analysis (post-processing). In order to validate the application over GRID, a simplified model of a structure under artificially generated earthquake records has been implemented. To achieve this goal, the proposed scheme exploits the GRID technology and its main advantages (parallel intensive computing, huge storage capacity and collaborative analysis among institutions) through intensive interaction among the GRID elements (Computing Element, Storage Element, LHC File Catalogue, federated database, etc.). The dynamical model is described by a set of ordinary differential equations (ODEs) and by a set of parameters. Both elements, along with the integration engine, are encapsulated into Java classes. With this high-level design, subsequent improvements/changes of the model can be addressed with little effort. In the procedure, an earthquake record database is prepared and stored (pre-processing) in the GRID Storage Element (SE). The metadata of these records is also stored in the GRID federated database. This metadata contains both relevant information about the earthquake (as is usual in a seismic repository) and the Logical File Name (LFN) of the record for its later retrieval. Then, from the available set of accelerograms in the SE, the user can specify a range of earthquake parameters to carry out a dynamic analysis. This way, a GRID job is created for each selected accelerogram in the database. At the GRID Computing Element (CE), displacements are then obtained by numerical integration of the ODEs over time. The resulting response for that configuration is stored in the GRID Storage Element (SE) and the maximum structure displacement is computed. Then, the corresponding
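
    The per-job computation described above (integrate a structural ODE over one accelerogram, record the maximum displacement) can be sketched as follows. The abstract encapsulates its model in Java classes; this Python sketch uses a single-degree-of-freedom oscillator with made-up properties and a synthetic record, integrated with the standard Newmark average-acceleration method, purely to illustrate the shape of the computation.

```python
import numpy as np

def max_displacement(ag, dt, m=1.0, k=200.0, zeta=0.05):
    """Integrate m*u'' + c*u' + k*u = -m*ag(t) with the Newmark
    average-acceleration method; return max |u|. Properties are hypothetical."""
    c = 2 * zeta * np.sqrt(k * m)
    beta, gamma = 0.25, 0.5
    keff = k + gamma / (beta * dt) * c + m / (beta * dt**2)
    u, v = 0.0, 0.0
    a = -ag[0]                      # initial acceleration from equilibrium
    umax = 0.0
    for agi in ag[1:]:
        # Effective load for the next step (standard Newmark formulation)
        p = (-m * agi
             + m * (u / (beta * dt**2) + v / (beta * dt) + a * (1/(2*beta) - 1))
             + c * (gamma * u / (beta * dt) + v * (gamma/beta - 1)
                    + a * dt * (gamma/(2*beta) - 1)))
        un = p / keff
        vn = gamma/(beta*dt)*(un - u) + v*(1 - gamma/beta) + a*dt*(1 - gamma/(2*beta))
        an = (un - u)/(beta*dt**2) - v/(beta*dt) - a*(1/(2*beta) - 1)
        u, v, a = un, vn, an
        umax = max(umax, abs(u))
    return umax

# Synthetic accelerogram standing in for a record fetched from the Storage Element
t = np.arange(0, 10, 0.01)
ag = 0.3 * 9.81 * np.sin(2 * np.pi * 2.0 * t) * np.exp(-0.3 * t)
print(max_displacement(ag, 0.01))
```

    In the GRID scheme, one such call would run per selected accelerogram, with the resulting maximum displacement written back to the Storage Element.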

  3. Earthquake Response of Concrete Gravity Dams Including Hydrodynamic and Foundation Interaction Effects,

    DTIC Science & Technology

    1980-01-01

    standard procedure for analysis of all types of civil engineering structures. Early in its development, it became apparent that this method had...unique potentialities in the evaluation of stress in dams, and many of its earliest civil engineering applications concerned special problems associated...with such structures [3,4]. The earliest dynamic finite element analyses of civil engineering structures involved the earthquake response analysis of

  4. Evaluation of seismic performance of reinforced concrete (RC) buildings under near-field earthquakes

    NASA Astrophysics Data System (ADS)

    Moniri, Hassan

    2017-03-01

    Near-field ground motions affect the seismic response of structures significantly more severely than far-field ground motions, because near-source forward-directivity ground motions contain long-period pulses; by comparison, the cumulative effects of far-fault records are minor. The damage and collapse of engineering structures observed in the earthquakes of recent decades show the potential for damage to existing structures under near-field ground motions. One important subject studied by earthquake engineers as part of a performance-based approach is the determination of demand and collapse capacity under near-field earthquakes. Different methods for evaluating seismic structural performance have been suggested along with, and as part of, the development of performance-based earthquake engineering. This study investigated the effects of the distinguishing characteristics of near-fault ground motions on the seismic response of reinforced concrete (RC) structures using the Incremental Nonlinear Dynamic Analysis (IDA) method. Because various ground motions result in different intensity-versus-response plots, the analysis is repeated under various ground motions in order to achieve significant statistical averages. The OpenSees software was used to conduct nonlinear structural evaluations. Numerical modelling showed that near-source effects cause most of the seismic energy from the rupture to arrive in a single coherent long-period pulse of motion, together with permanent ground displacements. Finally, the vulnerability of RC buildings can be evaluated against the effects of pulse-like near-fault ground motions.

  5. Adapting Infrastructure and Civil Engineering Practice to a Changing Climate: Developing a Manual of Practice

    NASA Astrophysics Data System (ADS)

    Walker, D.; Ayyub, B. M.

    2017-12-01

    According to the U.S. Census, new construction spending in the U.S. for 2014 was $993 billion (roughly 6 percent of U.S. GDP). Informing the development of standards of engineering practice related to design and maintenance thus represents a significant opportunity to promote climate adaptation and mitigation, as well as community resilience. The climate science community informs us that extremes of climate and weather are changing from historical values and that the changes are driven substantially by emissions of greenhouse gases caused by human activities. Civil infrastructure systems traditionally have been designed, constructed, operated and maintained for appropriate probabilities of functionality, durability and safety while exposed to climate and weather extremes during their full service lives. Because of uncertainties in future greenhouse gas emissions and in the models for future climate and weather extremes, neither the climate science community nor the engineering community can presently define the statistics of future climate and weather extremes. The American Society of Civil Engineers' (ASCE) Committee on Adapting to a Changing Climate is actively involved in efforts internal and external to ASCE to promote understanding of the challenges climate change represents in engineering practice and to promote a re-examination of those practices that may need to change in light of a changing climate. In addition to producing an ASCE e-book, as well as a number of ASCE webinars, the Committee is currently developing a Manual of Practice intended to provide guidance for the development or enhancement of standards for infrastructure analysis and design in a world in which risk profiles are changing (non-stationarity) and climate change is a reality but cannot be projected with a high degree of certainty. This presentation will explore both the need for such guidance and some of the challenges and opportunities facing its implementation.

  6. Disaster mitigation science for Earthquakes and Tsunamis -For resilience society against natural disasters-

    NASA Astrophysics Data System (ADS)

    Kaneda, Y.; Takahashi, N.; Hori, T.; Kawaguchi, K.; Isouchi, C.; Fujisawa, K.

    2017-12-01

    Destructive natural disasters such as earthquakes and tsunamis occur frequently around the world: the 2004 Sumatra earthquake in Indonesia, the 2008 Wenchuan earthquake in China, the 2010 Chile earthquake, and the 2011 Tohoku earthquake in Japan, among others, all caused very severe damage. For the reduction and mitigation of damage from destructive natural disasters, early detection and speedy, proper evacuation are indispensable, and hardware and software developments and preparations for disaster reduction and mitigation are quite important. In Japan, DONET has been developed and deployed around the Nankai trough seismogenic zone in southwestern Japan as a real-time monitoring system on the ocean floor, and it is expected to enable early detection of earthquakes and tsunamis in that zone. The integration of real-time data with advanced simulation research will help reduce damage; a resilient society, however, also requires methods for restoration and revival after disasters. We therefore propose a natural disaster mitigation science covering early detection, evacuation, and restoration against destructive natural disasters; this is what we mean by a resilient society. Natural disaster mitigation science spans many research fields, such as natural science, engineering, medical treatment, social science, and literature/art. Natural science, engineering, and medical treatment are fundamental research fields for natural disaster mitigation, while social sciences such as sociology, geography, and psychology are very important research fields for restoration after natural disasters. Finally, to realize and advance disaster mitigation science, human resource cultivation is indispensable. We have already carried out disaster mitigation science under the 'new disaster mitigation research project on Mega

  7. Nurse Perspectives on the Practical, Emotional, and Professional Impacts of Living and Working in Post-earthquake Canterbury, New Zealand.

    PubMed

    Johal, Sarbjit S; Mounsey, Zoe; Brannelly, Petula; Johnston, David M

    2016-02-01

    This report explores nurses' perspectives following the Canterbury (New Zealand) 2010-2011 earthquake sequence and the subsequent recovery process. Problem: Little is known about the experiences of health care professionals during a disaster recovery process, and this research generates insights about the challenges faced. Qualitative semi-structured interviews were undertaken with 11 nurses from the Christchurch (New Zealand) area to explore the challenges faced by the nurses during and following the earthquakes. The interviews took place three years after the start of the earthquake experience to enable exploration of longer term aspects of the recovery process. The interview transcripts were analyzed and coded using a grounded theory approach. The data analysis identified that the nurses had faced a number of challenges and these were characterized as practical, emotional, and professional. While some of the challenges were short-lived in the aftermath of the earthquakes, some were long-lasting due to the extended nature of the recovery process. Dealing with house damage, insurance negotiations, and working in damaged environments had a negative impact on the nurses. The nurses experienced a range of emotions, both negative and positive, after the disaster, though many had needed time to elapse before feeling able to reflect on their experiences. The findings suggest that secondary stressors have a negative impact on the psychosocial recovery process. The nurses recognized that they received support from others and were also required to focus on others. Keeping busy appeared to be the most common coping strategy. This lack of reflection on their experiences may have resulted in delayed emotional responses. Some of the nurses changed their work role, hours, and responsibilities suggesting that working in this environment was having a detrimental impact. The research indicates the challenges faced by nurses in the initial impact of the earthquakes and during the

  8. The 1868 Hayward Earthquake Alliance: A Case Study - Using an Earthquake Anniversary to Promote Earthquake Preparedness

    NASA Astrophysics Data System (ADS)

    Brocher, T. M.; Garcia, S.; Aagaard, B. T.; Boatwright, J. J.; Dawson, T.; Hellweg, M.; Knudsen, K. L.; Perkins, J.; Schwartz, D. P.; Stoffer, P. W.; Zoback, M.

    2008-12-01

    Last October 21st marked the 140th anniversary of the M6.8 1868 Hayward Earthquake, the last damaging earthquake on the southern Hayward Fault. This anniversary was used to help publicize the seismic hazards associated with the fault because: (1) the past five such earthquakes on the Hayward Fault occurred about 140 years apart on average, and (2) the Hayward-Rodgers Creek Fault system is the most likely (with a 31 percent probability) fault in the Bay Area to produce a M6.7 or greater earthquake in the next 30 years. To promote earthquake awareness and preparedness, over 140 public and private agencies and companies and many individuals joined the public-private nonprofit 1868 Hayward Earthquake Alliance (1868alliance.org). The Alliance sponsored many activities including a public commemoration at Mission San Jose in Fremont, which survived the 1868 earthquake. This event was followed by an earthquake drill at Bay Area schools involving more than 70,000 students. The anniversary prompted the Silver Sentinel, an earthquake response exercise based on the scenario of an earthquake on the Hayward Fault conducted by Bay Area County Offices of Emergency Services. 60 other public and private agencies also participated in this exercise. The California Seismic Safety Commission and KPIX (CBS affiliate) produced professional videos designed for school classrooms promoting Drop, Cover, and Hold On. Starting in October 2007, the Alliance and the U.S. Geological Survey held a sequence of press conferences to announce the release of new research on the Hayward Fault as well as new loss estimates for a Hayward Fault earthquake. These included: (1) a ShakeMap for the 1868 Hayward earthquake, (2) a report by the U.S. Bureau of Labor Statistics forecasting the number of employees, employers, and wages predicted to be within areas most strongly shaken by a Hayward Fault earthquake, (3) new estimates of the losses associated with a Hayward Fault earthquake, (4) new ground motion

  9. Assessment of precast beam-column using capacity demand response spectrum subject to design basis earthquake and maximum considered earthquake

    NASA Astrophysics Data System (ADS)

    Ghani, Kay Dora Abd.; Tukiar, Mohd Azuan; Hamid, Nor Hayati Abdul

    2017-08-01

    Malaysia is surrounded by the tectonic features of the Sumatera area, which consists of two seismically active inter-plate boundaries, namely the Indo-Australian and Eurasian Plates on the west and the Philippine Plate on the east. Hence, Malaysia experiences tremors from distant earthquakes occurring in Banda Aceh, Nias Island, Padang and other parts of Sumatera, Indonesia. In order to predict the safety of precast buildings in Malaysia under near-field ground motion, response spectrum analysis can be used to deal with future earthquakes whose specific nature is unknown. This paper aimed to develop capacity-demand response spectra subject to the Design Basis Earthquake (DBE) and Maximum Considered Earthquake (MCE) in order to assess the performance of precast beam-column joints. From the capacity-demand response spectrum analysis, it can be concluded that the precast beam-column joints would not survive earthquake excitation with a surface-wave magnitude of more than 5.5 (Type 1 spectra). This means that the beam-column joint, which was designed using the current code of practice (BS8110), would be severely damaged when subjected to high earthquake excitation. The capacity-demand response spectrum analysis also shows that the precast beam-column joints in the prototype studied would be severely damaged when subjected to the Maximum Considered Earthquake (MCE) with PGA = 0.22g and a surface-wave magnitude of more than 5.5 (Type 1 spectra).
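
    The Type 1 demand spectrum referenced above is defined piecewise in Eurocode 8 (EN 1998-1). A minimal sketch of that standard form follows; the default corner periods and soil factor below are the ground-type-A values from the code, not parameters taken from this study, and the PGA is the abstract's MCE value.

```python
def ec8_type1_elastic_sa(T, ag, S=1.0, TB=0.15, TC=0.4, TD=2.0, eta=1.0):
    """EN 1998-1 Type 1 horizontal elastic spectrum Se(T) in g.
    Defaults are the ground-type-A parameters; eta is the damping correction."""
    if T <= TB:
        return ag * S * (1 + T / TB * (eta * 2.5 - 1))
    if T <= TC:
        return ag * S * eta * 2.5   # constant-acceleration plateau
    if T <= TD:
        return ag * S * eta * 2.5 * TC / T
    return ag * S * eta * 2.5 * TC * TD / T**2

# Demand at a few periods for the abstract's MCE level, PGA = 0.22 g
for T in (0.1, 0.3, 0.5, 1.0, 2.5):
    print(T, round(ec8_type1_elastic_sa(T, 0.22), 3))
```

    Plotting this demand curve against a joint's capacity curve in spectral acceleration-displacement coordinates is the essence of the capacity-demand comparison the abstract describes.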

  10. Sand Volcano Following Earthquake

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Sand boil or sand volcano measuring 2 m (6.6 ft.) in length erupted in the median of Interstate Highway 80 west of the Bay Bridge toll plaza when ground shaking transformed a loose water-saturated deposit of subsurface sand into a sand-water slurry (liquefaction) in the October 17, 1989, Loma Prieta earthquake. Vented sand contains marine-shell fragments. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions such as earthquakes or when powders are handled in industrial processes. Mechanics of Granular Materials (MGM) experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grain materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. (Credit: J.C. Tinsley, U.S. Geological Survey)

  11. Rapid earthquake hazard and loss assessment for Euro-Mediterranean region

    NASA Astrophysics Data System (ADS)

    Erdik, Mustafa; Sesetyan, Karin; Demircioglu, Mine; Hancilar, Ufuk; Zulfikar, Can; Cakti, Eser; Kamer, Yaver; Yenidogan, Cem; Tuzun, Cuneyt; Cagnan, Zehra; Harmandar, Ebru

    2010-10-01

    The near-real-time estimation of ground shaking and losses after a major earthquake in the Euro-Mediterranean region was performed in the framework of the Joint Research Activity 3 (JRA-3) component of the EU FP6 project "Network of Research Infrastructures for European Seismology" (NERIES). This work consists of finding the most likely location of the earthquake source by estimating the fault rupture parameters on the basis of rapid inversion of data from on-line regional broadband stations. It also includes an estimation of the spatial distribution of selected site-specific ground motion parameters at engineering bedrock through region-specific ground motion prediction equations (GMPEs) or physical simulation of ground motion. By using the Earthquake Loss Estimation Routine (ELER) software, the multi-level methodology developed for real-time estimation of losses is capable of incorporating regional variability and sources of uncertainty stemming from GMPEs, fault finiteness, site modifications, the inventory of physical and social elements subjected to earthquake hazard, and the associated vulnerability relationships.
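
    The GMPE step described above amounts to evaluating an attenuation relation over a grid of sites. As a sketch only: the function below uses a generic form ln(PGA) = a + b*M - c*ln(sqrt(R^2 + h^2)) with placeholder coefficients, which are illustrative and not those of any region-specific GMPE used in ELER.

```python
import math

def gmpe_ln_pga(M, R_km, a=-4.5, b=0.9, c=1.1, h=6.0):
    """Generic attenuation form ln(PGA[g]) = a + b*M - c*ln(sqrt(R^2 + h^2)).
    Coefficients are illustrative placeholders, not a published GMPE."""
    return a + b * M - c * math.log(math.hypot(R_km, h))

# Shaking estimate at several epicentral distances for a hypothetical M6.5 event
for R in (5, 20, 50, 100):
    print(R, "km:", round(math.exp(gmpe_ln_pga(6.5, R)), 3), "g")
```

    In a real pipeline, each grid value would then feed site-amplification factors and vulnerability relationships to produce the loss estimate.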

  12. Project-Based Manufacturing Engineering Practice at Ibaraki University and Its Outcomes

    NASA Astrophysics Data System (ADS)

    Yamasaki, Kazuhiko; Wang, Dong F.; Maekawa, Katsuhiro

    The real-world experience of manufacturing processes, from idea stage to final product, must be related to classroom lectures in the mechanical engineering curriculum, including design, materials engineering, dynamics and control. The various challenges and difficulties encountered during manufacturing engineering practice also let students recognize their creativity as well as what knowledge they are missing. Awareness is the start of growth. In line with this principle we have carried out the mechanical engineering practice course for 10 years. Some modifications toward “project-based practice”, however, have been made, informed by the real activities of manufacturing engineers. Drawing and specification, process control, cost management, and role-sharing arrangements are stressed during the semester course. The present paper describes how the course works and what is left to improve, such as refining the themes and the coaching methods for bringing out students' hidden talent.

  13. GLASS 2.0: An Operational, Multimodal, Bayesian Earthquake Data Association Engine

    NASA Astrophysics Data System (ADS)

    Benz, H.; Johnson, C. E.; Patton, J. M.; McMahon, N. D.; Earle, P. S.

    2015-12-01

    The legacy approach to automated detection and determination of hypocenters is arrival-time stacking. Examples of such algorithms are the associator Binder, which has been in continuous use in many USGS-supported regional seismic networks since the 1980s, and its spherical-earth successor, GLASS 1.0, in service at the USGS National Earthquake Information Center for over 10 years. The principal shortcomings of the legacy approach are that 1) it can only use phase arrival times, 2) it does not adequately address the problem of extreme variations in station density worldwide, 3) it cannot incorporate multiple phase models or statistical attributes of phases with distance, and 4) it cannot incorporate noise-model attributes of individual stations. Previously we introduced a theoretical framework for a new associator using a Bayesian kernel stacking approach to approximate a joint probability density function for hypocenter localization. More recently we added station- and phase-specific Bayesian constraints to the association process. GLASS 2.0 incorporates a multiplicity of earthquake-related data including phase arrival times, back-azimuth and slowness information from array beamforming, arrival times from waveform cross-correlation processing, and geographic constraints from real-time social media reports of ground shaking. We demonstrate its application by modeling an aftershock sequence using dozens of stations that recorded tens of thousands of earthquakes over a period of one month. We also demonstrate GLASS 2.0 performance regionally and teleseismically using the globally distributed real-time monitoring system at NEIC.
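
    The kernel-stacking idea can be conveyed with a toy version: stack a Gaussian kernel per arrival-time pick over a grid of trial origins and take the maximum. Everything below is hypothetical (2-D geometry, constant velocity, synthetic picks) and far simpler than GLASS 2.0's multimodal, station-weighted implementation.

```python
import numpy as np

v = 6.0          # km/s, assumed constant P velocity
sigma = 0.5      # s, assumed pick uncertainty
stations = np.array([[0, 0], [100, 0], [0, 100], [100, 100], [50, 120]], float)

# Synthetic picks from a known source, to be recovered by the stack
true_src, true_t0 = np.array([40.0, 60.0]), 10.0
picks = true_t0 + np.linalg.norm(stations - true_src, axis=1) / v

xs = ys = np.linspace(0, 120, 121)
X, Y = np.meshgrid(xs, ys)
best = (-np.inf, None)
for t0 in np.arange(8.0, 12.0, 0.1):          # trial origin times
    stack = np.zeros_like(X)
    for (sx, sy), tp in zip(stations, picks):
        pred = t0 + np.hypot(X - sx, Y - sy) / v
        stack += np.exp(-0.5 * ((tp - pred) / sigma) ** 2)  # one kernel per pick
    i = np.unravel_index(np.argmax(stack), stack.shape)
    if stack[i] > best[0]:
        best = (stack[i], (X[i], Y[i], t0))
print(best[1])  # approximate (x, y, t0) of the stacked maximum
```

    Non-pick data (back-azimuth, slowness, felt reports) enter the real associator as additional kernels over the same trial-hypocenter space, which is what makes the approach multimodal.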

  14. Evaluation of Earthquake-Induced Effects on Neighbouring Faults and Volcanoes: Application to the 2016 Pedernales Earthquake

    NASA Astrophysics Data System (ADS)

    Bejar, M.; Alvarez Gomez, J. A.; Staller, A.; Luna, M. P.; Perez Lopez, R.; Monserrat, O.; Chunga, K.; Herrera, G.; Jordá, L.; Lima, A.; Martínez-Díaz, J. J.

    2017-12-01

    It has long been recognized that earthquakes change the stress in the upper crust around the fault rupture and can influence the short-term behaviour of neighbouring faults and volcanoes. Rapid estimates of these stress changes can provide the authorities managing the post-disaster situation with a useful tool to identify and monitor potential threats and to update the estimates of seismic and volcanic hazard in a region. Space geodesy is now routinely used following an earthquake to image the displacement of the ground and estimate the rupture geometry and the distribution of slip. Using the obtained source model, it is possible to evaluate the remaining moment deficit and to infer the stress changes on nearby faults and volcanoes produced by the earthquake, which can be used to identify which faults and volcanoes are brought closer to failure or activation. Although these procedures are commonly used today, the transfer of these results to the authorities managing the post-disaster situation is not straightforward, and thus their usefulness is reduced in practice. Here we propose a methodology to evaluate the potential influence of an earthquake on nearby faults and volcanoes and create easy-to-understand maps for decision-making support after an earthquake. We apply this methodology to the Mw 7.8, 2016 Ecuador earthquake. Using Sentinel-1 SAR and continuous GPS data, we measure the coseismic ground deformation and estimate the distribution of slip. Then we use this model to evaluate the moment deficit on the subduction interface and changes of stress on the surrounding faults and volcanoes. The results are compared with the seismic and volcanic events that have occurred after the earthquake. We discuss the potential and limits of the methodology and the lessons learnt from discussion with local authorities.

  15. Performance of Buildings in the 2009 Western Sumatra Earthquake

    NASA Astrophysics Data System (ADS)

    Deierlein, G.; Hart, T.; Alexander, N.; Hausler, E.; Henderson, S.; Wood, K.; Cedillos, V.; Wijanto, S.; Cabrera, C.; Rudianto, S.

    2009-12-01

    The M7.6 earthquake of 30 September 2009 in Western Sumatra, Indonesia caused significant damage and collapse to hundreds of buildings and the deaths of 1,117 people. In Padang City, with a population of about 900,000 people, building collapse was the primary cause of deaths and serious injuries (313 deaths and 431 serious injuries). The predominant building construction types in Padang are concrete moment frames with brick infill and masonry bearing wall systems. Concrete frames are common in multistory commercial retail buildings, offices, schools, and hotels; and masonry bearing wall systems are primarily used in low-rise (usually single story) residential and school buildings. In general, buildings that collapsed did not conform to modern seismic engineering practices that are required by the current Indonesian building code and would be expected in regions of moderate to high seismicity. While collapse of multi-story concrete buildings was more prevalent in older buildings (more than 10 years old), there were several newer buildings that collapsed. Primary deficiencies identified in collapsed or severely damaged buildings included: (a) soft or weak stories that failed either by sidesway mechanisms or by shear failures followed by loss of axial capacity of columns, (b) lack of ductile reinforcing bar detailing in concrete beams, columns, and beam-column joints, (c) poor quality concrete and mortar materials and workmanship, (d) vulnerable building configurations and designs with incomplete or deficient load paths, and (e) out-of-plane wall failures in unreinforced (or marginally reinforced) masonry. While these deficiencies may be expected in older buildings, damage and collapse to some modern (or recently renovated) buildings indicates a lack of enforcement of building code provisions for design and construction quality assurance. Many new buildings whose structural systems were undamaged were closed due to extensive earthquake damage to brick infill walls

  16. Earthquake Damage Assessment Using Very High Resolution Satellite Imagery

    NASA Astrophysics Data System (ADS)

    Chiroiu, L.; André, G.; Bahoken, F.; Guillande, R.

    Various studies using satellite imagery have been carried out in recent years to assess natural hazard damage, most of them analyzing floods, hurricanes or landslides. For earthquakes, the medium or small spatial resolution data available in the recent past did not allow reliable identification of damage, because the elements at risk (e.g. buildings or other structures) are too small compared with the pixel size. Recent progress in remote sensing, in terms of both spatial resolution and data processing, makes reliable damage detection for the elements at risk possible. Remote sensing techniques applied to IKONOS (1 meter resolution) and IRS (5 meter resolution) imagery were used to evaluate seismic vulnerability and post-earthquake damage. A fast estimation of losses was performed using a multidisciplinary approach based on earthquake engineering and geospatial analysis. The results, integrated into a GIS database, could be transferred via satellite networks to the rescue teams deployed in the affected zone, in order to better coordinate the emergency operations. The methodology was applied to the cities of Bhuj and Anjar after the 2001 Gujarat (India) earthquake.
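    The pixel-level damage detection described above can be illustrated with a minimal sketch. This is a generic change-detection scheme, not the study's actual algorithm: it assumes co-registered pre- and post-event images as 2-D arrays, and the threshold and array values are invented for illustration.

    ```python
    import numpy as np

    def damage_mask(pre, post, threshold=0.3):
        """Flag pixels whose normalized brightness change exceeds a threshold."""
        pre = pre.astype(float)
        post = post.astype(float)
        # Normalized difference reduces bias from overall illumination changes.
        diff = np.abs(post - pre) / (pre + post + 1e-9)
        return diff > threshold

    # Toy example: a 4x4 scene where one "building" block collapses (darkens).
    pre = np.full((4, 4), 200.0)
    post = pre.copy()
    post[1:3, 1:3] = 60.0           # collapsed structure reflects less light
    mask = damage_mask(pre, post)
    damaged_fraction = mask.mean()  # fraction of the scene flagged as damaged
    ```

    In practice, meter-scale imagery additionally requires co-registration, shadow handling, and object-level (per-building) aggregation before damage grades can be assigned.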

  17. The Loma Prieta, California, Earthquake of October 17, 1989: Earthquake Occurrence

    USGS Publications Warehouse

    Coordinated by Bakun, William H.; Prescott, William H.

    1993-01-01

    Professional Paper 1550 seeks to understand the M6.9 Loma Prieta earthquake itself. It examines how the fault that generated the earthquake ruptured, searches for and evaluates precursors that may have indicated an earthquake was coming, reviews forecasts of the earthquake, and describes the geology of the earthquake area and the crustal forces that affect this geology. Some significant findings were: * Slip during the earthquake occurred on 35 km of fault at depths ranging from 7 to 20 km. Maximum slip was approximately 2.3 m. The earthquake may not have released all of the strain stored in rocks next to the fault, indicating that the potential for another damaging earthquake in the Santa Cruz Mountains in the near future may still exist. * The earthquake involved a large amount of uplift on a dipping fault plane. Pre-earthquake conventional wisdom was that large earthquakes in the Bay area occurred as horizontal displacements on predominantly vertical faults. * The fault segment that ruptured approximately coincided with a fault segment identified in 1988 as having a 30% probability of generating a M7 earthquake in the next 30 years. This was one of more than 20 relevant earthquake forecasts made in the 83 years before the earthquake. * Calculations show that the Loma Prieta earthquake changed stresses on nearby faults in the Bay area. In particular, the earthquake reduced stresses on the Hayward Fault, which decreased the frequency of small earthquakes on it. * Geological and geophysical mapping indicates that, although the San Andreas Fault can be mapped as a throughgoing fault in the epicentral region, the southwest-dipping Loma Prieta rupture surface is a separate fault strand and one of several along this part of the San Andreas that may be capable of generating earthquakes.

  18. PAGER--Rapid assessment of an earthquake's impact

    USGS Publications Warehouse

    Wald, D.J.; Jaiswal, K.; Marano, K.D.; Bausch, D.; Hearne, M.

    2010-01-01

    PAGER (Prompt Assessment of Global Earthquakes for Response) is an automated system that produces content concerning the impact of significant earthquakes around the world, informing emergency responders, government and aid agencies, and the media of the scope of the potential disaster. PAGER rapidly assesses earthquake impacts by comparing the population exposed to each level of shaking intensity with models of economic and fatality losses based on past earthquakes in each country or region of the world. Earthquake alerts--which were formerly sent based only on event magnitude and location, or population exposure to shaking--now will also be generated based on the estimated range of fatalities and economic losses.
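    The loss logic described above can be sketched in a few lines: combine the population exposed in each shaking-intensity bin with an empirical fatality rate per bin, then map the estimate to an alert level. The exposures and rates below are invented for illustration, not PAGER's calibrated country-specific models.

    ```python
    # People exposed per Modified Mercalli Intensity (MMI) bin -- illustrative.
    exposure = {6: 500_000, 7: 120_000, 8: 30_000, 9: 4_000}
    # Empirical deaths per exposed person per bin -- illustrative.
    fatality_rate = {6: 0.0, 7: 1e-5, 8: 1e-4, 9: 1e-3}

    # Expected fatalities = sum over bins of exposure x rate.
    expected_fatalities = sum(exposure[mmi] * fatality_rate[mmi] for mmi in exposure)

    def alert_level(fatalities):
        """Map an estimated fatality count to a PAGER-style color alert."""
        if fatalities < 1:
            return "green"
        if fatalities < 100:
            return "yellow"
        if fatalities < 1000:
            return "orange"
        return "red"

    level = alert_level(expected_fatalities)
    ```

    The real system reports a probability distribution over loss ranges rather than a single number, which is what allows alerting on the "estimated range of fatalities and economic losses" mentioned above.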

  19. How to Retain Postgraduate Students in Engineering Programmes: A Practical Perspective

    ERIC Educational Resources Information Center

    Le, Khoa N.; Tam, Vivian W. Y.

    2008-01-01

    Six factors for pursuing an engineering postgraduate programme at Griffith University including (i) programme quality; (ii) employment prospects; (iii) practicality; (iv) personal interest; (v) popularity; and (vi) reputation; and 11 factors for not pursuing this engineering programme including (i) employment prospects; (ii) degree of difficulty;…

  20. Sharing best practices in teaching biomedical engineering design.

    PubMed

    Allen, R H; Acharya, S; Jancuk, C; Shoukas, A A

    2013-09-01

    In an effort to share best practices in undergraduate engineering design education, we describe the origin, evolution and the current status of the undergraduate biomedical engineering design team program at Johns Hopkins University. Specifically, we describe the program and judge the quality of the pedagogy by relating it to sponsor feedback, project outcomes, external recognition and student satisfaction. The general pedagogic practices, some of which are unique to Hopkins, that have worked best include: (1) having a hierarchical team structure, selecting team leaders the Spring semester prior to the academic year, and empowering them to develop and manage their teams, (2) incorporating a longitudinal component that includes freshmen as part of the team, (3) having each team choose from among pre-screened clinical problems, (4) developing and fostering relationships with medical faculty, industry and government to allow students access to engineers, clinicians and clinical environments as needed, (5) providing didactic sessions on topics related to requirements for the next presentation, (6) employing judges from engineering, medicine, industry and government to evaluate designs and provide constructive criticism approximately once every 3-4 weeks and (7) requiring students to test the efficacy of their designs. Institutional support and resources are crucial for the design program to flourish. Most importantly, our willingness and flexibility to change the program each year based on feedback from students, sponsors, outcomes and judges provides a mechanism for us to test new approaches, continue or modify those that work well, and eliminate those that do not.

  1. Seismic design and engineering research at the U.S. Geological Survey

    USGS Publications Warehouse

    1988-01-01

    The Engineering Seismology Element of the USGS Earthquake Hazards Reduction Program is responsible for the coordination and operation of the National Strong Motion Network to collect, process, and disseminate earthquake strong-motion data; and, the development of improved methodologies to estimate and predict earthquake ground motion.  Instrumental observations of strong ground shaking induced by damaging earthquakes and the corresponding response of man-made structures provide the basis for estimating the severity of shaking from future earthquakes, for earthquake-resistant design, and for understanding the physics of seismologic failure in the Earth's crust.

  2. On some methods for assessing earthquake predictions

    NASA Astrophysics Data System (ADS)

    Molchan, G.; Romashkova, L.; Peresan, A.

    2017-09-01

    A regional approach to the problem of assessing earthquake predictions inevitably faces a deficit of data. We point out some basic limits of assessment methods reported in the literature, considering the practical case of the performance of the CN pattern recognition method in the prediction of large Italian earthquakes. Along with classical hypothesis testing, a new game approach, the so-called parimutuel gambling (PG) method, is examined. The PG, originally proposed for the evaluation of probabilistic earthquake forecasts, has recently been adapted to the case of 'alarm-based' CN prediction. The PG approach is a non-standard method; therefore it deserves careful examination and theoretical analysis. We show that the alarm-based PG version leads to an almost complete loss of information about predicted earthquakes (even for a large sample). As a result, any conclusions based on the alarm-based PG approach are not to be trusted. We also show that the original probabilistic PG approach does not necessarily identify the genuine forecast correctly among competing seismicity rate models, even when applied to extensive data.
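    The classical hypothesis-testing alternative mentioned above can be sketched for an alarm-based prediction. Under the null hypothesis of random guessing, the number of events falling inside alarms covering a fraction tau of space-time is Binomial(N, tau); the numbers below are invented for illustration and are not from the CN evaluation.

    ```python
    from math import comb

    def p_value(n_events, n_predicted, tau):
        """P(X >= n_predicted) for X ~ Binomial(n_events, tau): the chance that
        random alarms covering fraction tau would score at least as well."""
        return sum(comb(n_events, k) * tau**k * (1 - tau)**(n_events - k)
                   for k in range(n_predicted, n_events + 1))

    # 7 of 10 target events fell inside alarms occupying 30% of space-time.
    p = p_value(10, 7, 0.3)   # small p => success unlikely under random guessing
    ```

    Unlike the alarm-based PG adaptation criticized above, this test retains the information in the full count of predicted events, at the cost of requiring a reasonably sized event sample.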

  3. Molecular biomimetics: utilizing nature's molecular ways in practical engineering.

    PubMed

    Tamerler, Candan; Sarikaya, Mehmet

    2007-05-01

    In nature, proteins are the machinery that accomplish many functions through their specific recognition and interactions in biological systems from single-celled to multicellular organisms. Biomolecule-material interaction is accomplished via molecular specificity, leading to the formation of controlled structures and functions at all scales of dimensional hierarchy. Through evolution, molecular recognition and, consequently, functions developed through successive cycles of mutation and selection. Using biology as a guide, we can now understand, engineer and control peptide-material interactions and exploit these to tailor novel materials and systems for practical applications. We adapted combinatorial biology protocols to display peptide libraries, either on the cell surface or on phages, to select short peptides specific to a variety of practical materials systems. Following the selection step, we determined the kinetics and stability of peptide binding experimentally to understand the bound peptide structure via modeling and its assembly via atomic force microscopy. The peptides were further engineered to have multiple repeats or their amino acid sequences varied to tailor their function. Both nanoparticles and flat inorganic substrates containing multimaterials patterned at the nano- and microscales were used for self-directed immobilization of molecular constructs. The molecular biomimetic approach opens up new avenues for the design and utilization of multifunctional molecular systems with wide ranging applications, from tissue engineering, drug delivery and biosensors, to nanotechnology and bioremediation. Here we give examples of protein-mediated functional materials in biology, peptide selection and engineering with affinity to inorganics, demonstrate potential utilizations in materials science, engineering and medicine, and describe future prospects.

  4. A moment in time: emergency nurses and the Canterbury earthquakes.

    PubMed

    Richardson, S; Ardagh, M; Grainger, P; Robinson, V

    2013-06-01

    To outline the impact of the Canterbury, New Zealand (NZ) earthquakes on Christchurch Hospital, and the experiences of emergency nurses during this time. NZ has experienced earthquakes and aftershocks centred in the Canterbury region of the South Island. The location of these, around and within the major city of Christchurch, was unexpected and associated with previously unknown fault lines. While the highest magnitude quake occurred in September 2010, registering 7.1 on the Richter scale, it was the magnitude 6.3 event on 22 February 2011 which was associated with the greatest injury burden and loss of life. Staff working in the only emergency department in the city were faced with an external emergency while also being directly affected as part of the disaster. SOURCES OF EVIDENCE: This paper developed following interviews with nurses who worked during this period, and draws on literature related to healthcare responses to earthquakes and natural disasters. The establishment of an injury database allowed for an accurate picture to emerge of the injury burden, and each of the authors was present and worked in a clinical capacity during the earthquake. Nurses played a significant role in the response to the earthquakes and its aftermath. However, little is known regarding the impact of this, either in personal or professional terms. This paper presents an overview of the earthquakes and experiences of nurses working during this time, identifying a range of issues that will benefit from further exploration and research. It seeks to provide a sense of the experiences and the potential meanings that were derived from being part of this 'moment in time'. Examples of innovations in practice emerged during the earthquake response and a number of recommendations for nursing practice are identified. © 2013 The Authors. International Nursing Review © 2013 International Council of Nurses.

  5. Overview of the critical disaster management challenges faced during Van 2011 earthquakes.

    PubMed

    Tolon, Mert; Yazgan, Ufuk; Ural, Derin N; Goss, Kay C

    2014-01-01

    On October 23, 2011, a M7.2 earthquake caused damage in a widespread area of the Van province in eastern Turkey. This strong earthquake was followed by a M5.7 earthquake on November 9, 2011. This sequence of damaging earthquakes led to 644 fatalities. Management during and after these earthquake disasters posed many critical challenges. In this article, an overview of these challenges is presented based on the authors' observations in the aftermath of the disaster. The article presents the characteristics of the 2011 Van earthquakes, followed by the key information related to the four main phases (i.e., preparedness, mitigation, response, and recovery) of the disaster in Van. Potential strategies for improving disaster management practice are identified, and a set of recommendations is proposed to improve the existing situation.

  6. Operational earthquake forecasting can enhance earthquake preparedness

    USGS Publications Warehouse

    Jordan, T.H.; Marzocchi, W.; Michael, A.J.; Gerstenberger, M.C.

    2014-01-01

    We cannot yet predict large earthquakes in the short term with much reliability and skill, but the strong clustering exhibited in seismic sequences tells us that earthquake probabilities are not constant in time; they generally rise and fall over periods of days to years in correlation with nearby seismic activity. Operational earthquake forecasting (OEF) is the dissemination of authoritative information about these time‐dependent probabilities to help communities prepare for potentially destructive earthquakes. The goal of OEF is to inform the decisions that people and organizations must continually make to mitigate seismic risk and prepare for potentially destructive earthquakes on time scales from days to decades. To fulfill this role, OEF must provide a complete description of the seismic hazard—ground‐motion exceedance probabilities as well as short‐term rupture probabilities—in concert with the long‐term forecasts of probabilistic seismic‐hazard analysis (PSHA).
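    The time-dependent probabilities that OEF disseminates are typically derived from a forecast rate. A minimal sketch, assuming a Poisson occurrence model (p = 1 - exp(-rate x window)); the rates below are illustrative, not from any operational forecast.

    ```python
    import math

    def occurrence_probability(rate_per_day, window_days):
        """Poisson probability of at least one event in the forecast window."""
        return 1.0 - math.exp(-rate_per_day * window_days)

    # One-week probabilities of a damaging event, before and after nearby
    # seismic activity raises the short-term rate (illustrative rates).
    background = occurrence_probability(1e-5, 7)
    elevated = occurrence_probability(1e-3, 7)
    ```

    This captures the point made above: the probabilities rise and fall with nearby seismicity, even though the absolute weekly values usually remain small.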

  7. Preparing a population for an earthquake like Chi-Chi: The Great Southern California ShakeOut

    USGS Publications Warehouse

    Jones, Lucile M.; ,

    2009-01-01

    The Great Southern California ShakeOut was a week of special events featuring the largest earthquake drill in United States history. On November 13, 2008, over 5 million southern Californians pretended that a magnitude-7.8 earthquake had occurred and practiced actions that could reduce its impact on their lives. The primary message of the ShakeOut is that what we do now, before a big earthquake, will determine what our lives will be like after. The drill was based on a scenario of the impacts and consequences of such an earthquake on the Southern San Andreas Fault, developed by over 300 experts led by the U.S. Geological Survey in partnership with the California Geological Survey, the Southern California Earthquake Center, Earthquake Engineering Research Institute, lifeline operators, emergency services and many other organizations. The ShakeOut campaign was designed and implemented by earthquake scientists, emergency managers, sociologists, art designers and community participants. The means of communication were developed using results from sociological research on what encouraged people to take action. This was structured around four objectives: 1) consistent messages – people are more inclined to believe something when they hear the same thing from multiple sources; 2) visual reinforcement – people are more inclined to do something they see other people doing; 3) encourage “milling” or discussing contemplated action – people need to discuss an action with others they care about before committing to undertaking it; and 4) focus on concrete actions – people are more likely to prepare for a set of concrete consequences of a particular hazard than for an abstract concept of risk. The goals of the ShakeOut were established in Spring 2008 and were: 1) to register 5 million people to participate in the drill; 2) to change the culture of earthquake preparedness in southern California; and 3) to reduce earthquake losses in southern California. All of these

  8. Long aftershock sequences within continents and implications for earthquake hazard assessment.

    PubMed

    Stein, Seth; Liu, Mian

    2009-11-05

    One of the most powerful features of plate tectonics is that the known plate motions give insight into both the locations and average recurrence interval of future large earthquakes on plate boundaries. Plate tectonics gives no insight, however, into where and when earthquakes will occur within plates, because the interiors of ideal plates should not deform. As a result, within plate interiors, assessments of earthquake hazards rely heavily on the assumption that the locations of small earthquakes shown by the short historical record reflect continuing deformation that will cause future large earthquakes. Here, however, we show that many of these recent earthquakes are probably aftershocks of large earthquakes that occurred hundreds of years ago. We present a simple model predicting that the length of aftershock sequences varies inversely with the rate at which faults are loaded. Aftershock sequences within the slowly deforming continents are predicted to be significantly longer than the decade typically observed at rapidly loaded plate boundaries. These predictions are in accord with observations. So the common practice of treating continental earthquakes as steady-state seismicity overestimates the hazard in presently active areas and underestimates it elsewhere.
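    The inverse scaling between aftershock duration and loading rate can be made concrete with a toy calculation. The anchor values below are illustrative round numbers in the spirit of the model, not fitted parameters from the paper.

    ```python
    def aftershock_duration_years(ref_duration_yr, ref_loading_mm_yr, loading_mm_yr):
        """Duration ~ 1 / loading rate, anchored to a reference plate boundary."""
        return ref_duration_yr * ref_loading_mm_yr / loading_mm_yr

    # Plate boundary: ~10-year sequences at ~30 mm/yr loading (illustrative).
    boundary = aftershock_duration_years(10, 30, 30)
    # Continental interior fault loaded at ~0.3 mm/yr: century-to-millennium
    # sequences, consistent with aftershocks persisting hundreds of years.
    interior = aftershock_duration_years(10, 30, 0.3)
    ```

    The two-order-of-magnitude gap in loading rate directly yields the two-order-of-magnitude gap in sequence length that underlies the paper's hazard argument.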

  9. Seismic risk management of non-engineered buildings

    NASA Astrophysics Data System (ADS)

    Winar, Setya

    Earthquakes have long been feared as one of nature's most terrifying and devastating events. Although seismic codes exist in countries with high seismic risk to save lives and reduce human suffering, earthquakes continue to cause tragic events with high death tolls, particularly due to the collapse of widespread non-engineered buildings lacking seismic resistance in developing countries such as Indonesia. The implementation of seismic codes in non-engineered construction is the key to ensuring earthquake safety. In practice, such implementation is not simple, because it comprises all forms of cross-disciplinary and cross-sectoral linkages at different levels of understanding, commitment, and skill. This suggests that a widely agreed framework can help to harmonise the various perspectives. Hence, this research is aimed at developing an integrated framework for guiding and monitoring seismic risk reduction of non-engineered buildings in Indonesia via a risk management method. Primarily, the proposed framework has drawn heavily on the wider literature, on three existing frameworks from around the world, and on the contribution of the various stakeholders who participated in the study. A postal questionnaire survey, selected interviews, and workshop events constituted the primary data collection methods. To achieve a robust framework, two further workshop events, conducted in Yogyakarta City and Bengkulu City in Indonesia, were carried out to test practicality and validity and to identify any improvement requirements.
The data collected were analysed with the assistance of the SPSS and NVivo software programmes. This research found that the content of the proposed framework comprises 63 pairs of characteristic-indicators complemented by (a) three important factors of effective seismic risk management of non-engineered buildings, (b) three guiding principles for sustainable dissemination to the grass-roots communities and (c

  10. The quest for better quality-of-life - learning from large-scale shaking table tests

    NASA Astrophysics Data System (ADS)

    Nakashima, M.; Sato, E.; Nagae, T.; Kunio, F.; Takahito, I.

    2010-12-01

    Earthquake engineering has its origins in the practice of “learning from actual earthquakes and earthquake damages.” That is, we recognize serious problems by witnessing the actual damage to our structures, and then we develop and apply engineering solutions to solve these problems. This tradition in earthquake engineering, i.e., “learning from actual damage,” was an obvious engineering response to earthquakes and arose naturally as a practice in a civil and building engineering discipline that traditionally places more emphasis on experience than do other engineering disciplines. But with the rapid progress of urbanization, as society becomes denser, and as the many components that form our society interact with increasing complexity, the potential damage with which earthquakes threaten the society also increases. In such an era, the approach of “learning from actual earthquake damages” becomes unacceptably dangerous and expensive. Among the practical alternatives to the old practice is to “learn from quasi-actual earthquake damages.” One tool for experiencing earthquake damages without attendant catastrophe is the large shaking table. E-Defense, the largest one we have, was developed in Japan after the 1995 Hyogoken-Nanbu (Kobe) earthquake. Since its inauguration in 2005, E-Defense has conducted over forty full-scale or large-scale shaking table tests, applied to a variety of structural systems. The tests supply detailed data on actual behavior and collapse of the tested structures, offering the earthquake engineering community opportunities to experience and assess the actual seismic performance of the structures, and to help society prepare for earthquakes. Notably, the data were obtained without having to wait for the aftermaths of actual earthquakes. Earthquake engineering has always been about life safety, but in recent years maintaining the quality of life has also become a critical issue. Quality-of-life concerns include nonstructural

  11. Earthquakes for Kids

    MedlinePlus

    ... across a fault to learn about past earthquakes. Science Fair Projects A GPS instrument measures slow movements of the ground. Become an Earthquake Scientist Cool Earthquake Facts Today in Earthquake History A scientist stands in ...

  12. A numerical simulation strategy on occupant evacuation behaviors and casualty prediction in a building during earthquakes

    NASA Astrophysics Data System (ADS)

    Li, Shuang; Yu, Xiaohui; Zhang, Yanjuan; Zhai, Changhai

    2018-01-01

    Casualty prediction in a building during earthquakes supports economic loss estimation within the performance-based earthquake engineering methodology. Although post-earthquake observations reveal that evacuation affects the number of occupant casualties, few current studies consider occupant movements within the building in casualty prediction procedures. To bridge this knowledge gap, a numerical simulation method using a refined cellular automata model is presented, which can describe various occupant dynamic behaviors and building dimensions. The simulation of occupant evacuation is verified against a recorded evacuation process from a school classroom in the 2013 Ya'an earthquake in China. The occupant casualties in a building under earthquakes are evaluated by coupling the building collapse process simulated by the finite element method, the occupant evacuation simulation, and casualty occurrence criteria, with time and space synchronization. A case study of casualty prediction in a building during an earthquake is provided to demonstrate the effect of occupant movements on casualty prediction.
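    A cellular-automaton evacuation model of the kind described above can be sketched minimally: occupants on a grid step toward an exit, with at most one occupant per cell. This is an illustrative toy in the spirit of the method; the actual model's coupling to collapse simulation and casualty criteria is omitted, and the grid, movement rule, and exit location are assumptions.

    ```python
    def evacuate(occupants, exit_cell, max_steps=100):
        """Greedy CA update: each time step, every occupant moves one cell
        toward the exit (no two occupants may share a cell); returns the
        number of steps until all occupants have left."""
        occupants = set(occupants)
        for step in range(1, max_steps + 1):
            moved = set()
            for (r, c) in sorted(occupants):
                # Move one cell along the row first, then along the column.
                nr = r + (exit_cell[0] > r) - (exit_cell[0] < r)
                nc = c + (exit_cell[1] > c) - (exit_cell[1] < c) if nr == r else c
                target = (nr, nc)
                if target == exit_cell:
                    continue  # occupant reaches the exit and leaves
                # On collision with an already-moved occupant, stay in place.
                moved.add(target if target not in moved else (r, c))
            occupants = moved
            if not occupants:
                return step
        return max_steps

    # Three occupants in a small room with the exit at (1, 1).
    steps = evacuate({(0, 0), (0, 2), (2, 2)}, exit_cell=(1, 1))
    ```

    Coupling such a model to a collapse simulation amounts to marking grid cells unusable (or lethal) as a function of time, which is where the time and space synchronization mentioned above comes in.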

  13. Earthquake Protection Measures for People with Disabilities

    NASA Astrophysics Data System (ADS)

    Gountromichou, C.; Kourou, A.; Kerpelis, P.

    2009-04-01

    The problem of seismic safety for people with disabilities not only exists but is urgent and of primary importance. Working towards disability equality, the Earthquake Planning and Protection Organization of Greece (E.P.P.O.) has developed an educational scheme for people with disabilities to help them develop skills to protect themselves and to take the appropriate safety measures before, during and after an earthquake. The framework of this initiative includes a number of actions that have already been undertaken, including the following: a. Recently, the main guidelines have been published to help people who have physical, cognitive, visual, or auditory disabilities to cope with a destructive earthquake. Of great importance, in the case of people with disabilities, is to be prepared for the disaster, with several measures that must be taken starting today. In the pre-earthquake period, it is important that these people, in addition to other measures, do the following: - Create a Personal Support Network. The Personal Support Network should be a group of at least three trusted people that can assist the disabled person to prepare for a disastrous event and to recover after it. - Complete a Personal Assessment. The environment may change after a destructive earthquake. People with disabilities are encouraged to make a list of their personal needs and their resources for meeting them in a disaster environment. b. Lectures and training seminars on earthquake protection are given for students, teachers and educators in Special Schools for disabled people, mainly to inform and familiarize them with earthquakes and with safety measures. c. Many earthquake drills have already taken place, for each disability, in order to share good practices and lessons learned, to further disaster reduction and to identify gaps and challenges. 
The final aim of this action is for all people with disabilities to be well informed and motivated towards a culture of earthquake

  14. Improving the Practical Education of Chemical and Pharmaceutical Engineering Majors in Chinese Universities

    ERIC Educational Resources Information Center

    Zhao, Feng-qing; Yu, Yi-feng; Ren, Shao-feng; Liu, Shao-jie; Rong, Xin-yu

    2014-01-01

    Practical education in chemical engineering has drawn increasing attention in recent years. This paper discusses two approaches to teaching and learning about experiments among upper-level chemical and pharmaceutical engineering majors in China. On the basis of years of experience in teaching chemical and pharmaceutical engineering, we propose the…

  15. MATLAB Meets LEGO Mindstorms--A Freshman Introduction Course into Practical Engineering

    ERIC Educational Resources Information Center

    Behrens, A.; Atorf, L.; Schwann, R.; Neumann, B.; Schnitzler, R.; Balle, J.; Herold, T.; Telle, A.; Noll, T. G.; Hameyer, K.; Aach, T.

    2010-01-01

    In today's teaching and learning approaches for first-semester students, practical courses more and more often complement traditional theoretical lectures. This practical element allows an early insight into the real world of engineering, augments student motivation, and enables students to acquire soft skills early. This paper describes a new…

  16. Earthquake vulnerability assessment of buildings of ward no. 8 of Haldwani-Kathgodam Municipal Corporation, Uttarakhand, India

    NASA Astrophysics Data System (ADS)

    Bora, Kritika; Pande, Ravindra K.

    2017-07-01

    "Earthquake does not kill people; it is the building which kills people". An earthquake is a sudden event below the surface of the earth that produces vertical and horizontal waves causing destruction. The main aim of this research is to bring to light the unplanned and non-engineered construction practices growing in urban areas. Lack of space and continuous migration from the hills has resulted in multistorey construction. The present study is based on primary data collected through Rapid Visual Screening for the assessment of the vulnerability of buildings. Haldwani-Kathgodam, a new Municipal Corporation located in the foothills of the Himalayas, faces the same problem. The seismic zonation places this area in zone 4 of damage risk, so an assessment of the risk to the built environment is important. This paper presents a systematic and useful way of assessing the physical vulnerability of buildings, and shows how growing pressure on urban areas tends to make the built environment vulnerable to seismic activity. The challenge today is to make our living environment safe. The steadily growing population pressure on urban areas, driven by migration trends in developing countries, is leading to high-rise buildings, lack of planning and reckless construction. To save money, people usually do not obtain approval from a structural engineer. This unplanned and haphazard construction proves non-resistant to earthquakes, costing lives and property. The total number of households in the study area is 543 and the total population is 2,497 (2011). The geologically recent formation of the Himalayas makes the area more prone to seismic events, and its closeness to the Main Boundary Thrust places it in zone 4 of the Seismic Zonation of India, i.e., the High Damage Risk Zone
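    The Rapid Visual Screening scoring used in such surveys can be sketched in a few lines, loosely following the FEMA P-154 idea of a basic structural score adjusted by deficiency modifiers. The scores, modifiers, floor, and cut-off below are illustrative assumptions, not values from this study or from the Indian screening guidelines.

    ```python
    def rvs_score(basic_score, modifiers):
        """Final score = basic score + sum of (negative) modifiers, floored at
        a minimum; lower scores indicate higher expected vulnerability."""
        return max(basic_score + sum(modifiers), 0.3)

    # Example: a concrete frame with a soft story (-1.0) and no evidence of
    # seismic design review (-0.7) -- both modifiers are illustrative.
    score = rvs_score(2.5, [-1.0, -0.7])
    needs_detailed_evaluation = score < 2.0   # common RVS-style cut-off
    ```

    A screener walks the building exterior, ticks the observed deficiencies, and the resulting score flags which buildings warrant a detailed engineering evaluation, exactly the triage a rapidly urbanizing zone 4 town needs.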

  17. Missing great earthquakes

    USGS Publications Warehouse

    Hough, Susan E.

    2013-01-01

    The occurrence of three earthquakes with moment magnitude (Mw) greater than 8.8 and six earthquakes larger than Mw 8.5, since 2004, has raised interest in the long-term global rate of great earthquakes. Past studies have focused on the analysis of earthquakes since 1900, which roughly marks the start of the instrumental era in seismology. Before this time, the catalog is less complete and magnitude estimates are more uncertain. Yet substantial information is available for earthquakes before 1900, and the catalog of historical events is being used increasingly to improve hazard assessment. Here I consider the catalog of historical earthquakes and show that approximately half of all Mw ≥ 8.5 earthquakes are likely missing or underestimated in the 19th century. I further present a reconsideration of the felt effects of the 8 February 1843, Lesser Antilles earthquake, including a first thorough assessment of felt reports from the United States, and show it is an example of a known historical earthquake that was significantly larger than initially estimated. The results suggest that incorporation of best available catalogs of historical earthquakes will likely lead to a significant underestimation of seismic hazard and/or the maximum possible magnitude in many regions, including parts of the Caribbean.
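    The catalog-completeness argument above reduces to a back-of-the-envelope comparison: the 19th-century count of great earthquakes versus the count expected from the instrumental-era rate. The numbers below are round illustrative values, not the paper's actual catalog statistics.

    ```python
    # Assumed instrumental-era rate of Mw >= 8.5 events per century (illustrative).
    modern_rate_per_century = 10
    # Assumed cataloged Mw >= 8.5 count for 1800-1899 (illustrative).
    observed_19th_century = 5

    # Fraction of events apparently missing or underestimated in the old catalog.
    missing_fraction = 1 - observed_19th_century / modern_rate_per_century
    ```

    A deficit of this kind can reflect either genuine rate variability or incomplete cataloging; the paper's re-examination of individual events such as the 1843 Lesser Antilles earthquake is what distinguishes the two.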

  18. Development of Earthquake Emergency Response Plan for Tribhuvan International Airport, Kathmandu, Nepal

    DTIC Science & Technology

    2013-02-01

    Contract number W911NF-12-1-0282. In the past, big earthquakes in Nepal (see Figure 1.1) have caused a huge number of casualties and damage to structures. The Great Nepal-Bihar... UBC Earthquake Engineering Research Facility, 2235 East Mall, Vancouver, BC, Canada V6T 1Z4. Phone: 604 822-6203; Fax: 604 822-6901.

  19. Implementing Effective Mission Systems Engineering Practices During Early Project Formulation Phases

    NASA Technical Reports Server (NTRS)

    Moton, Tryshanda

    2016-01-01

    Developing and implementing a plan for a NASA space mission can be a complicated process. The needs, goals, and objectives of any proposed mission or technology must be assessed early in the Project Life Cycle. The key to successful development of a space mission or flight project is the inclusion of systems engineering in early project formulation, namely during Pre-phase A, Phase A, and Phase B of the NASA Project Life Cycle. When a space mission or new technology is in pre-development, or "pre-Formulation", feasibility must be determined based on cost, schedule, and risk. Inclusion of systems engineering during project formulation is key because, in addition to assessing feasibility, design concepts are developed and alternatives to design concepts are evaluated. Lack of systems engineering involvement early in project formulation can result in increased risks later in the implementation and operations phases of the project. One proven method for effective systems engineering practice during the pre-Formulation Phase is the use of a mission conceptual design or technology development laboratory, such as the Mission Design Lab (MDL) at NASA's Goddard Space Flight Center (GSFC). This paper will review the engineering process practiced routinely in the MDL for successful mission or project development during the pre-Formulation Phase.

  20. Feasibility study of short-term earthquake prediction using ionospheric anomalies immediately before large earthquakes

    NASA Astrophysics Data System (ADS)

    Heki, K.; He, L.

    2017-12-01

    We have shown that positive and negative electron density anomalies emerge above faults immediately before they rupture: roughly 40/20/10 minutes before Mw 9/8/7 earthquakes (Heki, 2011 GRL; Heki and Enomoto, 2013 JGR; He and Heki, 2017 JGR). These signals are stronger for earthquakes with larger Mw and under higher background vertical TEC (total electron content) (Heki and Enomoto, 2015 JGR). The epicenter and the positive and negative anomalies align along the local geomagnetic field (He and Heki, 2016 GRL), suggesting that electric fields within the ionosphere are responsible for the anomalies (Kuo et al., 2014 JGR; Kelley et al., 2017 JGR). Here we consider the next Nankai Trough earthquake, which may occur within a few tens of years in Southwest Japan, and discuss whether its preseismic signatures could be recognized in TEC by real-time GNSS observations. During high geomagnetic activity, large-scale traveling ionospheric disturbances (LSTID) often propagate from the auroral ovals toward mid-latitude regions and leave signatures similar to preseismic anomalies; this is the main obstacle to using preseismic TEC changes for practical short-term earthquake prediction. In this presentation, we show that the same anomalies appeared 40 minutes before the mainshock above northern Australia, the geomagnetically conjugate point of the 2011 Tohoku-oki earthquake epicenter. This not only demonstrates that electric fields play a role in producing the preseismic TEC anomalies but also offers a way to discriminate preseismic anomalies from those caused by LSTID. By monitoring TEC in conjugate areas of the two hemispheres, anomalies with simultaneous onsets can be recognized as caused by within-ionosphere electric fields (e.g., preseismic anomalies, night-time MSTID), and anomalies without simultaneous onsets as gravity-wave disturbances (e.g., LSTID, daytime MSTID).

  1. Losses to single-family housing from ground motions in the 1994 Northridge, California, earthquake

    USGS Publications Warehouse

    Wesson, R.L.; Perkins, D.M.; Leyendecker, E.V.; Roth, R.J.; Petersen, M.D.

    2004-01-01

    The distributions of insured losses to single-family housing following the 1994 Northridge, California, earthquake for 234 ZIP codes can be satisfactorily modeled with gamma distributions. Regressions of the parameters in the gamma distribution on estimates of ground motion, derived from ShakeMap estimates or from interpolated observations, provide a basis for developing curves of the conditional probability of loss given a ground motion. Comparison of the resulting estimates of aggregate loss with the actual aggregate loss gives satisfactory agreement for several different ground-motion parameters. Estimates of loss based on a deterministic spatial model of the earthquake ground motion, using standard attenuation relationships and NEHRP soil factors, give satisfactory results for some ground-motion parameters if the input ground motions are increased about one and one-half standard deviations above the median, reflecting the fact that the ground motions for the Northridge earthquake tended to be higher than the median ground motion for other earthquakes of similar magnitude. The results show promise for estimating insured losses to a similar building stock under future earthquake loading. © 2004, Earthquake Engineering Research Institute.
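    Fitting a gamma distribution to per-ZIP-code loss data, as described above, can be sketched with a simple method-of-moments estimate. The paper does not specify its fitting procedure; the data below are synthetic and the function name is illustrative:

```python
import random
import statistics

def fit_gamma_moments(losses):
    """Method-of-moments fit of a gamma distribution:
    shape k = mean^2 / variance, scale theta = variance / mean."""
    m = statistics.fmean(losses)
    v = statistics.pvariance(losses)
    return m * m / v, v / m

random.seed(0)
# Synthetic per-policy loss ratios drawn from a gamma with k=2, theta=0.05;
# real inputs would be the insured-loss data for one ZIP code.
sample = [random.gammavariate(2.0, 0.05) for _ in range(50_000)]
k, theta = fit_gamma_moments(sample)
print(round(k, 2), round(theta, 3))
```

    Regressing the fitted (k, theta) pairs on a ground-motion measure per ZIP code would then yield the conditional loss curves described in the abstract.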

  2. [Medical rescue of China National Earthquake Disaster Emergency Search and Rescue Team in Lushan earthquake].

    PubMed

    Liu, Ya-hua; Yang, Hui-ning; Liu, Hui-liang; Wang, Fan; Hu, Li-bin; Zheng, Jing-chen

    2013-05-01

    To summarize and analyze the medical mission of the China National Earthquake Disaster Emergency Search and Rescue Team (CNESAR) in the Lushan earthquake, and to promote the effectiveness of medical rescue incorporated with search and rescue. Retrospective analysis of CNESAR medical work data from April 21 to April 27, 2013, during the Lushan earthquake rescue, including medical staff dispatch and the wounded cases treated. The medical corps comprised 22 members: 2 administrators; 11 doctors covering 11 specialties [emergency medicine, orthopedics (joints and limbs, spinal), obstetrics and gynecology, gastroenterology, cardiology, ophthalmology, anesthesiology, medical rescue, health epidemic prevention, and clinical laboratory]; 1 ultrasound technician; 5 nurses; 1 pharmacist; 1 medical instrument engineer; and 1 office worker for publicity. Two members held psychological consultant qualifications. The medical work covered seven aspects: medical care for CNESAR members, first-aid cooperation with search and rescue on site, clinical work in refugee camps, medical rounds for scattered villagers, evacuation of the wounded, mental health intervention, and sanitary and anti-epidemic work. The work covered 24 small towns, and medical staff established 3 clinics, at Taiping Town and Shuangshi Town of Lushan County and at Baoxing County, where medical rescue, mental health intervention for the elderly and children, and sanitary and anti-epidemic work were performed. The corps successfully evacuated 2 severely wounded patients and treated over a thousand wounded, most with soft tissue injuries, external injuries, respiratory tract infections, diarrhea, or heat stroke. Compared with the rescue action in the 2008 Wenchuan earthquake, the assembly and departure of the rescue team in the Lushan earthquake, the traffic control in the disaster area, and the self-aid and buddy aid

  3. Modeling the behavior of an earthquake base-isolated building.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coveney, V. A.; Jamil, S.; Johnson, D. E.

    1997-11-26

    Protecting a structure against earthquake excitation by supporting it on laminated elastomeric bearings has become a widely accepted practice. The ability to perform accurate simulation of the system, including FEA of the bearings, would be desirable--especially for key installations. In this paper attempts to model the behavior of elastomeric earthquake bearings are outlined. Attention is focused on modeling highly-filled, low-modulus, high-damping elastomeric isolator systems; comparisons are made between standard triboelastic solid model predictions and test results.

  4. Engineering aspects of seismological studies in Peru

    USGS Publications Warehouse

    Ocola, L.

    1982-01-01

    In retrospect, the Peruvian national long-range earthquake-study program began after the catastrophic earthquake of May 31, 1970. This earthquake triggered a large snow avalanche from Huascarán mountain, killing over 60,000 people and burying small cities and dozens of villages in mud in the Andean valley of Callejón de Huaylas, Huaraz. Since then, great efforts have been made to learn about the natural seismic environment and its engineering and social aspects. The Organization of American States (OAS) has been one of the most important agencies in the development of the program.

  5. Making the Handoff from Earthquake Hazard Assessments to Effective Mitigation Measures (Invited)

    NASA Astrophysics Data System (ADS)

    Applegate, D.

    2010-12-01

    This year has witnessed a barrage of large earthquakes worldwide with the resulting damages ranging from inconsequential to truly catastrophic. We cannot predict when earthquakes will strike, but we can build communities that are resilient to strong shaking as well as to secondary hazards such as landslides and liquefaction. The contrasting impacts of the magnitude-7 earthquake that struck Haiti in January and the magnitude-8.8 event that struck Chile in April underscore the difference that mitigation and preparedness can make. In both cases, millions of people were exposed to severe shaking, but deaths in Chile were measured in the hundreds rather than the hundreds of thousands that perished in Haiti. Numerous factors contributed to these disparate outcomes, but the most significant is the presence of strong building codes in Chile and their total absence in Haiti. The financial cost of the Chilean earthquake still represents an unacceptably high percentage of that nation’s gross domestic product, a reminder that life safety is the paramount, but not the only, goal of disaster risk reduction measures. For building codes to be effective, both in terms of lives saved and economic cost, they need to reflect the hazard as accurately as possible. As one of four federal agencies that make up the congressionally mandated National Earthquake Hazards Reduction Program (NEHRP), the U.S. Geological Survey (USGS) develops national seismic hazard maps that form the basis for seismic provisions in model building codes through the Federal Emergency Management Agency and private-sector practitioners. This cooperation is central to NEHRP, which both fosters earthquake research and establishes pathways to translate research results into implementation measures. That translation depends on the ability of hazard-focused scientists to interact and develop mutual trust with risk-focused engineers and planners. Strengthening that interaction is an opportunity for the next generation

  6. Earthquake and ambient vibration monitoring of the steel-frame UCLA factor building

    USGS Publications Warehouse

    Kohler, M.D.; Davis, P.M.; Safak, E.

    2005-01-01

    Dynamic property measurements of the moment-resisting steel-frame University of California, Los Angeles, Factor building are being made to assess how forces are distributed over the building. Fourier amplitude spectra have been calculated from several intervals of ambient vibrations, a 24-hour period of strong winds, and from the 28 March 2003 Encino, California (ML = 2.9), the 3 September 2002 Yorba Linda, California (ML = 4.7), and the 3 November 2002 Central Alaska (Mw = 7.9) earthquakes. Measurements made from the ambient vibration records show that the first-mode frequency of horizontal vibration is between 0.55 and 0.6 Hz. The second horizontal mode has a frequency between 1.6 and 1.9 Hz. In contrast, the first-mode frequencies measured from earthquake data are about 0.05 to 0.1 Hz lower than those corresponding to ambient vibration recordings, indicating softening of the soil-structure system as amplitudes become larger. The frequencies reverted to pre-earthquake levels within five minutes of the Yorba Linda earthquake. Shaking due to strong winds that occurred during the Encino earthquake dominates the frequency decrease, which correlates in time with the duration of the strong winds. The first shear wave recorded from the Encino and Yorba Linda earthquakes takes about 0.4 sec to travel up the 17-story building. © 2005, Earthquake Engineering Research Institute.
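    The mode-frequency picking described above amounts to locating peaks in a Fourier amplitude spectrum. A minimal stdlib-only sketch with a synthetic two-mode record (mode frequencies chosen to match the reported ranges; not the study's data or code):

```python
import math

def fourier_amplitude(signal, fs, freqs):
    """Naive discrete Fourier amplitude at chosen frequencies (Hz)."""
    n = len(signal)
    amps = []
    for f in freqs:
        re = sum(x * math.cos(2 * math.pi * f * i / fs) for i, x in enumerate(signal))
        im = sum(x * math.sin(2 * math.pi * f * i / fs) for i, x in enumerate(signal))
        amps.append(math.hypot(re, im) * 2 / n)
    return amps

fs = 10.0  # sampling rate, samples/second
t = [i / fs for i in range(2000)]  # 200 s record
# Synthetic roof record: first mode at 0.55 Hz, second mode at 1.75 Hz
rec = [math.sin(2 * math.pi * 0.55 * ti) + 0.4 * math.sin(2 * math.pi * 1.75 * ti)
       for ti in t]
freqs = [0.05 * k for k in range(1, 60)]  # scan 0.05 .. 2.95 Hz
amps = fourier_amplitude(rec, fs, freqs)
peak = freqs[amps.index(max(amps))]
print(round(peak, 2))  # first-mode frequency recovered from the spectrum
```

    In practice the study would use windowed FFTs of real accelerometer records; the brute-force DFT above only illustrates the peak-picking idea.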

  7. Response and recovery lessons from the 2010-2011 earthquake sequence in Canterbury, New Zealand

    USGS Publications Warehouse

    Pierepiekarz, Mark; Johnston, David; Berryman, Kelvin; Hare, John; Gomberg, Joan S.; Williams, Robert A.; Weaver, Craig S.

    2014-01-01

    The impacts and opportunities that result when low-probability moderate earthquakes strike an urban area similar to many throughout the US were vividly conveyed in a one-day workshop in which social and Earth scientists, public officials, engineers, and an emergency manager shared their experiences of the earthquake sequence that struck the city of Christchurch and the surrounding Canterbury region of New Zealand in 2010-2011. Without question, the earthquake sequence has had unprecedented impacts in all spheres of New Zealand society, locally to nationally: 10% of the country's population was directly affected, and losses total 8-10% of GDP. The following paragraphs present a few lessons from Christchurch.

  8. A Virtual Tour of the 1868 Hayward Earthquake in Google Earth™

    NASA Astrophysics Data System (ADS)

    Lackey, H. G.; Blair, J. L.; Boatwright, J.; Brocher, T.

    2007-12-01

    The 1868 Hayward earthquake has been overshadowed by the subsequent 1906 San Francisco earthquake that destroyed much of San Francisco. Nonetheless, a modern recurrence of the 1868 earthquake would cause widespread damage to the densely populated Bay Area, particularly in the east Bay communities that have grown up virtually on top of the Hayward fault. Our concern is heightened by paleoseismic studies suggesting that the recurrence interval for the past five earthquakes on the southern Hayward fault is 140 to 170 years. Our objective is to build an educational web site that illustrates the cause and effect of the 1868 earthquake, drawing on scientific and historic information. We will use Google Earth™ software to visually illustrate complex scientific concepts in a way that is understandable to a non-scientific audience. This web site will lead the viewer from a regional summary of the plate tectonics and faulting system of western North America to more specific information about the 1868 Hayward earthquake itself. Text and Google Earth™ layers will include modeled shaking of the earthquake, relocations of historic photographs, reconstruction of damaged buildings as 3-D models, and additional scientific data that may come from the many scientific studies conducted for the 140th anniversary of the event. Earthquake engineering concerns will be stressed, including population density, vulnerable infrastructure, and lifelines. We will also present detailed maps of the Hayward fault, measurements of fault creep, and geologic evidence of its recurrence. Understanding the science behind earthquake hazards is an important step in preparing for the next significant earthquake. We hope to communicate to the public and students of all ages, through visualizations, not only the cause and effect of the 1868 earthquake, but also modern seismic hazards of the San Francisco Bay region.

  9. Design and Control of Chemical Grouting : Volume 3 - Engineering Practice

    DOT National Transportation Integrated Search

    1983-04-01

    Recent improvements in the engineering practice of chemical grouting have provided increased confidence in this method of ground modification. Designers can significantly improve the success of chemical grouting by defining their grouting program obj...

  10. Turkish Compulsory Earthquake Insurance and "Istanbul Earthquake"

    NASA Astrophysics Data System (ADS)

    Durukal, E.; Sesetyan, K.; Erdik, M.

    2009-04-01

    The city of Istanbul will likely experience substantial direct and indirect losses as a result of a future large (M=7+) earthquake with an annual probability of occurrence of about 2%. This paper dwells on the expected building losses in terms of probable maximum and average annualized losses, and discusses the results from the perspective of the compulsory earthquake insurance scheme operational in the country. The TCIP system is essentially designed to operate in Turkey with sufficient penetration to enable the accumulation of funds in the pool. Today, with only 20% national penetration, and approximately one-half of all policies in highly earthquake-prone areas (one-third in Istanbul), the system exhibits signs of adverse selection, an inadequate premium structure and insufficient funding. Our findings indicate that the national compulsory earthquake insurance pool in Turkey will face difficulties in covering building losses in Istanbul after a large earthquake. The annualized earthquake losses in Istanbul are between 140-300 million. Even if we assume that the deductible is raised to 15%, the earthquake losses that need to be paid after a large earthquake in Istanbul will be about 2.5 billion, somewhat above the current capacity of the TCIP. Thus, a modification to the system for the insured in Istanbul (or the Marmara region) is necessary. This may mean an increase in the premium and deductible rates, purchase of larger re-insurance covers and development of a claim-processing system. Also, to avoid adverse selection, the penetration rates elsewhere in Turkey need to be increased substantially. A better model would be the introduction of parametric insurance for Istanbul. In such a model, losses are not indemnified but are calculated directly on the basis of indexed ground-motion levels and damage.
The immediate improvement of a parametric insurance model over the existing one will be the elimination of the claim processing
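    The parametric scheme discussed above pays out on an observed ground-motion index rather than on assessed damage, which is what eliminates claim processing. A toy sketch of such a trigger schedule (the trigger levels and payout fractions are invented for illustration and are not TCIP terms):

```python
def parametric_payout(pga_g, limit):
    """Hypothetical parametric payout: indemnity is replaced by a payout
    indexed to the measured peak ground acceleration (PGA), in g.
    Tiers are illustrative only."""
    tiers = [(0.40, 1.00), (0.25, 0.50), (0.15, 0.25)]  # (PGA trigger, payout fraction)
    for trigger, fraction in tiers:
        if pga_g >= trigger:
            return fraction * limit
    return 0.0

print(parametric_payout(0.30, 100_000))  # mid tier: 50% of limit -> 50000.0
print(parametric_payout(0.10, 100_000))  # below lowest trigger -> 0.0
```

    Because the payout depends only on recorded ground motion at the insured location, settlement can be immediate, at the cost of basis risk between the index and the actual damage.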

  11. GEM - The Global Earthquake Model

    NASA Astrophysics Data System (ADS)

    Smolka, A.

    2009-04-01

    Over 500,000 people died in the last decade due to earthquakes and tsunamis, mostly in the developing world, where the risk is increasing due to rapid population growth. In many seismic regions, no hazard and risk models exist, and even where models do exist, they are intelligible only to experts, or available only for commercial purposes. The Global Earthquake Model (GEM) answers the need for an openly accessible risk management tool. GEM is an internationally sanctioned public-private partnership initiated by the Organisation for Economic Co-operation and Development (OECD) which will establish an authoritative standard for calculating and communicating earthquake hazard and risk, and will be designed to serve as the critical instrument to support decisions and actions that reduce earthquake losses worldwide. GEM will integrate developments at the forefront of scientific and engineering knowledge of earthquakes, at global, regional and local scales. The work is organized in three modules: hazard, risk, and socio-economic impact. The hazard module calculates probabilities of earthquake occurrence and resulting shaking at any given location. The risk module calculates fatalities, injuries, and damage based on expected shaking, building vulnerability, and the distribution of population and of exposed values and facilities. The socio-economic impact module delivers tools for making educated decisions to mitigate and manage risk. GEM will be a versatile online tool, with open source code and a map-based graphical interface. The underlying data will be open wherever possible, and its modular input and output will be adapted to multiple user groups: scientists and engineers, risk managers and decision makers in the public and private sectors, and the public at large. GEM will be the first global model for seismic risk assessment at a national and regional scale, and aims to achieve broad scientific participation and independence. Its development will occur in a

  12. Earthquake potential revealed by tidal influence on earthquake size-frequency statistics

    NASA Astrophysics Data System (ADS)

    Ide, Satoshi; Yabe, Suguru; Tanaka, Yoshiyuki

    2016-11-01

    The possibility that tidal stress can trigger earthquakes has long been debated. In particular, a clear causal relationship between small earthquakes and the phase of tidal stress has been elusive. However, tectonic tremors deep within subduction zones are highly sensitive to tidal stress levels, with the tremor rate increasing exponentially with rising tidal stress. Thus, slow deformation and the possibility of earthquakes at subduction plate boundaries may be enhanced during periods of large tidal stress. Here we calculate the tidal stress history, and specifically the amplitude of tidal stress, on the fault plane in the two weeks before large earthquakes globally, based on data from the global, Japanese, and Californian earthquake catalogues. We find that very large earthquakes, including the 2004 Sumatra earthquake, the 2010 Maule earthquake in Chile and the 2011 Tohoku-oki earthquake in Japan, tend to occur near the time of maximum tidal stress amplitude. This tendency is not obvious for small earthquakes. However, we also find that the fraction of large earthquakes increases (the b-value of the Gutenberg-Richter relation decreases) as the amplitude of tidal shear stress increases. This relationship is reasonable, considering the well-known dependence of the b-value on stress. It suggests that the probability of a tiny rock failure expanding into a gigantic rupture increases with increasing tidal stress levels. We conclude that large earthquakes are more probable during periods of high tidal stress.
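    The b-value of the Gutenberg-Richter relation mentioned above is conventionally estimated with Aki's maximum-likelihood formula, b = log10(e) / (mean(M) − Mc). A minimal stdlib-only sketch on a synthetic catalogue (illustrative values; not the authors' code or data):

```python
import math
import random

def b_value_aki(mags, mc):
    """Aki (1965) maximum-likelihood b-value for a catalogue complete
    above magnitude mc (continuous magnitudes, no binning correction)."""
    above = [m for m in mags if m >= mc]
    mean_m = sum(above) / len(above)
    return math.log10(math.e) / (mean_m - mc)

random.seed(1)
# Synthetic Gutenberg-Richter catalogue: magnitudes above mc are
# exponentially distributed with rate b*ln(10); true b = 1.0 here.
true_b, mc = 1.0, 2.0
cat = [mc + random.expovariate(true_b * math.log(10)) for _ in range(100_000)]
print(round(b_value_aki(cat, mc), 2))
```

    Computing this estimate separately for events in high- and low-tidal-stress windows is the kind of comparison the abstract describes; a lower b-value in the high-stress window means a larger fraction of big events.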

  13. Integrating Science and Engineering to Implement Evidence-Based Practices in Health Care Settings.

    PubMed

    Wu, Shinyi; Duan, Naihua; Wisdom, Jennifer P; Kravitz, Richard L; Owen, Richard R; Sullivan, J Greer; Wu, Albert W; Di Capua, Paul; Hoagwood, Kimberly Eaton

    2015-09-01

    Integrating two distinct and complementary paradigms, science and engineering, may produce more effective outcomes for the implementation of evidence-based practices in health care settings. Science formalizes and tests innovations, whereas engineering customizes and optimizes how the innovation is applied, tailoring it to local conditions. Together they may accelerate the creation of an evidence-based healthcare system that works effectively in specific health care settings. We give examples of applying engineering methods for better quality, more efficient, and safer implementation of clinical practices, medical devices, and health services systems. A specific example was applying systems engineering design that orchestrated people, process, data, decision-making, and communication through a technology application to implement evidence-based depression care among low-income patients with diabetes. We recommend that leading journals recognize the fundamental role of engineering in implementation research, to improve understanding of design elements that create a better fit between program elements and local context.

  14. Integrating Science and Engineering to Implement Evidence-Based Practices in Health Care Settings

    PubMed Central

    Wu, Shinyi; Duan, Naihua; Wisdom, Jennifer P.; Kravitz, Richard L.; Owen, Richard R.; Sullivan, Greer; Wu, Albert W.; Di Capua, Paul; Hoagwood, Kimberly Eaton

    2015-01-01

    Integrating two distinct and complementary paradigms, science and engineering, may produce more effective outcomes for the implementation of evidence-based practices in health care settings. Science formalizes and tests innovations, whereas engineering customizes and optimizes how the innovation is applied, tailoring it to local conditions. Together they may accelerate the creation of an evidence-based healthcare system that works effectively in specific health care settings. We give examples of applying engineering methods for better quality, more efficient, and safer implementation of clinical practices, medical devices, and health services systems. A specific example was applying systems engineering design that orchestrated people, process, data, decision-making, and communication through a technology application to implement evidence-based depression care among low-income patients with diabetes. We recommend that leading journals recognize the fundamental role of engineering in implementation research, to improve understanding of design elements that create a better fit between program elements and local context. PMID:25217100

  15. Assessing the Utility of and Improving USGS Earthquake Hazards Program Products

    NASA Astrophysics Data System (ADS)

    Gomberg, J. S.; Scott, M.; Weaver, C. S.; Sherrod, B. L.; Bailey, D.; Gibbons, D.

    2010-12-01

    A major focus of the USGS Earthquake Hazards Program (EHP) has been the development and implementation of products and information meant to improve earthquake hazard assessment, mitigation and response for a myriad of users. Many of these products rely on the data and efforts of the EHP and its partner scientists who are building the Advanced National Seismic System (ANSS). We report on a project meant to assess the utility of many of these products and information, conducted collaboratively by EHP scientists and Pierce County Department of Emergency Management staff. We have conducted focus group listening sessions with members of the engineering, business, medical, media, risk management, and emergency response communities as well as participated in the planning and implementation of earthquake exercises in the Pacific Northwest. Thus far we have learned that EHP and ANSS products satisfy many of the needs of engineers and some planners, and information is widely used by media and the general public. However, some important communities do not use these products despite their intended application for their purposes, particularly county and local emergency management and business communities. We have learned that products need to convey more clearly the impact of earthquakes, in everyday terms. Users also want products (e.g. maps, forecasts, etc.) that can be incorporated into tools and systems they use regularly. Rather than simply building products and posting them on websites, products need to be actively marketed and training provided. We suggest that engaging users prior to and during product development will enhance their usage and effectiveness.

  16. Practical Strategy on the Subject of “Science and Ethics” for Overcoming Hybrid Engineering Ethics Education

    NASA Astrophysics Data System (ADS)

    Yasui, Yoshiaki

    Economic globalization and JABEE (Japan Accreditation Board for Engineering Education) accreditation mean that education in engineering ethics has become increasingly important for science and engineering students who will become the next generation of engineers. This is clearly indicated whenever engineers are held professionally responsible for unfortunate accidents that happen in daily life. Learning hybrid engineering ethics is an essential part of an education spanning the humanities and sciences. This paper presents the contents of the subject “Science and Ethics”, drawing on several years of practice and the fruits of studying science and engineering ethics at a university faculty of science and engineering. It can be considered a practical strategy for the formation of morality.

  17. Earthquakes and depleted gas reservoirs: which comes first?

    NASA Astrophysics Data System (ADS)

    Mucciarelli, M.; Donda, F.; Valensise, G.

    2014-12-01

    While scientists are paying increasing attention to the seismicity potentially induced by hydrocarbon exploitation, little is known about the reverse problem, i.e. the impact of active faulting and earthquakes on hydrocarbon reservoirs. The recent 2012 earthquakes in Emilia, Italy, raised concerns among the public for being possibly human-induced, but also shed light on the possible use of gas wells as a marker of the seismogenic potential of an active fold-and-thrust belt. Based on the analysis of over 400 borehole datasets from wells drilled along the Ferrara-Romagna Arc, a large oil and gas reserve in the southeastern Po Plain, we found that the 2012 earthquakes occurred within a cluster of sterile wells surrounded by productive ones. Since the geology of the productive and sterile areas is quite similar, we suggest that past earthquakes caused the loss of all natural gas from the potential reservoirs lying above their causative faults. Our findings have two important practical implications: (1) they may allow major seismogenic zones to be identified in areas of sparse seismicity, and (2) suggest that gas should be stored in exploited reservoirs rather than in sterile hydrocarbon traps or aquifers as this is likely to reduce the hazard of triggering significant earthquakes.

  18. An earthquake strength scale for the media and the public

    USGS Publications Warehouse

    Johnston, A.C.

    1990-01-01

    A local engineer, E. P. Hailey, pointed this problem out to me shortly after the Loma Prieta earthquake. He felt that three problems limit the usefulness of magnitude in describing an earthquake to the public: (1) most people do not understand that it is not a linear scale; (2) of those who do realize the scale is not linear, very few understand the factor-of-ten difference in ground motion and factor-of-32 difference in energy release between successive points on the scale; and (3) even those who understand the first two points have trouble putting a given magnitude value into terms they can relate to. In summary, Mr. Hailey wondered why seismologists cannot come up with an earthquake scale that does not confuse everyone and that conveys a sense of true relative size. Here, then, is my attempt to construct such a scale.
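    The factor-of-ten and factor-of-32 relationships that confuse the public follow directly from how magnitude is defined: recorded amplitude scales as 10^ΔM, and radiated energy as roughly 10^(1.5·ΔM). A quick arithmetic sketch (illustrative only):

```python
def amplitude_ratio(dm):
    # Ground-motion amplitude scales as 10^dM per magnitude difference dM.
    return 10.0 ** dm

def energy_ratio(dm):
    # Radiated energy scales as 10^(1.5*dM): about 31.6x per magnitude unit,
    # the "factor of 32" quoted in the abstract.
    return 10.0 ** (1.5 * dm)

print(amplitude_ratio(1.0))          # prints 10.0
print(round(energy_ratio(1.0), 1))   # prints 31.6
print(round(energy_ratio(2.0)))      # prints 1000
```

    So an M 7 event radiates about a thousand times the energy of an M 5, which is precisely the non-linearity a lay-oriented strength scale would need to convey.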

  19. Geotechnical aspects of the January 2003 Tecomán, Mexico, earthquake

    USGS Publications Warehouse

    Wartman, Joseph; Rodriguez-Marek, Adrian; Macari, Emir J.; Deaton, Scott; Ramirez-Reynaga, Martín; Ochoa, Carlos N.; Callan, Sean; Keefer, David; Repetto, Pedro; Ovando-Shelley, Efraín

    2005-01-01

    Ground failure was the most prominent geotechnical engineering feature of the 21 January 2003 Mw 7.6 Tecomán earthquake. Ground failure impacted structures, industrial facilities, roads, water supply canals, and other critical infrastructure in the state of Colima and in parts of the neighboring states of Jalisco and Michoacán. Landslides and soil liquefaction were the most common types of ground failure, followed by seismic compression of unsaturated materials. Reinforced earth structures generally performed well during the earthquake, though some structures experienced permanent lateral deformations of up to 10 cm. Different ground improvement techniques had been used to enhance the liquefaction resistance of several sites in the region, all of which performed well and exhibited no signs of damage or significant ground deformation. Earth dams in the region experienced some degree of permanent deformation but remained fully functional after the earthquake.

  20. Understanding earthquake from the granular physics point of view — Causes of earthquake, earthquake precursors and predictions

    NASA Astrophysics Data System (ADS)

    Lu, Kunquan; Hou, Meiying; Jiang, Zehui; Wang, Qiang; Sun, Gang; Liu, Jixing

    2018-03-01

    We treat the earth's crust and mantle as large-scale discrete matter, based on the principles of granular physics and existing experimental observations. The main outcomes are: a granular model of the structure and movement of the earth's crust and mantle is established; the formation mechanism of the tectonic forces that cause earthquakes, and a model of propagation for precursory information, are proposed; properties of the seismic precursory information and its relevance to earthquake occurrence are illustrated, and the principle of ways to detect effective seismic precursors is elaborated. The mechanism of deep-focus earthquakes is also explained by the jamming-unjamming transition of granular flow. Some earthquake phenomena that were previously difficult to understand are explained, and the predictability of earthquakes is discussed. Due to the discrete nature of the earth's crust and mantle, continuum theory no longer applies during the quasi-static seismological process. In this paper, based on the principles of granular physics, we study the causes of earthquakes, earthquake precursors and predictions, and a new understanding, different from the traditional seismological viewpoint, is obtained.

  1. Redefining Earthquakes and the Earthquake Machine

    ERIC Educational Resources Information Center

    Hubenthal, Michael; Braile, Larry; Taber, John

    2008-01-01

    The Earthquake Machine (EML), a mechanical model of stick-slip fault systems, can increase student engagement and facilitate opportunities to participate in the scientific process. This article introduces the EML model and an activity that challenges ninth-grade students' misconceptions about earthquakes. The activity emphasizes the role of models…

  2. A triangular approach to integrate research, education and practice in higher engineering education

    NASA Astrophysics Data System (ADS)

    Heikkinen, Eetu-Pekka; Jaako, Juha; Hiltunen, Jukka

    2017-11-01

    Separate approaches in engineering education, research and practice are not very useful when preparing students for working life; instead, integration of education, research and industrial practices is needed. A triangular approach (TA) as a method to accomplish this integration and as a method to provide students with integrated expertise is proposed. The results from the application of TA, both at the course and programme level, indicate that the approach is suitable for developing engineering education. The student pass rate for courses where TA has been used has been higher than for previous approaches, and the student feedback has been very positive. Although TA aims to take both theoretical and practical aspects of engineering as well as research and education into account, the approach concentrates mainly on activities and therefore leaves the goals of these activities as well as the values behind these goals uncovered.

  3. Source time function properties indicate a strain drop independent of earthquake depth and magnitude.

    PubMed

    Vallée, Martin

    2013-01-01

    The movement of tectonic plates leads to strain build-up in the Earth, which can be released during earthquakes when one side of a seismic fault suddenly slips with respect to the other. The amount of seismic strain release (or 'strain drop') is thus a direct measurement of a basic earthquake property, that is, the ratio of seismic slip over the dimension of the ruptured fault. Here the analysis of a new global catalogue, containing ~1,700 earthquakes with magnitude larger than 6, suggests that strain drop is independent of earthquake depth and magnitude. This invariance implies that deep earthquakes are even more similar to their shallow counterparts than previously thought, a puzzling finding as shallow and deep earthquakes are believed to originate from different physical mechanisms. More practically, this property contributes to our ability to predict the damaging waves generated by future earthquakes.

  4. Distinguishing megathrust from intraplate earthquakes using lacustrine turbidites (Laguna Lo Encañado, Central Chile)

    NASA Astrophysics Data System (ADS)

    Van Daele, Maarten; Araya-Cornejo, Cristian; Pille, Thomas; Meyer, Inka; Kempf, Philipp; Moernaut, Jasper; Cisternas, Marco

    2017-04-01

    triggered by megathrust earthquakes. These findings are an important step forward in the interpretation of lacustrine turbidites in subduction settings, and will eventually improve hazard assessments based on such paleoseismic records in the study area, and in other subduction zones. References Howarth et al., 2014. Lake sediments record high intensity shaking that provides insight into the location and rupture length of large earthquakes on the Alpine Fault, New Zealand. Earth and Planetary Science Letters 403, 340-351. Lomnitz, 1960. A study of the Maipo Valley earthquakes of September 4, 1958, Second World Conference on Earthquake Engineering, Tokyo and Kyoto, Japan, pp. 501-520. Sepulveda et al., 2008. New Findings on the 1958 Las Melosas Earthquake Sequence, Central Chile: Implications for Seismic Hazard Related to Shallow Crustal Earthquakes in Subduction Zones. Journal of Earthquake Engineering 12, 432-455. Van Daele et al., 2015. A comparison of the sedimentary records of the 1960 and 2010 great Chilean earthquakes in 17 lakes: Implications for quantitative lacustrine palaeoseismology. Sedimentology 62, 1466-1496.

  5. USGS Training in Afghanistan: Modern Earthquake Hazards Assessments

    NASA Astrophysics Data System (ADS)

    Medlin, J. D.; Garthwaite, M.; Holzer, T.; McGarr, A.; Bohannon, R.; Bergen, K.; Vincent, T.

    2007-05-01

    Afghanistan is located in a tectonically active region where ongoing deformation has generated rugged mountainous terrain, and where large earthquakes occur frequently. These earthquakes can present a significant hazard, not only from strong ground shaking, but also from liquefaction and extensive landsliding. The magnitude 6.1 earthquake of March 25, 2002 highlighted the vulnerability of Afghanistan to such hazards, and resulted in over 1000 fatalities. The USGS has provided the first of a series of Earth Science training courses to the Afghan Geological Survey (AGS). This course was concerned with modern earthquake hazard assessments, and is an integral part of a larger USGS effort to provide a comprehensive seismic-hazard assessment for Afghanistan. Funding for these courses is provided by the US Agency for International Development Afghanistan Reconstruction Program. The particular focus of this training course, held December 2-6, 2006 in Kabul, was on providing a background in the seismological and geological methods relevant to preparing for future earthquakes. Topics included identifying active faults, modern tectonic theory, geotechnical measurements of near-surface materials, and strong-motion seismology. With this background, participants may now be expected to educate other members of the community and be actively involved in earthquake hazard assessments themselves. The December 2006 training course was taught by four lecturers, with all lectures and slides being presented in English and translated into Dari. Copies of the lectures were provided to the students in both hardcopy and digital formats. Class participants included many of the section leaders from within the AGS who have backgrounds in geology, geophysics, and engineering. Two additional training sessions are planned for 2007, the first entitled "Modern Concepts in Geology and Mineral Resource Assessments," and the second entitled "Applied Geophysics for Mineral Resource Assessments."

  6. Quantification of social contributions to earthquake mortality

    NASA Astrophysics Data System (ADS)

    Main, I. G.; NicBhloscaidh, M.; McCloskey, J.; Pelling, M.; Naylor, M.

    2013-12-01

    Death tolls in earthquakes, which continue to grow rapidly, are the result of complex interactions between physical effects, such as strong shaking, and the resilience of exposed populations and supporting critical infrastructures and institutions. While it is clear that the social context in which the earthquake occurs has a strong effect on the outcome, the influence of this context can only be exposed if we first decouple, as much as we can, the physical causes of mortality from our consideration. (Our modelling assumes that building resilience to shaking is a social factor governed by national wealth, legislation and enforcement, and governance that reduces levels of corruption.) Here we attempt to remove these causes by statistically modelling published mortality, shaking intensity and population exposure data; unexplained variance from this physical model illuminates the contribution of socio-economic factors to increasing earthquake mortality. We find that this variance partitions countries in terms of basic socio-economic measures and allows the definition of a national vulnerability index identifying both anomalously resilient and anomalously vulnerable countries. In many cases resilience is well correlated with GDP; people in the richest countries are unsurprisingly safe from even the worst shaking. However some low-GDP countries rival even the richest in resilience, showing that relatively low cost interventions can have a positive impact on earthquake resilience and that social learning between these countries might facilitate resilience building in the absence of expensive engineering interventions.

  7. OMG Earthquake! Can Twitter improve earthquake response?

    NASA Astrophysics Data System (ADS)

    Earle, P. S.; Guy, M.; Ostrum, C.; Horvath, S.; Buckmaster, R. A.

    2009-12-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public, text messages, can augment its earthquake response products and the delivery of hazard information. The goal is to gather near real-time, earthquake-related messages (tweets) and provide geo-located earthquake detections and rough maps of the corresponding felt areas. Twitter and other social Internet technologies are providing the general public with anecdotal earthquake hazard information before scientific information has been published from authoritative sources. People local to an event often publish information within seconds via these technologies. In contrast, depending on the location of the earthquake, scientific alerts take between 2 and 20 minutes. Examining the tweets following the March 30, 2009, M4.3 Morgan Hill earthquake shows it is possible (in some cases) to rapidly detect and map the felt area of an earthquake using Twitter responses. Within a minute of the earthquake, the frequency of “earthquake” tweets rose above the background level of less than 1 per hour to about 150 per minute. Using the tweets submitted in the first minute, a rough map of the felt area can be obtained by plotting the tweet locations. Mapping the tweets from the first six minutes shows observations extending from Monterey to Sacramento, similar to the perceived shaking region mapped by the USGS “Did You Feel It” system. The tweets submitted after the earthquake also provided (very) short first-impression narratives from people who experienced the shaking. Accurately assessing the potential and robustness of a Twitter-based system is difficult because only tweets spanning the previous seven days can be searched, making a historical study impossible. We have, however, been archiving tweets for several months, and it is clear that significant limitations do exist. The main drawback is the lack of quantitative information

  8. The use of earthquake rate changes as a stress meter at Kilauea volcano.

    PubMed

    Dieterich, J; Cayol, V; Okubo, P

    2000-11-23

    Stress changes in the Earth's crust are generally estimated from model calculations that use near-surface deformation as an observational constraint. But the widespread correlation of changes of earthquake activity with stress has led to suggestions that stress changes might be calculated from earthquake occurrence rates obtained from seismicity catalogues. Although this possibility has considerable appeal, because seismicity data are routinely collected and have good spatial and temporal resolution, the method has not yet proven successful, owing to the non-linearity of earthquake rate changes with respect to both stress and time. Here, however, we present two methods for inverting earthquake rate data to infer stress changes, using a formulation for the stress- and time-dependence of earthquake rates. Application of these methods at Kilauea volcano, in Hawaii, yields good agreement with independent estimates, indicating that earthquake rates can provide a practical remote-sensing stress meter.

  9. Isolating social influences on vulnerability to earthquake shaking: identifying cost-effective mitigation strategies.

    NASA Astrophysics Data System (ADS)

    Bhloscaidh, Mairead Nic; McCloskey, John; Pelling, Mark; Naylor, Mark

    2013-04-01

    Until expensive engineering solutions become more universally available, the objective targeting of resources at demonstrably effective, low-cost interventions might help reverse the trend of increasing mortality in earthquakes. Death tolls in earthquakes are the result of complex interactions between physical effects, such as the exposure of the population to strong shaking, and the resilience of the exposed population along with supporting critical infrastructures and institutions. The identification of socio-economic factors that contribute to earthquake mortality is crucial to identifying and developing successful risk management strategies. Here we develop a quantitative methodology to assess more objectively the ability of communities to withstand earthquake shaking, focusing in particular on those cases where risk management performance appears to exceed or fall below expectations based on economic status. Using only published estimates of the shaking intensity and population exposure for each earthquake, data that are available for earthquakes in countries irrespective of their level of economic development, we develop a model for mortality based on the contribution of population exposure to shaking only. This represents an attempt to remove, as far as possible, the physical causes of mortality from our analysis (where we consider earthquake engineering to reduce building collapse among the socio-economic influences). The systematic part of the variance with respect to this model can therefore be expected to be dominated by socio-economic factors. We find, as expected, that this purely physical analysis partitions countries in terms of basic socio-economic measures, for example GDP, focusing analytical attention on the power of economic measures to explain variance in observed distributions of earthquake risk.
The model allows the definition of a vulnerability index which, although broadly it demonstrates the expected income-dependence of vulnerability to
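    The residual-based vulnerability index described in this record can be sketched as a log-space comparison of observed deaths against the deaths predicted by the purely physical (exposure-only) model. This is a hypothetical form for illustration, not the authors' exact formulation:

```python
import math

def vulnerability_index(observed_deaths, predicted_deaths):
    """Log10 residual of observed vs. physically predicted mortality.

    Positive values: more deaths than shaking exposure alone explains
    (anomalously vulnerable); negative values: fewer deaths than
    predicted (anomalously resilient). The +1 guards against log(0)
    for zero-fatality events.
    """
    return math.log10(observed_deaths + 1) - math.log10(predicted_deaths + 1)
```

    An index near zero means the physical model alone accounts for the outcome; systematic departures across a country's earthquakes point to socio-economic drivers.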

  10. Real-time seismic monitoring of the integrated cape girardeau bridge array and recorded earthquake response

    USGS Publications Warehouse

    Celebi, M.

    2006-01-01

    This paper introduces the state-of-the-art, real-time and broad-band seismic monitoring network implemented for the 1206 m [3956 ft] long, cable-stayed Bill Emerson Memorial Bridge in Cape Girardeau (MO), a new Mississippi River crossing, approximately 80 km from the epicentral region of the 1811-1812 New Madrid earthquakes. The bridge was designed for a strong earthquake (magnitude 7.5 or greater) during the design life of the bridge. The monitoring network comprises a total of 84 channels of accelerometers deployed on the superstructure, pier foundations and at surface and downhole free-field arrays of the bridge. The paper also presents the high quality response data obtained from the network. These data are intended to be used by the owner, researchers and engineers to assess the performance of the bridge, to check design parameters, including the comparison of dynamic characteristics with actual response, and to better design future similar bridges. Preliminary analyses of ambient and low amplitude small earthquake data reveal specific response characteristics of the bridge and the free-field. There is evidence of coherent tower, cable, deck interaction that sometimes results in amplified ambient motions. Motions at the lowest tri-axial downhole accelerometers on both MO and IL sides are practically free from any feedback from the bridge. Motions at the mid-level and surface downhole accelerometers are influenced significantly by feedback due to amplified ambient motions of the bridge. Copyright ASCE 2006.

  11. Twitter earthquake detection: Earthquake monitoring in a social world

    USGS Publications Warehouse

    Earle, Paul S.; Bowden, Daniel C.; Guy, Michelle R.

    2011-01-01

    The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds after feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word "earthquake" clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally-distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month time period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provided very short first-impression narratives from people who experienced the shaking.
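    A short-term-average/long-term-average trigger of the kind described here can be sketched over a per-minute tweet-count series as follows (window lengths and threshold are illustrative, not the USGS tuning):

```python
def sta_lta_detect(counts, sta_len=2, lta_len=20, threshold=5.0):
    """Return indices where the short-term average of per-minute
    tweet counts exceeds `threshold` times the long-term average.

    Both averages are computed over trailing windows ending just
    before sample i, so a sudden burst raises the STA well before
    it inflates the LTA.
    """
    detections = []
    for i in range(lta_len, len(counts)):
        sta = sum(counts[i - sta_len:i]) / sta_len
        lta = sum(counts[i - lta_len:i]) / lta_len
        if lta > 0 and sta / lta >= threshold:
            detections.append(i)
    return detections

# background of ~1 "earthquake" tweet per minute, then a burst
series = [1] * 30 + [150, 140, 120]
hits = sta_lta_detect(series)
```

    The same detector shape is standard in seismographic triggering; applying it to tweet counts simply swaps ground-motion samples for message rates.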

  12. Awareness and understanding of earthquake hazards at school

    NASA Astrophysics Data System (ADS)

    Saraò, Angela; Peruzza, Laura; Barnaba, Carla; Bragato, Pier Luigi

    2014-05-01

    Schools have a fundamental role in broadening the understanding of natural hazards and risks and in building awareness in the community. Recent earthquakes in Italy and worldwide have clearly demonstrated that poor perception of seismic hazards diminishes the effectiveness of mitigation countermeasures. For years the Seismology department of OGS has been involved in education projects and public activities to raise awareness about earthquakes. Working together with teachers we aim at developing age-appropriate curricula to improve the students' knowledge about earthquakes, seismic safety, and seismic risk reduction. Some examples of education activities we performed during the last years are presented here. We show our experience with the primary and intermediate schools where, through hands-on activities, we explain the earthquake phenomenon and its effects to kids, but we illustrate also some teaching interventions for high school students. During the past years we lectured classes, we led laboratory and field activities, and we organized summer stages for selected students. In the current year we are leading a project aimed at training high school students on seismic safety through a multidisciplinary approach that involves seismologists, engineers and experts of safety procedures. To combine the objective of dissemination of earthquake culture, also through the knowledge of the past seismicity, with that of a safety culture, we use innovative educational techniques and multimedia resources. Students and teachers, under the guidance of an expert seismologist, organize a combination of hands-on activities for understanding earthquakes in the lab through cheap tools and instrumentation. At selected schools we provided the low cost seismometers of the QuakeCatcher network (http://qcn.stanford.edu) for recording earthquakes, and we trained teachers to use such instruments in the lab and to analyze recorded data. Within the same project we are going to train

  13. Crustal earthquake triggering by pre-historic great earthquakes on subduction zone thrusts

    USGS Publications Warehouse

    Sherrod, Brian; Gomberg, Joan

    2014-01-01

    Triggering of earthquakes on upper plate faults during and shortly after recent great (M>8.0) subduction thrust earthquakes raises concerns about earthquake triggering following Cascadia subduction zone earthquakes. Of particular regard to Cascadia was the previously noted, but only qualitatively identified, clustering of M>~6.5 crustal earthquakes in the Puget Sound region between about 1200–900 cal yr B.P. and the possibility that this was triggered by a great Cascadia subduction thrust earthquake, and therefore portends future such clusters. We confirm quantitatively the extraordinary nature of the Puget Sound region crustal earthquake clustering between 1200–900 cal yr B.P., at least over the last 16,000 years. We conclude that this cluster was not triggered by the penultimate, and possibly full-margin, great Cascadia subduction thrust earthquake. However, we also show that the paleoseismic record for Cascadia is consistent with conclusions of our companion study of the global modern record outside Cascadia, that M>8.6 subduction thrust events have a high probability of triggering at least one or more M>~6.5 crustal earthquakes.

  14. Gas and Dust Phenomena of Mega-earthquakes and the Cause

    NASA Astrophysics Data System (ADS)

    Yue, Z.

    2013-12-01

    dense natural (methane) gas suddenly escaped from deep crust traps along deep fault zones. References Yue, ZQ, 2009. The source of energy power directly causing the May 12 Wenchuan Earthquake: Huge extremely pressurized natural gases trapped in deep Longmen Shan faults. News Journal of China Society of Rock Mechanics and Engineering, 86 (2009 (2)), 45-50. Yue, ZQ, 2010. Features and mechanism of coseismic surface ruptures by Wenchuan Earthquake. in Rock Stress and Earthquake, edited by Furen Xie, Taylor & Francis Group, London, ISBN 978-0-415-60165-8, 761-768. Yue, ZQ, 2013a. Natural gas eruption mechanism for earthquake landslides: illustrated with comparison between Donghekou and Papandayan Rockslide-debris flows. in Earthquake-induced Landslides, K. Ugai et al. (eds.), Springer-Verlag Berlin, Chapter 51: pp. 485-494. Yue ZQ, 2013b. On incorrectness in elastic rebound theory for cause of earthquakes. Paper No. S20-003 of Session S20, Proceedings of the 13th International Conference on Fracture, June 16-21, Beijing. Yue ZQ, 2013c. On nature of earthquakes with cause of compressed methane gas expansion and migration in crustal rocks, in Proceedings of Fifth Biot Conference on Poromechanics in Memory of Karl von Terzaghi (1883-1963), July 10-12, Vienna, edited by C. Hellmich et al, @ASCE, pp. 507-516.

  15. Recorded motions of the 6 April 2009 Mw 6.3 L'Aquila, Italy, earthquake and implications for building structural damage: Overview

    USGS Publications Warehouse

    Celebi, M.; Bazzurro, P.; Chiaraluce, L.; Clemente, P.; Decanini, L.; Desortis, A.; Ellsworth, W.; Gorini, A.; Kalkan, E.; Marcucci, S.; Milana, G.; Mollaioli, F.; Olivieri, M.; Paolucci, R.; Rinaldis, D.; Rovelli, A.; Sabetta, F.; Stephens, C.

    2010-01-01

    The normal-faulting earthquake of 6 April 2009 in the Abruzzo Region of central Italy caused heavy losses of life and substantial damage to centuries-old buildings of significant cultural importance and to modern reinforced-concrete-framed buildings with hollow masonry infill walls. Although structural deficiencies were significant and widespread, the study of the characteristics of strong motion data from the heavily affected area indicated that the short duration of strong shaking may have spared many more damaged buildings from collapsing. It is recognized that, with this caveat of short-duration shaking, the infill walls may have played a very important role in preventing further deterioration or collapse of many buildings. It is concluded that better new or retrofit construction practices that include reinforced-concrete shear walls may prove helpful in reducing risks in such seismic areas of Italy, other Mediterranean countries, and even in the United States, where there are large inventories of deficient structures. © 2010, Earthquake Engineering Research Institute.

  16. eLearning Hands-On: Blending Interactive eLearning with Practical Engineering Laboratory

    ERIC Educational Resources Information Center

    Kiravu, Cheddi; Yanev, Kamen M.; Tunde, Moses O.; Jeffrey, Anna M.; Schoenian, Dirk; Renner, Ansel

    2016-01-01

    Purpose: Integrating laboratory work into interactive engineering eLearning contents augments theory with practice while simultaneously ameliorating the apparent theory-practice gap in traditional eLearning. The purpose of this paper is to assess and recommend media that currently fulfil this desirable dual pedagogical goal.…

  17. Earthquake location in island arcs

    USGS Publications Warehouse

    Engdahl, E.R.; Dewey, J.W.; Fujita, K.

    1982-01-01

    -velocity lithospheric slab. In application, JHD has the practical advantage that it does not require the specification of a theoretical velocity model for the slab. Considering earthquakes within a 260 km long by 60 km wide section of the Aleutian main thrust zone, our results suggest that the theoretical velocity structure of the slab is presently not sufficiently well known that accurate locations can be obtained independently of locally recorded data. Using a locally recorded earthquake as a calibration event, JHD gave excellent results over the entire section of the main thrust zone here studied, without showing a strong effect that might be attributed to spatially varying source-station anomalies. We also calibrated the ray-tracing method using locally recorded data and obtained results generally similar to those obtained by JHD. © 1982.

  18. Toward Exascale Earthquake Ground Motion Simulations for Near-Fault Engineering Analysis

    DOE PAGES

    Johansen, Hans; Rodgers, Arthur; Petersson, N. Anders; ...

    2017-09-01

    Modernizing SW4 for massively parallel time-domain simulations of earthquake ground motions in 3D earth models increases resolution and provides ground motion estimates for critical infrastructure risk evaluations. Simulations of ground motions from large (M ≥ 7.0) earthquakes require domains on the order of 100 to 500 km and spatial granularity on the order of 1 to 5 m, resulting in hundreds of billions of grid points. Surface-focused structured mesh refinement (SMR) allows for more constant grid-point-per-wavelength scaling in typical Earth models, where wavespeeds increase with depth. In fact, SMR allows simulations to double the frequency content relative to a fixed grid calculation on a given resource. The authors report improvements to the SW4 algorithm developed while porting the code to the Cori Phase 2 (Intel Xeon Phi) systems at the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory. As a result, investigations of the performance of the innermost loop of the calculations found that reorganizing the order of operations can improve performance for massive problems.
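    A back-of-envelope count shows how the quoted domain sizes and grid spacings reach hundreds of billions of points (the example dimensions below are hypothetical, chosen within the stated ranges):

```python
def grid_points(horiz_km, depth_km, spacing_m):
    """Grid points in a uniform structured 3-D mesh over a square
    horizontal domain (illustrative; ignores mesh refinement)."""
    nx = ny = int(horiz_km * 1000 / spacing_m)
    nz = int(depth_km * 1000 / spacing_m)
    return nx * ny * nz

# a 100 km x 100 km x 25 km domain at uniform 10 m spacing:
# 10000 x 10000 x 2500 = 2.5e11 points, i.e. hundreds of billions
n = grid_points(100, 25, 10)
```

    Halving the spacing multiplies the count by eight, which is why SMR's depth-coarsening matters: it keeps points-per-wavelength roughly constant instead of over-resolving the fast, deep material.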

  19. Toward Exascale Earthquake Ground Motion Simulations for Near-Fault Engineering Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johansen, Hans; Rodgers, Arthur; Petersson, N. Anders

    Modernizing SW4 for massively parallel time-domain simulations of earthquake ground motions in 3D earth models increases resolution and provides ground motion estimates for critical infrastructure risk evaluations. Simulations of ground motions from large (M ≥ 7.0) earthquakes require domains on the order of 100 to 500 km and spatial granularity on the order of 1 to 5 m, resulting in hundreds of billions of grid points. Surface-focused structured mesh refinement (SMR) allows for more constant grid-point-per-wavelength scaling in typical Earth models, where wavespeeds increase with depth. In fact, SMR allows simulations to double the frequency content relative to a fixed grid calculation on a given resource. The authors report improvements to the SW4 algorithm developed while porting the code to the Cori Phase 2 (Intel Xeon Phi) systems at the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory. As a result, investigations of the performance of the innermost loop of the calculations found that reorganizing the order of operations can improve performance for massive problems.

  20. Long-term predictability of regions and dates of strong earthquakes

    NASA Astrophysics Data System (ADS)

    Kubyshen, Alexander; Doda, Leonid; Shopin, Sergey

    2016-04-01

    Results on the long-term predictability of strong earthquakes are discussed. It is shown that dates of earthquakes with M>5.5 could be determined several months in advance of the event. The magnitude and the region of an approaching earthquake could be specified in the time frame of a month before the event. Determination of the number of M6+ earthquakes expected to occur during the analyzed year is performed using a special sequence diagram of seismic activity for the century time frame. Date analysis could be performed with an advance of 15-20 years. Data is verified by a monthly sequence diagram of seismic activity. The number of strong earthquakes expected to occur in the analyzed month is determined by several methods having different prediction horizons. Determination of days of potential earthquakes with M5.5+ is performed using astronomical data. Earthquakes occur on days of oppositions of Solar System planets (arranged in a single line). Moreover, the strongest earthquakes occur under the location of the vector "Sun-Solar System barycenter" in the ecliptic plane. Details of this astronomical multivariate indicator still require further research, but its practical significance is confirmed by practice. Another empirical indicator of an approaching M6+ earthquake is a synchronous variation of meteorological parameters: abrupt decrease of minimal daily temperature, increase of relative humidity, abrupt change of atmospheric pressure (RAMES method). The time difference between predicted and actual date is no more than one day. This indicator is registered 104 days before the earthquake, so it was called Harmonic 104 or H-104. This fact looks paradoxical, but the works of A. Sytinskiy and V. Bokov on the correlation of global atmospheric circulation and seismic events give a physical basis for this empirical fact. Also, 104 days is a quarter of a Chandler period, so this fact gives insight on the correlation between the anomalies of Earth orientation

  1. A Real-Time Earthquake Precursor Detection Technique Using TEC from a GPS Network

    NASA Astrophysics Data System (ADS)

    Alp Akyol, Ali; Arikan, Feza; Arikan, Orhan

    2016-07-01

    Anomalies have been observed in the ionospheric electron density distribution prior to strong earthquakes. However, most of the reported results were obtained by retrospective analysis after the earthquakes, so their implementation in practice is highly problematic. Recently, a novel earthquake precursor detection technique based on spatio-temporal analysis of Total Electron Content (TEC) data obtained from the Turkish National Permanent GPS Network (TNPGN) was developed by the IONOLAB group (www.ionolab.org). In the present study, the detection technique is implemented in a causal setup over the available data set, in a test phase that enables real-time implementation. The performance of the earthquake prediction technique is evaluated using 10-fold cross validation over the data obtained in 2011. Among the 23 earthquakes with magnitudes higher than 5, the technique detects precursors of 14 earthquakes while producing 8 false alarms. This study is supported by TUBITAK 115E915 and joint TUBITAK 114E092 and AS CR 14/001 projects.
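    The reported test-phase performance (14 of 23 precursors detected, 8 false alarms) can be summarized with standard detection metrics. A minimal sketch, with the counts taken from the abstract above and the metric names chosen here for illustration:

```python
# Summary detection metrics for the reported test-phase results
# (counts taken from the abstract; metric names are generic).
detected = 14        # earthquakes (M > 5) with a detected precursor
total_events = 23    # earthquakes with M > 5 in 2011
false_alarms = 8     # precursor declarations with no earthquake

hit_rate = detected / total_events   # probability of detection
missed = total_events - detected     # events with no detected precursor

print(f"hit rate:      {hit_rate:.2f}")   # ~0.61
print(f"missed events: {missed}")         # 9
print(f"false alarms:  {false_alarms}")
```

Any operational use would also need the false-alarm rate normalized by the number of alarm opportunities, which the abstract does not report.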

  2. Impact of earthquakes on sex ratio at birth: Eastern Marmara earthquakes

    PubMed Central

    Doğer, Emek; Çakıroğlu, Yiğit; Köpük, Şule Yıldırım; Ceylan, Yasin; Şimşek, Hayal Uzelli; Çalışkan, Eray

    2013-01-01

    Objective: Previous reports suggest that maternal exposure to acute stress related to earthquakes affects the sex ratio at birth. Our aim was to examine the change in the sex ratio at birth after the Eastern Marmara earthquake disasters. Material and Methods: This study was performed using the official birth statistics from January 1997 to December 2002 – before and after 17 August 1999, the date of the Golcuk Earthquake – supplied by the Turkey Statistics Institute. The secondary sex ratio was expressed as the male proportion at birth; the ratios for the affected and unaffected areas were calculated from the gender data and compared on a monthly basis using the chi-square test. Results: We observed significant decreases in the secondary sex ratio in the 4th and 8th months following the earthquake in the affected region compared to the unaffected region (p= 0.001 and p= 0.024). In the earthquake region, the decrease observed in the secondary sex ratio during the 8th month was specific to the period after the earthquake. Conclusion: Our study indicated a significant reduction in the secondary sex ratio after an earthquake. These findings suggest that events causing sudden intense stress, such as earthquakes, can have an effect on the sex ratio at birth. PMID:24592082
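    The monthly comparison described above amounts to a chi-square test of independence on a 2x2 table (male/female births x affected/unaffected region). A minimal sketch with illustrative counts, since the actual birth statistics are not given in the abstract:

```python
import math

def chi_square_2x2(a, b, c, d):
    """Chi-square test of independence for a 2x2 table [[a, b], [c, d]];
    returns (statistic, p-value) for 1 degree of freedom."""
    n = a + b + c + d
    # Standard shortcut formula for a 2x2 table (expected counts folded in).
    stat = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    # Survival function of chi-square with 1 d.f.: erfc(sqrt(x/2)).
    p = math.erfc(math.sqrt(stat / 2))
    return stat, p

# Illustrative (not real) monthly counts: males, females by region.
stat, p = chi_square_2x2(4800, 4700, 5100, 4600)
print(f"chi2 = {stat:.2f}, p = {p:.4f}")
```

A small p-value would indicate that the male proportion differs between the affected and unaffected regions in that month.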

  3. Hazus® estimated annualized earthquake losses for the United States

    USGS Publications Warehouse

    Jaiswal, Kishor; Bausch, Doug; Rozelle, Jesse; Holub, John; McGowan, Sean

    2017-01-01

    Large earthquakes can cause social and economic disruption that can be unprecedented for any given community, and full recovery from these impacts may not always be achievable. In the United States (U.S.), the 1994 M6.7 Northridge earthquake in California remains the third costliest disaster in U.S. history, and it was one of the most expensive disasters for the federal government. Internationally, earthquakes in the last decade alone have claimed tens of thousands of lives and caused hundreds of billions of dollars of economic impact throughout the globe (~90 billion U.S. dollars (USD) from the 2008 M7.9 Wenchuan, China earthquake; ~20 billion USD from the 2010 M8.8 Maule earthquake in Chile; ~220 billion USD from the 2011 M9.0 Tohoku, Japan earthquake; ~25 billion USD from the 2011 M6.3 Christchurch, New Zealand earthquake; and ~22 billion USD from the 2016 M7.0 Kumamoto, Japan earthquake). Recent earthquakes show a pattern of steadily increasing damages and losses, primarily due to three key factors: (1) significant growth in earthquake-prone urban areas, (2) vulnerability of the older building stock, including poorly engineered non-ductile concrete buildings, and (3) increased interdependency of supply and demand among businesses that operate in different parts of the world. In the United States, earthquake risk continues to grow with increased exposure of population and development, even though the earthquake hazard has remained relatively stable except in regions of induced seismic activity. Understanding the seismic hazard requires studying earthquake characteristics and the locales in which they occur, while understanding the risk requires an assessment of the potential damage from earthquake shaking to the built environment and to the welfare of people—especially in high-risk areas. Estimating the varying degree of earthquake risk throughout the United States is critical for informed decision-making on mitigation policies, priorities, strategies, and funding levels in the
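    An annualized earthquake loss of the kind Hazus reports is, conceptually, the sum of scenario losses weighted by their annual rates of occurrence. A minimal sketch of that aggregation; all loss values and rates below are hypothetical illustrations, not Hazus outputs:

```python
# Annualized earthquake loss (AEL) as scenario losses weighted by
# annual occurrence rates. All numbers are hypothetical.
scenarios = [
    # (loss in USD, annual rate of occurrence)
    (5.0e9, 1 / 100),    # moderate event, ~100-year return period
    (40.0e9, 1 / 500),   # large event, ~500-year return period
    (200.0e9, 1 / 2500), # rare extreme event
]

ael = sum(loss * rate for loss, rate in scenarios)
print(f"annualized loss: ${ael / 1e6:.0f}M / year")
```

The rare extreme event contributes as much to the annualized figure as the moderate one here, which is why long return periods still matter for mitigation budgeting.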

  4. Large-Scale Earthquake Countermeasures Act and the Earthquake Prediction Council in Japan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rikitake, T.

    1979-08-07

    The Large-Scale Earthquake Countermeasures Act was enacted in Japan in December 1978. The act aims at mitigating earthquake hazards by designating an area as an area under intensified measures against earthquake disaster, such designation being based on long-term earthquake prediction information, and by issuing an earthquake warning statement based on imminent prediction information, when possible. In an emergency case as defined by the law, the prime minister is empowered to take various actions that cannot be taken at ordinary times. For instance, he may ask the Self-Defense Force to come into the earthquake-threatened area before the earthquake occurrence. A Prediction Council has been formed in order to evaluate premonitory effects that might be observed over the Tokai area, which was designated an area under intensified measures against earthquake disaster some time in June 1979. An extremely dense observation network has been constructed over the area.

  5. Cultivation mode research of practical application talents for optical engineering major

    NASA Astrophysics Data System (ADS)

    Liu, Zhiying

    2017-08-01

    With the progress of modern science and the development of the market economy, the requirements on science and technology graduates are ever higher. Because optical engineering is a strongly practice-oriented major, practice deserves particular emphasis, based on an analysis of the major and of the students' foundation. For practice to play its full role, the practice curriculum must be systematic and correlated, combining foundation and depth. Modern foundational professional knowledge is studied together with traditional optical concepts and technology, and systematic regularity and correlation should be embodied in the contents. Starting from basic geometrical-optics concepts, the optical parameters of an instrument are analyzed, an optical model is built, and ray tracing is completed during the geometrical-optics practice. Building on primary aberration calculation, the optical system is further designed and evaluated during the optical design practice course. Given the optical model and the specified instrument functions and requirements, the opto-mechanics are matched: the accuracy is calculated, analyzed, and distributed over every motion segment, and the mechanism must guarantee alignment and adjustment. The opto-mechanical design is carried out during the instrument and element design practice. When the optical and mechanical drawings are completed, the system is ready to be fabricated; students complete the grinding, polishing, and coating processes themselves during the optical fabrication practice. With the optical and mechanical elements in hand, the system can be assembled and aligned during the thesis practice. Through this set of correlated, logically ordered practices, students acquire knowledge of the whole process of building an optical instrument, with all details contained in each practice stage. These practical experiences give students working ability, so they need little adaptation when they go to work after graduation. It is favorable to both student

  6. Characteristics of strong motions and damage implications of M S6.5 Ludian earthquake on August 3, 2014

    NASA Astrophysics Data System (ADS)

    Xu, Peibin; Wen, Ruizhi; Wang, Hongwei; Ji, Kun; Ren, Yefei

    2015-02-01

    The Ludian County of Yunnan Province in southwestern China was struck by an M S6.5 earthquake on August 3, 2014, another destructive event following the M S8.0 Wenchuan earthquake in 2008, the M S7.1 Yushu earthquake in 2010, and the M S7.0 Lushan earthquake in 2013. The National Strong-Motion Observation Network System of China collected 74 strong-motion recordings, of which the maximum peak ground acceleration, recorded by the 053LLT station in Longtoushan Town, was 949 cm/s2 in the E-W component. The observed PGAs and spectral ordinates were compared with a ground-motion prediction equation for China and the NGA-West2 models developed by the Pacific Earthquake Engineering Research Center. This earthquake is considered the first case for testing the applicability of NGA-West2 in China. Results indicate that the observed PGAs and the 5 % damped pseudo-response spectral accelerations are significantly lower than the predicted ones. A field survey around some typical strong-motion stations verified that the earthquake damage was consistent with the official isoseismal map issued by the China Earthquake Administration.

  7. Adoption of Requirements Engineering Practices in Malaysian Software Development Companies

    NASA Astrophysics Data System (ADS)

    Solemon, Badariah; Sahibuddin, Shamsul; Ghani, Abdul Azim Abd

    This paper presents exploratory survey results on the Requirements Engineering (RE) practices of some software development companies in Malaysia. The survey attempted to identify patterns in the RE practices the companies are implementing. The required information was obtained through mailed, self-administered questionnaires distributed to project managers and software developers working at software development companies operating across the country. The results showed that the overall adoption of RE practices in these companies is strong. However, the results also indicated that fewer companies in the survey use appropriate CASE tools or software to support their RE process and practices, define traceability policies, or maintain traceability manuals in their projects.

  8. Tissue-Engineered Solutions in Plastic and Reconstructive Surgery: Principles and Practice

    PubMed Central

    Al-Himdani, Sarah; Jessop, Zita M.; Al-Sabah, Ayesha; Combellack, Emman; Ibrahim, Amel; Doak, Shareen H.; Hart, Andrew M.; Archer, Charles W.; Thornton, Catherine A.; Whitaker, Iain S.

    2017-01-01

    Recent advances in microsurgery, imaging, and transplantation have led to significant refinements in autologous reconstructive options; however, the morbidity of donor sites remains. This would be eliminated by successful clinical translation of tissue-engineered solutions into surgical practice. Plastic surgeons are uniquely placed to be intrinsically involved in the research and development of laboratory engineered tissues and their subsequent use. In this article, we present an overview of the field of tissue engineering, with the practicing plastic surgeon in mind. The Medical Research Council states that regenerative medicine and tissue engineering “holds the promise of revolutionizing patient care in the twenty-first century.” The UK government highlighted regenerative medicine as one of the eight great technologies in its industrial strategy, worthy of significant investment. The long-term aim of successful biomanufacture to repair composite defects depends on interdisciplinary collaboration between cell biologists, material scientists, engineers, and associated medical specialties; however, there is currently a lack of coordination in the field as a whole. Barriers to translation are deep rooted at the basic science level, manifested by a lack of consensus on the ideal cell source, scaffold, molecular cues, environment, and manufacturing strategy. There is also insufficient understanding of the long-term safety and durability of tissue-engineered constructs. This review aims to highlight that individualized approaches to the field are not adequate, and research collaboratives will be essential to bring together differing areas of expertise to expedite future clinical translation. The use of tissue engineering in reconstructive surgery would result in a paradigm shift, but it is important to maintain realistic expectations. It is generally accepted that it takes 20–30 years from the start of basic science research to clinical utility

  9. Tissue-Engineered Solutions in Plastic and Reconstructive Surgery: Principles and Practice.

    PubMed

    Al-Himdani, Sarah; Jessop, Zita M; Al-Sabah, Ayesha; Combellack, Emman; Ibrahim, Amel; Doak, Shareen H; Hart, Andrew M; Archer, Charles W; Thornton, Catherine A; Whitaker, Iain S

    2017-01-01

    Recent advances in microsurgery, imaging, and transplantation have led to significant refinements in autologous reconstructive options; however, the morbidity of donor sites remains. This would be eliminated by successful clinical translation of tissue-engineered solutions into surgical practice. Plastic surgeons are uniquely placed to be intrinsically involved in the research and development of laboratory engineered tissues and their subsequent use. In this article, we present an overview of the field of tissue engineering, with the practicing plastic surgeon in mind. The Medical Research Council states that regenerative medicine and tissue engineering "holds the promise of revolutionizing patient care in the twenty-first century." The UK government highlighted regenerative medicine as one of the eight great technologies in its industrial strategy, worthy of significant investment. The long-term aim of successful biomanufacture to repair composite defects depends on interdisciplinary collaboration between cell biologists, material scientists, engineers, and associated medical specialties; however, there is currently a lack of coordination in the field as a whole. Barriers to translation are deep rooted at the basic science level, manifested by a lack of consensus on the ideal cell source, scaffold, molecular cues, environment, and manufacturing strategy. There is also insufficient understanding of the long-term safety and durability of tissue-engineered constructs. This review aims to highlight that individualized approaches to the field are not adequate, and research collaboratives will be essential to bring together differing areas of expertise to expedite future clinical translation. The use of tissue engineering in reconstructive surgery would result in a paradigm shift, but it is important to maintain realistic expectations. It is generally accepted that it takes 20-30 years from the start of basic science research to clinical utility

  10. Design Practices of Preservice Elementary Teachers in an Integrated Engineering and Literature Experience

    ERIC Educational Resources Information Center

    Wendell, Kristen Bethke

    2014-01-01

    The incorporation of engineering practices and core ideas into the "Next Generation Science Standards" at the elementary school level provides exciting opportunities but also raises important questions about the preparation of new elementary teachers. Both the teacher education and engineering education communities have a limited…

  11. Foreshocks, aftershocks, and earthquake probabilities: Accounting for the landers earthquake

    USGS Publications Warehouse

    Jones, Lucile M.

    1994-01-01

    The equation to determine the probability that an earthquake occurring near a major fault will be a foreshock to a mainshock on that fault is modified to include the case of aftershocks to a previous earthquake occurring near the fault. The addition of aftershocks to the background seismicity makes it less probable that an earthquake will be a foreshock, because nonforeshocks have become more common. As the aftershocks decay with time, the probability that an earthquake will be a foreshock increases. However, fault interactions between the first mainshock and the major fault can increase the long-term probability of a characteristic earthquake on that fault, which will, in turn, increase the probability that an event is a foreshock, compensating for the decrease caused by the aftershocks.
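    The modification described can be sketched as a rate ratio: the chance that a new event near the fault is a foreshock falls while aftershocks of a previous mainshock inflate the local rate, then recovers as they decay in Omori fashion. A minimal illustration with hypothetical rates, not the paper's actual equation:

```python
# Hedged sketch: probability that an event near a major fault is a
# foreshock, with aftershocks of a prior mainshock added to the
# background rate. All rate values are hypothetical.
def aftershock_rate(t_days, k=50.0, c=1.0, p=1.1):
    """Modified-Omori aftershock rate (events/day) t days after a mainshock."""
    return k / (t_days + c) ** p

def p_foreshock(t_days, foreshock_rate=0.01, background_rate=0.1):
    """Fraction of candidate events at time t that are foreshocks."""
    total = foreshock_rate + background_rate + aftershock_rate(t_days)
    return foreshock_rate / total

# Suppressed soon after the mainshock, recovering as aftershocks decay.
print(f"10 days after:   {p_foreshock(10):.4f}")
print(f"1000 days after: {p_foreshock(1000):.4f}")
```

The paper's additional effect, stress transfer raising the characteristic-earthquake probability, would enter through a time-dependent `foreshock_rate` and is omitted here.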

  12. Visible Earthquakes: a web-based tool for visualizing and modeling InSAR earthquake data

    NASA Astrophysics Data System (ADS)

    Funning, G. J.; Cockett, R.

    2012-12-01

    InSAR (Interferometric Synthetic Aperture Radar) is a technique for measuring the deformation of the ground using satellite radar data. One of the principal applications of this method is in the study of earthquakes; in the past 20 years over 70 earthquakes have been studied in this way, and forthcoming satellite missions promise to enable the routine and timely study of events in the future. Despite the utility of the technique and its widespread adoption by the research community, InSAR does not feature in the teaching curricula of most university geoscience departments. This is, we believe, due to a lack of accessibility to software and data. Existing tools for the visualization and modeling of interferograms are often research-oriented, command line-based and/or prohibitively expensive. Here we present a new web-based interactive tool for comparing real InSAR data with simple elastic models. The overall design of this tool was focused on ease of access and use. This tool should allow interested nonspecialists to gain a feel for the use of such data and greatly facilitate integration of InSAR into upper division geoscience courses, giving students practice in comparing actual data to modeled results. The tool, provisionally named 'Visible Earthquakes', uses web-based technologies to instantly render the displacement field that would be observable using InSAR for a given fault location, geometry, orientation, and slip. The user can adjust these 'source parameters' using a simple, clickable interface, and see how these affect the resulting model interferogram. By visually matching the model interferogram to a real earthquake interferogram (processed separately and included in the web tool) a user can produce their own estimates of the earthquake's source parameters. Once satisfied with the fit of their models, users can submit their results and see how they compare with the distribution of all other contributed earthquake models, as well as the mean and median

  13. Earthquake catalog for estimation of maximum earthquake magnitude, Central and Eastern United States: Part B, historical earthquakes

    USGS Publications Warehouse

    Wheeler, Russell L.

    2014-01-01

    Computation of probabilistic earthquake hazard requires an estimate of Mmax: the moment magnitude of the largest earthquake that is thought to be possible within a specified geographic region. The region specified in this report is the Central and Eastern United States and adjacent Canada. Parts A and B of this report describe the construction of a global catalog of moderate to large earthquakes that occurred worldwide in tectonic analogs of the Central and Eastern United States. Examination of histograms of the magnitudes of these earthquakes allows estimation of Central and Eastern United States Mmax. The catalog and Mmax estimates derived from it are used in the 2014 edition of the U.S. Geological Survey national seismic-hazard maps. Part A deals with prehistoric earthquakes, and this part deals with historical events.

  14. Earthquake and tsunami forecasts: Relation of slow slip events to subsequent earthquake rupture

    PubMed Central

    Dixon, Timothy H.; Jiang, Yan; Malservisi, Rocco; McCaffrey, Robert; Voss, Nicholas; Protti, Marino; Gonzalez, Victor

    2014-01-01

    The 5 September 2012 Mw 7.6 earthquake on the Costa Rica subduction plate boundary followed a 62-y interseismic period. High-precision GPS recorded numerous slow slip events (SSEs) in the decade leading up to the earthquake, both up-dip and down-dip of seismic rupture. Deeper SSEs were larger than shallower ones and, if characteristic of the interseismic period, release most locking down-dip of the earthquake, limiting down-dip rupture and earthquake magnitude. Shallower SSEs were smaller, accounting for some but not all interseismic locking. One SSE occurred several months before the earthquake, but changes in Mohr–Coulomb failure stress were probably too small to trigger the earthquake. Because many SSEs have occurred without subsequent rupture, their individual predictive value is limited, but taken together they released a significant amount of accumulated interseismic strain before the earthquake, effectively defining the area of subsequent seismic rupture (rupture did not occur where slow slip was common). Because earthquake magnitude depends on rupture area, this has important implications for earthquake hazard assessment. Specifically, if this behavior is representative of future earthquake cycles and other subduction zones, it implies that monitoring SSEs, including shallow up-dip events that lie offshore, could lead to accurate forecasts of earthquake magnitude and tsunami potential. PMID:25404327
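    The abstract's point that magnitude depends on rupture area can be quantified with a standard scaling relation. A sketch using the Wells and Coppersmith (1994) all-fault-type regression M = 4.07 + 0.98 log10(A); using this particular relation is an assumption here, as the paper may rely on a different one:

```python
import math

def magnitude_from_area(area_km2):
    """Moment magnitude from rupture area via Wells & Coppersmith (1994),
    all fault types: M = 4.07 + 0.98 * log10(A [km^2])."""
    return 4.07 + 0.98 * math.log10(area_km2)

# If slow slip events confine rupture to a smaller area, the expected
# magnitude (and hence tsunami potential) drops accordingly.
print(f"10000 km^2 -> M {magnitude_from_area(10000):.1f}")
print(f" 2500 km^2 -> M {magnitude_from_area(2500):.1f}")
```

This is the sense in which mapping the area not covered by SSEs could bound the magnitude of a future rupture.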

  15. Earthquake and tsunami forecasts: relation of slow slip events to subsequent earthquake rupture.

    PubMed

    Dixon, Timothy H; Jiang, Yan; Malservisi, Rocco; McCaffrey, Robert; Voss, Nicholas; Protti, Marino; Gonzalez, Victor

    2014-12-02

    The 5 September 2012 M(w) 7.6 earthquake on the Costa Rica subduction plate boundary followed a 62-y interseismic period. High-precision GPS recorded numerous slow slip events (SSEs) in the decade leading up to the earthquake, both up-dip and down-dip of seismic rupture. Deeper SSEs were larger than shallower ones and, if characteristic of the interseismic period, release most locking down-dip of the earthquake, limiting down-dip rupture and earthquake magnitude. Shallower SSEs were smaller, accounting for some but not all interseismic locking. One SSE occurred several months before the earthquake, but changes in Mohr-Coulomb failure stress were probably too small to trigger the earthquake. Because many SSEs have occurred without subsequent rupture, their individual predictive value is limited, but taken together they released a significant amount of accumulated interseismic strain before the earthquake, effectively defining the area of subsequent seismic rupture (rupture did not occur where slow slip was common). Because earthquake magnitude depends on rupture area, this has important implications for earthquake hazard assessment. Specifically, if this behavior is representative of future earthquake cycles and other subduction zones, it implies that monitoring SSEs, including shallow up-dip events that lie offshore, could lead to accurate forecasts of earthquake magnitude and tsunami potential.

  16. Understanding earthquake hazards in urban areas - Evansville Area Earthquake Hazards Mapping Project

    USGS Publications Warehouse

    Boyd, Oliver S.

    2012-01-01

    The region surrounding Evansville, Indiana, has experienced minor damage from earthquakes several times in the past 200 years. Because of this history and the proximity of Evansville to the Wabash Valley and New Madrid seismic zones, there is concern among nearby communities about hazards from earthquakes. Earthquakes currently cannot be predicted, but scientists can estimate how strongly the ground is likely to shake as a result of an earthquake and are able to design structures to withstand this estimated ground shaking. Earthquake-hazard maps provide one way of conveying such information and can help the region of Evansville prepare for future earthquakes and reduce earthquake-caused loss of life and financial and structural loss. The Evansville Area Earthquake Hazards Mapping Project (EAEHMP) has produced three types of hazard maps for the Evansville area: (1) probabilistic seismic-hazard maps show the ground motion that is expected to be exceeded with a given probability within a given period of time; (2) scenario ground-shaking maps show the expected shaking from two specific scenario earthquakes; (3) liquefaction-potential maps show how likely the strong ground shaking from the scenario earthquakes is to produce liquefaction. These maps complement the U.S. Geological Survey's National Seismic Hazard Maps but are more detailed regionally and take into account surficial geology, soil thickness, and soil stiffness; these elements greatly affect ground shaking.
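    The probabilistic maps described in (1) express hazard as the ground motion exceeded with a given probability in a given time window. Under the usual Poisson assumption, probability, window, and return period are related by P = 1 - exp(-T/Tr); a minimal sketch of that conversion:

```python
import math

def exceedance_probability(return_period_yr, window_yr):
    """Poisson probability of at least one exceedance in the window."""
    return 1.0 - math.exp(-window_yr / return_period_yr)

def return_period(prob, window_yr):
    """Return period corresponding to probability `prob` in the window."""
    return -window_yr / math.log(1.0 - prob)

# The common "2% in 50 years" map level corresponds to ~2475 years.
print(f"{return_period(0.02, 50):.0f} yr")       # ~2475
print(f"{exceedance_probability(475, 50):.3f}")  # ~0.100, i.e. 10% in 50 yr
```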

  17. Teaching Engineering Practices

    ERIC Educational Resources Information Center

    Cunningham, Christine M.; Carlsen, William S.

    2014-01-01

    Engineering is featured prominently in the Next Generation Science Standards (NGSS) and related reform documents, but how its nature and methods are described is problematic. This paper is a systematic review and critique of that representation, and proposes that the disciplinary core ideas of engineering (as described in the NGSS) can be…

  18. "Did you feel it?" Intensity data: A surprisingly good measure of earthquake ground motion

    USGS Publications Warehouse

    Atkinson, G.M.; Wald, D.J.

    2007-01-01

    The U.S. Geological Survey is tapping a vast new source of engineering seismology data through its "Did You Feel It?" (DYFI) program, which collects online citizen responses to earthquakes. To date, more than 750,000 responses have been compiled in the United States alone. The DYFI data make up in quantity what they may lack in scientific quality and offer the potential to resolve longstanding issues in earthquake ground-motion science. Such issues have been difficult to address due to the paucity of instrumental ground-motion data in regions of low seismicity. In particular, DYFI data provide strong evidence that earthquake stress drops, which control the strength of high-frequency ground shaking, are higher in the central and eastern United States (CEUS) than in California. Higher earthquake stress drops, coupled with lower attenuation of shaking with distance, result in stronger overall shaking over a wider area and thus more potential damage for CEUS earthquakes in comparison to those of equal magnitude in California - a fact also definitively captured with these new DYFI data and maps.
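    DYFI intensities can be compared with instrumental ground motion through empirical intensity-PGA regressions. A sketch using the Wald et al. (1999) California relation MMI = 3.66 log10(PGA) - 1.66 (PGA in cm/s^2, valid roughly for MMI >= V); the choice of this particular relation is an illustrative assumption, not taken from the abstract:

```python
import math

def mmi_from_pga(pga_cm_s2):
    """Modified Mercalli intensity from PGA via Wald et al. (1999):
    MMI = 3.66 * log10(PGA) - 1.66, PGA in cm/s^2 (roughly MMI >= V)."""
    return 3.66 * math.log10(pga_cm_s2) - 1.66

# Higher stress drop means stronger high-frequency shaking, hence higher
# intensity at the same distance, the CEUS pattern the DYFI data reveal.
print(f"PGA 100 cm/s^2 -> MMI {mmi_from_pga(100):.1f}")
```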

  19. Creative Thinking of Practical Engineering Students During a Design Project

    NASA Astrophysics Data System (ADS)

    Waks, Shlomo; Merdler, Moti

    2003-01-01

    Creativity in engineering design has become an economic necessity and not merely the privilege of unique individuals. The search for new, innovative, and effective ideas in engineering design stands at the center of daily creative performance. This search requires sensitivity to gaps in knowledge and information, and the ability to evoke numerous, different, and unique ideas about engineering problems. The source of such information or knowledge can be either extrinsic, such as that provided by an instructor or expert, or intrinsic, which might involve transformation from one field or context to another. Furthermore, interaction with an exterior source, as well as development of an inherent drive, have an impact on the motivation to perform creatively. This article, based on a study conducted among Israeli practical engineering students, deals with the variations in creative thinking during various stages of a design project and the relation between creative thinking and motivation factors.

  20. Quantitative Earthquake Prediction on Global and Regional Scales

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir G.

    2006-03-01

    The Earth is a hierarchy of volumes of different sizes. Driven by planetary convection, these volumes are involved in joint and relative movement. The movement is controlled by a wide variety of processes on and around the fractal mesh of boundary zones, and it produces earthquakes. This hierarchy of movable volumes composes a large non-linear dynamical system. Prediction of such a system in the sense of extrapolating a trajectory into the future is futile. However, upon coarse-graining, integral empirical regularities emerge, opening possibilities for prediction in the sense of the commonly accepted consensus definition worked out in 1976 by the US National Research Council. Implications of the hierarchical nature of the lithosphere and its dynamics, based on systematic monitoring and evidence of its unified space-energy similarity at different scales, help avoid basic errors in earthquake prediction claims. They suggest rules and recipes for adequate earthquake prediction classification, comparison, and optimization. The approach has already led to the design of a reproducible intermediate-term middle-range earthquake prediction technique. Its real-time testing, aimed at prediction of the largest earthquakes worldwide, has proved beyond any reasonable doubt the effectiveness of practical earthquake forecasting. In the first approximation, the accuracy is about 1-5 years and 5-10 times the anticipated source dimension. Further analysis allows reducing the spatial uncertainty down to 1-3 source dimensions, although at the cost of additional failures-to-predict. Despite the limited accuracy, considerable damage could be prevented by timely, knowledgeable use of the existing predictions and earthquake prediction strategies. The December 26, 2004 Indian Ocean Disaster seems to be the first indication that the methodology, designed for prediction of M8.0+ earthquakes, can be rescaled for prediction of both smaller magnitude earthquakes (e.g., down to M5.5+ in Italy) and

  1. Continuing megathrust earthquake potential in Chile after the 2014 Iquique earthquake

    USGS Publications Warehouse

    Hayes, Gavin P.; Herman, Matthew W.; Barnhart, William D.; Furlong, Kevin P.; Riquelme, Sebástian; Benz, Harley M.; Bergman, Eric; Barrientos, Sergio; Earle, Paul S.; Samsonov, Sergey

    2014-01-01

    The seismic gap theory identifies regions of elevated hazard based on a lack of recent seismicity in comparison with other portions of a fault. It has successfully explained past earthquakes (see, for example, ref. 2) and is useful for qualitatively describing where large earthquakes might occur. A large earthquake had been expected in the subduction zone adjacent to northern Chile, which had not ruptured in a megathrust earthquake since a M ~8.8 event in 1877. On 1 April 2014 a M 8.2 earthquake occurred within this seismic gap. Here we present an assessment of the seismotectonics of the March–April 2014 Iquique sequence, including analyses of earthquake relocations, moment tensors, finite fault models, moment deficit calculations and cumulative Coulomb stress transfer. This ensemble of information allows us to place the sequence within the context of regional seismicity and to identify areas of remaining and/or elevated hazard. Our results constrain the size and spatial extent of rupture, and indicate that this was not the earthquake that had been anticipated. Significant sections of the northern Chile subduction zone have not ruptured in almost 150 years, so it is likely that future megathrust earthquakes will occur to the south and potentially to the north of the 2014 Iquique sequence.

  2. Continuing megathrust earthquake potential in Chile after the 2014 Iquique earthquake.

    PubMed

    Hayes, Gavin P; Herman, Matthew W; Barnhart, William D; Furlong, Kevin P; Riquelme, Sebástian; Benz, Harley M; Bergman, Eric; Barrientos, Sergio; Earle, Paul S; Samsonov, Sergey

    2014-08-21

    The seismic gap theory identifies regions of elevated hazard based on a lack of recent seismicity in comparison with other portions of a fault. It has successfully explained past earthquakes (see, for example, ref. 2) and is useful for qualitatively describing where large earthquakes might occur. A large earthquake had been expected in the subduction zone adjacent to northern Chile, which had not ruptured in a megathrust earthquake since a M ∼8.8 event in 1877. On 1 April 2014 a M 8.2 earthquake occurred within this seismic gap. Here we present an assessment of the seismotectonics of the March-April 2014 Iquique sequence, including analyses of earthquake relocations, moment tensors, finite fault models, moment deficit calculations and cumulative Coulomb stress transfer. This ensemble of information allows us to place the sequence within the context of regional seismicity and to identify areas of remaining and/or elevated hazard. Our results constrain the size and spatial extent of rupture, and indicate that this was not the earthquake that had been anticipated. Significant sections of the northern Chile subduction zone have not ruptured in almost 150 years, so it is likely that future megathrust earthquakes will occur to the south and potentially to the north of the 2014 Iquique sequence.

  3. Virtual earthquake engineering laboratory with physics-based degrading materials on parallel computers

    NASA Astrophysics Data System (ADS)

    Cho, In Ho

    -scale reinforced concrete (RC) structures under cyclic loading are proposed. A quantitative comparison of state-of-the-art parallel strategies, in terms of factorization, has been carried out, leading to a problem-optimized solver that successfully embraces the penalty method and the banded nature of the system. In particular, the penalty method employed imparts considerable smoothness to the global response, which gives the parallel triangular system solver a practical advantage over more advanced solvers such as the parallel preconditioned conjugate gradient method. Other salient parallelization issues are also addressed. The parallel platform established offers unprecedented access to simulations of real-scale structures, yielding new understanding of the physics-based mechanisms adopted and of probabilistic randomness at the entire-system level. In particular, the platform enables bold simulations of real-scale RC structures exposed to cyclic loading: an H-shaped wall system and a 4-story T-shaped wall system. The simulations show the desired capability of accurately predicting global force-displacement responses, post-peak softening behavior, and compressive buckling of longitudinal steel bars. Notably, the intrinsic randomness of the 3D interlocking model appears to cause "localized" damage of the real-scale structures, consistent with observations reported in other fields such as granular media. Equipped with the accuracy, stability and scalability demonstrated so far, the parallel platform is believed to serve as fertile ground for introducing further physical mechanisms into various research fields as well as the earthquake engineering community. In the near future, it can be further expanded to run in concert with established FEA programs such as FRAME3d or OPENSEES. Following the central notion of the "multiscale" analysis technique, actual infrastructure exposed to extreme natural hazards can be successfully tackled by this next-generation analysis
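
The penalty method mentioned in the abstract can be illustrated with a minimal sketch (this is not the author's solver; the toy 1D stiffness matrix and penalty factor are assumptions for illustration). The idea is to enforce a Dirichlet constraint by adding a large diagonal term, which preserves the banded, symmetric structure of the stiffness matrix instead of eliminating rows and columns:

```python
import numpy as np

# Impose the constraint u[0] = 0 on a 1D bar K u = f via the penalty method.
n = 5
K = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)  # tridiagonal stiffness
f = np.ones(n)

penalty = 1e8 * K.max()
K_pen = K.copy()
K_pen[0, 0] += penalty      # large diagonal term enforces u[0] ~= 0

u = np.linalg.solve(K_pen, f)
print(abs(u[0]) < 1e-6)     # constraint satisfied to penalty accuracy
```

Because the penalized matrix keeps the original bandwidth, a banded triangular solver can be applied directly, which is the structural property the abstract exploits.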

  4. Earthquakes and depleted gas reservoirs: which comes first?

    NASA Astrophysics Data System (ADS)

    Mucciarelli, M.; Donda, F.; Valensise, G.

    2015-10-01

    While scientists are paying increasing attention to the seismicity potentially induced by hydrocarbon exploitation, so far, little is known about the reverse problem, i.e. the impact of active faulting and earthquakes on hydrocarbon reservoirs. The 20 and 29 May 2012 earthquakes in Emilia, northern Italy (Mw 6.1 and 6.0), raised concerns among the public for being possibly human-induced, but also shed light on the possible use of gas wells as a marker of the seismogenic potential of an active fold and thrust belt. We compared the location, depth and production history of 455 gas wells drilled along the Ferrara-Romagna arc, a large hydrocarbon reserve in the southeastern Po Plain (northern Italy), with the location of the inferred surface projection of the causative faults of the 2012 Emilia earthquakes and of two pre-instrumental damaging earthquakes. We found that these earthquake sources fall within a cluster of sterile wells, surrounded by productive wells at a few kilometres' distance. Since the geology of the productive and sterile areas is quite similar, we suggest that past earthquakes caused the loss of all natural gas from the potential reservoirs lying above their causative faults. To validate our hypothesis we performed two different statistical tests (binomial and Monte Carlo) on the relative distribution of productive and sterile wells, with respect to seismogenic faults. Our findings have important practical implications: (1) they may allow major seismogenic sources to be singled out within large active thrust systems; (2) they suggest that reservoirs hosted in smaller anticlines are more likely to be intact; and (3) they also suggest that in order to minimize the hazard of triggering significant earthquakes, all new gas storage facilities should use exploited reservoirs rather than sterile hydrocarbon traps or aquifers.
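
The Monte Carlo test described above can be sketched as follows (synthetic coordinates and labels, not the Ferrara-Romagna well data): shuffle the sterile/productive labels many times and ask how often a random labelling places the "sterile" wells as close to the fault as observed.

```python
import numpy as np

rng = np.random.default_rng(0)

n_wells = 455
xy = rng.uniform(0, 100, size=(n_wells, 2))   # well positions (km), synthetic
fault_x = 50.0                                # hypothetical fault trace at x = 50
dist = np.abs(xy[:, 0] - fault_x)             # distance of each well to the fault

# make sterile wells artificially cluster near the fault for the demo
sterile = dist + rng.normal(0, 5, n_wells) < 15

observed = dist[sterile].mean()

# Monte Carlo: permute the sterile/productive labels
n_trials = 10000
count = 0
for _ in range(n_trials):
    perm = rng.permutation(sterile)
    if dist[perm].mean() <= observed:
        count += 1
p_value = (count + 1) / (n_trials + 1)
print(p_value)   # small p => clustering unlikely under random labelling
```

A small p-value indicates the sterile wells sit closer to the seismogenic fault than random labelling would predict, consistent with the paper's hypothesis.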

  5. Earthquakes: Predicting the unpredictable?

    USGS Publications Warehouse

    Hough, Susan E.

    2005-01-01

    The earthquake prediction pendulum has swung from optimism in the 1970s to rather extreme pessimism in the 1990s. Earlier work had revealed evidence of possible earthquake precursors: physical changes in the planet that signal that a large earthquake is on the way. But some respected earthquake scientists came to argue that earthquakes are fundamentally unpredictable. The fate of the Parkfield prediction experiment appeared to support their arguments: a moderate earthquake had been predicted along a specified segment of the central San Andreas fault within five years of 1988, but had failed to materialize on schedule. At some point, however, the pendulum began to swing back. Reputable scientists began using the "P-word" not only in polite company, but also at meetings and even in print. If the renewed optimism regarding earthquake prediction can be attributed to any single cause, it might be scientists' burgeoning understanding of the earthquake cycle.

  6. Elementary Education Program for Engineering by Dual System of Workshop and Teaching Program with Practical Subject

    NASA Astrophysics Data System (ADS)

    Hara, Toshitsugu

    An elementary education program for engineering based on a dual system, combining workshop programs with the teaching of fundamental subjects using practical material, is discussed. The dual system, which consists of several workshop programs and fundamental subjects (such as mathematics, English and physics) taught with practical material, has been offered to freshmen. The elementary workshop program (primary course) comprises four workshops and related lectures. Fundamental subjects are taught with practical or engineering texts. English is taught by teachers who have worked in engineering fields using English. The dual system is supported by the center for success initiative and the English education center.

  7. Comparison of two large earthquakes: the 2008 Sichuan Earthquake and the 2011 East Japan Earthquake.

    PubMed

    Otani, Yuki; Ando, Takayuki; Atobe, Kaori; Haiden, Akina; Kao, Sheng-Yuan; Saito, Kohei; Shimanuki, Marie; Yoshimoto, Norifumi; Fukunaga, Koichi

    2012-01-01

    Between August 15th and 19th, 2011, eight 5th-year medical students from the Keio University School of Medicine had the opportunity to visit the Peking University School of Medicine and hold a discussion session titled "What is the most effective way to educate people for survival in an acute disaster situation (before the mental health care stage)?" During the session, we discussed the following six points: basic information regarding the Sichuan Earthquake and the East Japan Earthquake, differences in preparedness for earthquakes, government actions, acceptance of medical rescue teams, earthquake-induced secondary effects, and media restrictions. Although comparison of the two earthquakes was not simple, we concluded that three major points should be emphasized to facilitate the most effective course of disaster planning and action. First, all relevant agencies should formulate emergency plans and should supply information regarding the emergency to the general public and health professionals on a normal basis. Second, each citizen should be educated and trained in how to minimize the risks from earthquake-induced secondary effects. Finally, the central government should establish a single headquarters responsible for command, control, and coordination during a natural disaster emergency and should centralize all powers in this single authority. We hope this discussion may be of some use in future natural disasters in China, Japan, and worldwide.

  8. Method meets application: on the use of earthquake scenarios in community-based disaster preparedness and response

    NASA Astrophysics Data System (ADS)

    Sargeant, S.; Sorensen, M. B.

    2011-12-01

    More than 50% of the world's population now lives in urban areas. In less developed countries, future urban population increase will be driven by natural population growth and rural-to-urban migration. As urban growth continues, the vulnerability of those living in these areas is also increasing. This presents a wide variety of challenges for humanitarian organisations, which often have more experience of disaster response in rural settings than of planning for large urban disasters. The 2010 Haiti earthquake highlighted the vulnerability of these organisations and of the communities that they seek to support. To meet this challenge, a key consideration is how scientific information can support the humanitarian sector and its working practices. Here we review the current state of earthquake scenario modelling practice, with special focus on scenarios to be used in disaster response and response planning, and evaluate how the field looks set to evolve. We also review current good practice and lessons learned from previous earthquakes with respect to planning for and responding to earthquakes in urban settings in the humanitarian sector, identifying key sectoral priorities. We then examine the interface between these two areas to investigate the use of earthquake scenarios in disaster response planning and to identify potential challenges, both in the development of scientific models and in their application on the ground.

  9. Chapter A. The Loma Prieta, California, Earthquake of October 17, 1989 - Lifelines

    USGS Publications Warehouse

    Schiff, Anshel J.

    1998-01-01

    To the general public who had their televisions tuned to watch the World Series, the 1989 Loma Prieta earthquake was a lifelines earthquake. It was the images seen around the world of the collapsed Cypress Street viaduct, with the frantic and heroic efforts to pull survivors from the structure that was billowing smoke; the collapsed section of the San Francisco-Oakland Bay Bridge and subsequent home video of a car plunging off the open span; and the spectacular fire in the Marina District of San Francisco fed by a broken gas line. To many of the residents of the San Francisco Bay region, the relation of lifelines to the earthquake was characterized by sitting in the dark because of power outages, the inability to make telephone calls because of network congestion, and the slow and snarled traffic. Had the public been aware of the actions of the engineers and tradespeople working for the utilities and other lifeline organizations on the emergency response and restoration of lifelines, the lifeline characteristics of this earthquake would have been even more significant. Unobserved by the public were the warlike devastation in several electrical-power substations, the 13 miles of gas-distribution lines that had to be replaced in several communities, and the more than 1,200 leaks and breaks in water mains and service connections that had to be excavated and repaired. Like the 1971 San Fernando, Calif., earthquake, which was a seminal event for activity to improve the earthquake performance of lifelines, the 1989 Loma Prieta earthquake demonstrated that the tasks of preparing lifelines in 'earthquake country' were incomplete; indeed, new lessons had to be learned.

  10. Earthquake Intensity and Strong Motion Analysis Within SEISCOMP3

    NASA Astrophysics Data System (ADS)

    Becker, J.; Weber, B.; Ghasemi, H.; Cummins, P. R.; Murjaya, J.; Rudyanto, A.; Rößler, D.

    2017-12-01

    Measuring and predicting ground motion parameters, including seismic intensities, for earthquakes is crucial and the subject of recent research in engineering seismology. gempa has developed the new SIGMA module for Seismic Intensity and Ground Motion Analysis. The module is based on the SeisComP3 framework, extending it in the field of seismic hazard assessment and engineering seismology. SIGMA may work with or independently of SeisComP3 by supporting FDSN Web services for importing earthquake or station information and waveforms. It provides a user-friendly and modern graphical interface for semi-automatic and interactive strong motion data processing. SIGMA provides intensity and (P)SA maps based on GMPEs or recorded data. It calculates the most common strong motion parameters, e.g. PGA/PGV/PGD, Arias intensity and duration, Tp, Tm, CAV, SED and Fourier, power and response spectra. GMPEs are configurable. Supporting C++ and Python plug-ins, standard and customized GMPEs, including the OpenQuake Hazard Library, can be easily integrated and compared. Originally tailored to specifications by Geoscience Australia and BMKG (Indonesia), SIGMA has become a popular tool among SeisComP3 users concerned with seismic hazard and strong motion seismology.
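
Two of the strong-motion parameters listed above are simple to compute from an accelerogram; the sketch below is illustrative (a synthetic record, not SIGMA's implementation). PGA is the peak absolute acceleration, and the Arias intensity is Ia = (pi / (2 g)) times the integral of a(t)^2 over the record duration:

```python
import numpy as np

g = 9.81       # gravitational acceleration (m/s^2)
dt = 0.01      # sample interval (s)
t = np.arange(0, 20, dt)

# synthetic accelerogram: envelope-modulated noise standing in for a real record
rng = np.random.default_rng(1)
a = np.exp(-((t - 5.0) / 2.0) ** 2) * rng.normal(0, 1.0, t.size)  # m/s^2

pga = np.max(np.abs(a))                             # peak ground acceleration
arias = np.pi / (2.0 * g) * np.sum(a ** 2) * dt     # Arias intensity (m/s)

print(pga, arias)
```

Duration measures such as the significant duration are then commonly read off the normalized cumulative integral of a(t)^2 (e.g., the 5%-95% interval).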

  11. PAGER-CAT: A composite earthquake catalog for calibrating global fatality models

    USGS Publications Warehouse

    Allen, T.I.; Marano, K.D.; Earle, P.S.; Wald, D.J.

    2009-01-01

    highly uncertain, particularly the casualty numbers, which must be regarded as estimates rather than firm numbers for many earthquakes. Consequently, we encourage contributions from the seismology and earthquake engineering communities to further improve this resource via the Wikipedia page and personal communications, for the benefit of the whole community.

  12. A Practical Engineering Approach to Predicting Fatigue Crack Growth in Riveted Lap Joints

    NASA Technical Reports Server (NTRS)

    Harris, Charles E.; Piascik, Robert S.; Newman, James C., Jr.

    1999-01-01

    An extensive experimental database has been assembled from very detailed teardown examinations of fatigue cracks found in rivet holes of fuselage structural components. Based on this experimental database, a comprehensive analysis methodology was developed to predict the onset of widespread fatigue damage in lap joints of fuselage structure. Several computer codes were developed with specialized capabilities to conduct the various analyses that make up the comprehensive methodology. Over the past several years, the authors have interrogated various aspects of the analysis methods to determine the degree of computational rigor required to produce numerical predictions with acceptable engineering accuracy. This study led to the formulation of a practical engineering approach to predicting fatigue crack growth in riveted lap joints. This paper describes the practical engineering approach and compares predictions with the results from several experimental studies.

  13. A Practical Engineering Approach to Predicting Fatigue Crack Growth in Riveted Lap Joints

    NASA Technical Reports Server (NTRS)

    Harris, C. E.; Piascik, R. S.; Newman, J. C., Jr.

    2000-01-01

    An extensive experimental database has been assembled from very detailed teardown examinations of fatigue cracks found in rivet holes of fuselage structural components. Based on this experimental database, a comprehensive analysis methodology was developed to predict the onset of widespread fatigue damage in lap joints of fuselage structure. Several computer codes were developed with specialized capabilities to conduct the various analyses that make up the comprehensive methodology. Over the past several years, the authors have interrogated various aspects of the analysis methods to determine the degree of computational rigor required to produce numerical predictions with acceptable engineering accuracy. This study led to the formulation of a practical engineering approach to predicting fatigue crack growth in riveted lap joints. This paper describes the practical engineering approach and compares predictions with the results from several experimental studies.

  14. 1964 Great Alaska Earthquake: a photographic tour of Anchorage, Alaska

    USGS Publications Warehouse

    Thoms, Evan E.; Haeussler, Peter J.; Anderson, Rebecca D.; McGimsey, Robert G.

    2014-01-01

    , and small-scale maps, as well as links to slideshows of additional photographs and Google Street View™ scenes. Buildings in Anchorage that were severely damaged, sites of major landslides, and locations of post-earthquake engineering responses are highlighted. The web map can be used online as a virtual tour or in a physical self-guided tour using a web-enabled Global Positioning System (GPS) device. This publication serves the purpose of committing most of the content of the web map to a single distributable document. As such, some of the content differs from the online version.

  15. Global earthquake casualties due to secondary effects: A quantitative analysis for improving PAGER losses

    USGS Publications Warehouse

    Wald, David J.

    2010-01-01

    This study presents a quantitative and geospatial description of global losses due to earthquake-induced secondary effects, including landslide, liquefaction, tsunami, and fire for events during the past 40 years. These processes are of great importance to the US Geological Survey’s (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system, which is currently being developed to deliver rapid earthquake impact and loss assessments following large/significant global earthquakes. An important question is how dominant are losses due to secondary effects (and under what conditions, and in which regions)? Thus, which of these effects should receive higher priority research efforts in order to enhance PAGER’s overall assessment of earthquakes losses and alerting for the likelihood of secondary impacts? We find that while 21.5% of fatal earthquakes have deaths due to secondary (non-shaking) causes, only rarely are secondary effects the main cause of fatalities. The recent 2004 Great Sumatra–Andaman Islands earthquake is a notable exception, with extraordinary losses due to tsunami. The potential for secondary hazards varies greatly, and systematically, due to regional geologic and geomorphic conditions. Based on our findings, we have built country-specific disclaimers for PAGER that address potential for each hazard (Earle et al., Proceedings of the 14th World Conference of the Earthquake Engineering, Beijing, China, 2008). We will now focus on ways to model casualties from secondary effects based on their relative importance as well as their general predictability.

  16. Injection-induced earthquakes

    USGS Publications Warehouse

    Ellsworth, William L.

    2013-01-01

    Earthquakes in unusual locations have become an important topic of discussion in both North America and Europe, owing to the concern that industrial activity could cause damaging earthquakes. It has long been understood that earthquakes can be induced by impoundment of reservoirs, surface and underground mining, withdrawal of fluids and gas from the subsurface, and injection of fluids into underground formations. Injection-induced earthquakes have, in particular, become a focus of discussion as the application of hydraulic fracturing to tight shale formations is enabling the production of oil and gas from previously unproductive formations. Earthquakes can be induced as part of the process to stimulate the production from tight shale formations, or by disposal of wastewater associated with stimulation and production. Here, I review recent seismic activity that may be associated with industrial activity, with a focus on the disposal of wastewater by injection in deep wells; assess the scientific understanding of induced earthquakes; and discuss the key scientific challenges to be met for assessing this hazard.

  17. Earthquakes; January-February 1982

    USGS Publications Warehouse

    Person, W.J.

    1982-01-01

    In the United States, a number of earthquakes occurred, but only minor damage was reported. Arkansas experienced a swarm of earthquakes beginning on January 12. Canada experienced one of its strongest earthquakes in a number of years on January 9; this earthquake caused slight damage in Maine. 

  18. Results of the Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California.

    PubMed

    Lee, Ya-Ting; Turcotte, Donald L; Holliday, James R; Sachs, Michael K; Rundle, John B; Chen, Chien-Chih; Tiampo, Kristy F

    2011-10-04

    The Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California was the first competitive evaluation of forecasts of future earthquake occurrence. Participants submitted expected probabilities of occurrence of M ≥ 4.95 earthquakes in 0.1° × 0.1° cells for the period January 1, 2006, to December 31, 2010. Probabilities were submitted for 7,682 cells in California and adjacent regions. During this period, 31 M ≥ 4.95 earthquakes occurred in the test region. These earthquakes occurred in 22 test cells. This seismic activity was dominated by earthquakes associated with the M = 7.2, April 4, 2010, El Mayor-Cucapah earthquake in northern Mexico. This earthquake occurred in the test region, and 16 of the other 30 earthquakes in the test region could be associated with it. Nine complete forecasts were submitted by six participants. In this paper, we present the forecasts in a way that allows the reader to evaluate which forecast is the most "successful" in terms of the locations of future earthquakes. We conclude that the RELM test was a success and suggest ways in which the results can be used to improve future forecasts.
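
Gridded forecasts like those submitted to RELM are commonly scored with a joint Poisson log-likelihood over cells; the sketch below uses toy numbers (not the actual submitted forecasts). Each cell i carries an expected count lam_i, and the score of the observed counts n_i is L = sum_i (n_i * log(lam_i) - lam_i - log(n_i!)):

```python
import math

def log_likelihood(lam, n):
    # joint Poisson log-likelihood of observed counts n given expected rates lam
    return sum(ni * math.log(li) - li - math.lgamma(ni + 1)
               for li, ni in zip(lam, n))

forecast_a = [0.5, 0.1, 0.02, 0.3]   # expected M >= 4.95 counts per cell (toy)
forecast_b = [0.2, 0.2, 0.2, 0.2]
observed   = [1, 0, 0, 1]

# the forecast with the higher joint log-likelihood fits the observations better
print(log_likelihood(forecast_a, observed) >
      log_likelihood(forecast_b, observed))   # → True
```

Comparing two forecasts by their likelihood ratio on the same observed catalog is the basic ingredient of the competitive evaluation described in the abstract.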

  19. Results of the Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California

    PubMed Central

    Lee, Ya-Ting; Turcotte, Donald L.; Holliday, James R.; Sachs, Michael K.; Rundle, John B.; Chen, Chien-Chih; Tiampo, Kristy F.

    2011-01-01

    The Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California was the first competitive evaluation of forecasts of future earthquake occurrence. Participants submitted expected probabilities of occurrence of M≥4.95 earthquakes in 0.1° × 0.1° cells for the period 1 January 1, 2006, to December 31, 2010. Probabilities were submitted for 7,682 cells in California and adjacent regions. During this period, 31 M≥4.95 earthquakes occurred in the test region. These earthquakes occurred in 22 test cells. This seismic activity was dominated by earthquakes associated with the M = 7.2, April 4, 2010, El Mayor–Cucapah earthquake in northern Mexico. This earthquake occurred in the test region, and 16 of the other 30 earthquakes in the test region could be associated with it. Nine complete forecasts were submitted by six participants. In this paper, we present the forecasts in a way that allows the reader to evaluate which forecast is the most “successful” in terms of the locations of future earthquakes. We conclude that the RELM test was a success and suggest ways in which the results can be used to improve future forecasts. PMID:21949355

  20. Teaching Bioprocess Engineering to Undergraduates: Multidisciplinary Hands-On Training in a One-Week Practical Course

    ERIC Educational Resources Information Center

    Henkel, Marius; Zwick, Michaela; Beuker, Janina; Willenbacher, Judit; Baumann, Sandra; Oswald, Florian; Neumann, Anke; Siemann-Herzberg, Martin; Syldatk, Christoph; Hausmann, Rudolf

    2015-01-01

    Bioprocess engineering is a highly interdisciplinary field of study which is strongly benefited by practical courses where students can actively experience the interconnection between biology, engineering, and physical sciences. This work describes a lab course developed for 2nd year undergraduate students of bioprocess engineering and related…

  1. Near real-time aftershock hazard maps for earthquakes

    NASA Astrophysics Data System (ADS)

    McCloskey, J.; Nalbant, S. S.

    2009-04-01

    Stress interaction modelling is routinely used to explain the spatial relationships between earthquakes and their aftershocks. On 28 October 2008 a M6.4 earthquake occurred near the Pakistan-Afghanistan border, killing several hundred people and causing widespread devastation. A second M6.4 event occurred 12 hours later, 20 km to the south-east. By making some well-supported assumptions concerning the source event and the geometry of any likely triggered event, it was possible to map those areas most likely to experience further activity. Using Google Earth, it would further have been possible to identify particular settlements in the source area that were particularly at risk and to publish their locations globally within about 3 hours of the first earthquake. Such actions could have significantly focused the initial emergency response management. We argue for routine prospective testing of such forecasts and for dialogue between social and physical scientists and emergency response professionals around the practical application of these techniques.
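
The core quantity in this kind of stress interaction modelling is the Coulomb failure stress change on a receiver fault; the sketch below is a toy illustration with made-up values, not the study's calculation. The standard form is dCFS = d_tau + mu_eff * d_sigma_n, where d_tau is the shear stress change resolved in the slip direction, d_sigma_n the normal stress change (positive for unclamping), and mu_eff the effective friction coefficient:

```python
def coulomb_stress_change(d_tau, d_sigma_n, mu_eff=0.4):
    # dCFS > 0 brings the receiver fault closer to failure
    return d_tau + mu_eff * d_sigma_n

# a receiver fault brought 0.15 MPa closer to failure (hypothetical numbers):
dcfs = coulomb_stress_change(d_tau=0.1, d_sigma_n=0.125)  # MPa
print(dcfs)   # 0.15
```

Mapping dCFS over candidate receiver-fault geometries around the source is what delineates the areas "most likely to experience further activity" referred to above.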

  2. Induced earthquake during the 2016 Kumamoto earthquake (Mw7.0): Importance of real-time shake monitoring for Earthquake Early Warning

    NASA Astrophysics Data System (ADS)

    Hoshiba, M.; Ogiso, M.

    2016-12-01

    The sequence of the 2016 Kumamoto earthquakes (Mw6.2 on April 14, Mw7.0 on April 16, and many aftershocks) caused devastating damage in Kumamoto and Oita prefectures, Japan. During the Mw7.0 event, just after the direct S waves passed central Oita, another M6-class event occurred there, more than 80 km from the Mw7.0 event. The M6 event is interpreted as an induced earthquake, but it brought stronger shaking to central Oita than the Mw7.0 event did. We discuss this induced earthquake from the viewpoint of Earthquake Early Warning (EEW). In terms of ground shaking such as PGA and PGV, the contribution of the Mw7.0 event at central Oita was much smaller than that of the M6 induced earthquake (for example, the PGA was 1/8 as large at station OIT009), so the two events are easy to discriminate. However, the PGD of the Mw7.0 event is larger than that of the induced earthquake, and it appears just before the occurrence of the induced earthquake. It is quite difficult to recognize the induced earthquake from displacement waveforms alone, because the displacement is strongly contaminated by that of the preceding Mw7.0 event. In many EEW methods (including the current JMA EEW system), magnitude is used to predict ground shaking through a Ground Motion Prediction Equation (GMPE), and the magnitude is often estimated from displacement. However, displacement magnitude is not necessarily the best quantity for predicting ground shaking such as PGA and PGV. In the case of the earthquake induced during the Kumamoto sequence, a displacement magnitude could not be estimated because of the strong contamination; indeed, the JMA EEW system could not recognize the induced earthquake. One of the important lessons we learned from eight years of EEW operation is the issue of multiple simultaneous earthquakes, such as the aftershocks of the 2011 Mw9.0 Tohoku earthquake. 
Based on this lesson, we have proposed enhancement of real-time monitoring of ground shaking itself instead of rapid estimation of

  3. Structural engineering masters level education framework of knowledge for the needs of initial professional practice

    NASA Astrophysics Data System (ADS)

    Balogh, Zsuzsa Enriko

    For at least the last decade, engineering, civil engineering, and structural engineering as a profession within civil engineering have faced, and continue to face, an emerging need for "Raising the Bar" of preparedness of young engineers seeking to become practicing professional engineers. The present consensus of the civil engineering profession is that the increasing need for broad and in-depth knowledge should require young structural engineers to have at least a Masters-level education. This study focuses on Masters-level preparedness in the structural engineering area within the civil engineering field. It follows much of the methodology used in the American Society of Civil Engineers (ASCE) Body of Knowledge determination for civil engineering and extends this type of study to better define the portion of young engineers' preparation beyond the undergraduate program for one specialty area of civil engineering. The objective of this research was to create a Framework of Knowledge for the young engineer that identifies and recognizes the needs of the profession, along with the profession's expectations of how those needs can be achieved in the graduate-level academic setting, in the practice environment, and through lifelong learning opportunities, with an emphasis on the first five years of experience after completion of a Masters program in structural engineering. This study applied a modified Delphi method to obtain the critical information from members of the structural engineering profession. The results provide a Framework of Knowledge that will be useful to several groups seeking to better ensure the preparedness of future young structural engineers at the Masters level.

  4. Earthquakes, September-October 1986

    USGS Publications Warehouse

    Person, W.J.

    1987-01-01

    There was one great earthquake (8.0 and above) during this reporting period, in the South Pacific in the Kermadec Islands. There were no major earthquakes (7.0-7.9), but earthquake-related deaths were reported in Greece and in El Salvador. There were no destructive earthquakes in the United States.

  5. Elementary teachers' mental models of engineering design processes: A comparison of two communities of practice

    NASA Astrophysics Data System (ADS)

    McMahon, Ann P.

    Educating K-12 students in the processes of design engineering is gaining popularity in public schools. Several states have adopted standards for engineering design despite the fact that no common agreement exists on what should be included in the K-12 engineering design process. Furthermore, little pre-service and in-service professional development exists that will prepare teachers to teach a design process that is fundamentally different from the science teaching process found in typical public schools. This study provides a glimpse into what teachers think happens in engineering design compared to articulated best practices in engineering design. Wenger's communities of practice work and van Dijk's multidisciplinary theory of mental models provide the theoretical bases for comparing the mental models of two groups of elementary teachers (one group that teaches engineering and one that does not) to the mental models of design engineers (including this engineer/researcher/educator and professionals described elsewhere). The elementary school teachers and this engineer/researcher/educator observed the design engineering process enacted by professionals, then answered questions designed to elicit their mental models of the process they saw in terms of how they would teach it to elementary students. The key finding is this: Both groups of teachers embedded the cognitive steps of the design process into the matrix of the social and emotional roles and skills of students. Conversely, the engineers embedded the social and emotional aspects of the design process into the matrix of the cognitive steps of the design process. In other words, teachers' mental models show that they perceive that students' social and emotional communicative roles and skills in the classroom drive their cognitive understandings of the engineering process, while the mental models of this engineer/researcher/educator and the engineers in the video show that we perceive that cognitive

  6. Post earthquake recovery in natural gas systems--1971 San Fernando Earthquake

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, W.T. Jr.

    1983-01-01

In this paper, a concise summary of the post-earthquake investigations for the 1971 San Fernando Earthquake is presented. The effects of the earthquake on buildings and other above-ground structures are briefly discussed, and then the damage to, and subsequent repairs of, the natural gas systems are reported.

  7. The Active Fault Parameters for Time-Dependent Earthquake Hazard Assessment in Taiwan

    NASA Astrophysics Data System (ADS)

    Lee, Y.; Cheng, C.; Lin, P.; Shao, K.; Wu, Y.; Shih, C.

    2011-12-01

faults in Taiwan. By compiling a table of active fault parameters for Taiwan, we can apply it in time-dependent earthquake hazard assessment. The results also give engineers a reference for design, and they can be applied in seismic hazard maps to mitigate disasters.

  8. Earthquakes; July-August, 1978

    USGS Publications Warehouse

    Person, W.J.

    1979-01-01

    Earthquake activity during this period was about normal. Deaths from earthquakes were reported from Greece and Guatemala. Three major earthquakes (magnitude 7.0-7.9) occurred in Taiwan, Chile, and Costa Rica. In the United States, the most significant earthquake was a magnitude 5.6 on August 13 in southern California. 

  9. Earthquake Hazard Assessment: Basics of Evaluation

    NASA Astrophysics Data System (ADS)

    Kossobokov, Vladimir

    2016-04-01

Seismic hazard assessment (SHA) is not an easy task: it implies a delicate application of statistics to data of limited size and varying accuracy. Earthquakes follow the Unified Scaling Law, which generalizes the Gutenberg-Richter relationship by taking into account the naturally fractal distribution of their sources. Moreover, earthquakes, including great and mega events, are clustered in time, and their sequences have irregular recurrence intervals. Furthermore, earthquake-related observations are limited to the most recent decades (or centuries in just a few rare cases). Evidently, all this complicates reliable assessment of seismic hazard and associated risks. Making SHA claims, either termless or time-dependent (so-called t-DASH), quantitatively probabilistic within the most popular objectivist viewpoint on probability requires a long series of "yes/no" trials, which cannot be obtained without extended, rigorous testing of the method's predictions against real observations. Therefore, we reiterate the necessity and possibility of applying the modified tools of Earthquake Prediction Strategies, in particular the Error Diagram, introduced by G.M. Molchan in the early 1990s for evaluation of SHA, and the Seismic Roulette null hypothesis as a measure of the alerted space. The set of errors, i.e., the rates of failure and of the alerted space-time volume, compared to those obtained in the same number of random-guess trials, permits evaluating the effectiveness of an SHA method and determining the optimal choice of parameters with regard to specified cost-benefit functions. This and other information obtained in such testing supplies us with a realistic estimate of confidence in SHA results and related recommendations on the level of risk for decision making in engineering design, insurance, and emergency management.
These basics of SHA evaluation are exemplified in brief with a few examples, which are analyzed in more detail in a poster of
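The Error Diagram comparison against random guessing described above can be illustrated with a minimal sketch. The function names are hypothetical: a point (tau, n) gives the alerted fraction of the (weighted) space-time volume and the rate of failures-to-predict, a random guess lies on the diagonal n = 1 - tau, and weighting cells by prior seismicity corresponds to the Seismic Roulette null hypothesis.

```python
import numpy as np

def error_diagram_point(alert_mask, cell_weights, event_cells):
    """Compute one point (tau, n) of Molchan's error diagram.

    alert_mask   -- boolean array over space-time cells, True where an
                    alarm is declared
    cell_weights -- measure of each cell (uniform, or prior seismicity
                    for the Seismic Roulette null hypothesis)
    event_cells  -- indices of cells where target earthquakes occurred
    """
    tau = cell_weights[alert_mask].sum() / cell_weights.sum()
    hits = alert_mask[event_cells].sum()
    n = 1.0 - hits / len(event_cells)
    return tau, n

def skill(tau, n):
    """Distance below the random-guess diagonal n = 1 - tau;
    zero means no better than random guessing."""
    return 1.0 - (tau + n)
```

For example, alerting half the (uniformly weighted) volume while predicting half the events gives tau = n = 0.5 and zero skill, exactly the random-guess diagonal.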

  10. Improvements of the offshore earthquake locations in the Earthquake Early Warning System

    NASA Astrophysics Data System (ADS)

    Chen, Ta-Yi; Hsu, Hsin-Chih

    2017-04-01

Since 2014, the Earthworm-Based Earthquake Alarm Reporting (eBEAR) system has been in operation and used to issue warnings to schools. In 2015 the system began providing warnings to the public in Taiwan via television and cell phones. Online performance of the eBEAR system indicates that its average reporting times are approximately 15 and 28 s for inland and offshore earthquakes, respectively. On average, the eBEAR system can provide more warning time than the current EEW system (3.2 s and 5.5 s for inland and offshore earthquakes, respectively). However, offshore earthquakes are usually located poorly because only P-wave arrivals are used in the eBEAR system. Additionally, in the early stage of an earthquake early warning, only a few stations are available; this poor station coverage may explain why offshore earthquakes are difficult to locate accurately. Geiger's inversion procedure for earthquake location requires an initial hypocenter and origin time. For the initial hypocenter, we defined a set of test locations in the offshore area instead of using the average of the locations of the triggered stations. We ran 20 instances of Geiger's method concurrently, each with a different pre-defined initial position. We assume that if a pre-defined initial position is close to the true earthquake location, the processing time of that instance during the iteration procedure should be less than the others. The results show that using pre-defined trial hypocenters in the inversion procedure improves the accuracy of offshore earthquake locations. This matters especially for an EEW system: in its initial stage, locating earthquakes with only 3 or 5 stations may give poor results because of poor station coverage. In this study, the pre-defined trial locations provide a feasible way to improve the estimations of
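The multi-trial location strategy described in this record can be sketched as follows. The names `geiger_locate` and `best_location` are hypothetical, the homogeneous 6 km/s velocity model is a simplification, and the fewest-iterations criterion stands in for the paper's processing-time criterion.

```python
import numpy as np

def geiger_locate(stations, arrivals, trial, v=6.0, max_iter=50, tol=1e-8):
    """Gauss-Newton (Geiger) hypocenter inversion from P arrivals in a
    homogeneous half-space with P velocity v (km/s).
    stations: (n, 3) coordinates in km; trial: [x, y, z, t0]."""
    m = np.array(trial, dtype=float)
    n_iter = max_iter
    for it in range(max_iter):
        d = np.maximum(np.linalg.norm(stations - m[:3], axis=1), 1e-9)
        res = arrivals - (m[3] + d / v)          # travel-time residuals
        G = np.empty((len(stations), 4))         # Jacobian d(pred)/d(m)
        G[:, :3] = (m[:3] - stations) / (v * d[:, None])
        G[:, 3] = 1.0
        dm, *_ = np.linalg.lstsq(G, res, rcond=None)
        m += dm
        if np.linalg.norm(dm) < tol:
            n_iter = it + 1
            break
    return m, n_iter

def best_location(stations, arrivals, trial_hypocenters, v=6.0):
    """Run Geiger's method from several pre-defined trial hypocenters
    and keep the solution that converged in the fewest iterations,
    a proxy for the trial closest to the true location."""
    runs = [geiger_locate(stations, arrivals, t, v) for t in trial_hypocenters]
    return min(runs, key=lambda r: r[1])[0]
```

With a coastal network and an event outside it (the offshore geometry the record describes), trials placed offshore converge in fewer iterations than trials inside the network, so the fastest-converging solution is kept.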

  11. Megathrust earthquakes in Central Chile: What is next after the Maule 2010 earthquake?

    NASA Astrophysics Data System (ADS)

    Madariaga, R.

    2013-05-01

The 27 February 2010 Maule earthquake occurred in a well-identified gap in the Chilean subduction zone. The event has now been studied in detail using far-field and near-field seismic data as well as geodetic data; we review the information gathered so far. The event broke a region much longer along strike than the gap left by the 1835 Concepcion earthquake, sometimes called the Darwin earthquake because Darwin was in the area when it occurred and made many observations. Recent studies of contemporary documents by Udias et al. indicate that the area broken by the Maule earthquake in 2010 had previously broken in a similar earthquake in 1751; several events in the magnitude 8 range also occurred in the area, principally the 1835 event already mentioned and, more recently, events on 1 December 1928 to the north and on 21 May 1960 (1 1/2 days before the great Chilean earthquake of 1960). Currently, the area of the 2010 earthquake and the region immediately to the north are undergoing a very large increase in seismicity, with numerous clusters that migrate along the plate interface. Examination of the seismicity of Chile in the 18th and 19th centuries shows that the region immediately to the north of the 2010 earthquake broke in a very large megathrust event in July 1730, the largest known earthquake in central Chile. The region where this event occurred has broken on many occasions in M 8 range earthquakes, in 1822, 1880, 1906, 1971 and 1985. Is it preparing for a new very large megathrust event? The 1906 earthquake of Mw 8.3 filled the central part of the gap, but the area has broken again on several occasions, in 1971, 1973 and 1985. The main question is whether the 1906 earthquake relieved enough of the stress from the 1730 rupture zone.
Geodetic data show that most of the region that broke in 1730 is currently almost fully locked, from the northern end of the Maule earthquake at 34.5°S to 30°S, near the southern end of the Mw 8.5 Atacama earthquake of 11

  12. Performance of San Fernando dams during 1994 Northridge earthquake

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bardet, J.P.; Davis, C.A.

    1996-07-01

The 1994 Northridge and 1971 San Fernando Earthquakes subjected the Lower and Upper San Fernando Dams of the Van Norman Complex in the San Fernando Valley, Calif., to strong near-source ground motions. In 1994, these earth dams, which were out of service and retained only a few meters of water, extensively cracked and settled due to the liquefaction of their hydraulic fill. The Lower San Fernando Dam moved over 15 cm upstream as the hydraulic fill liquefied beneath its upstream slope. The Upper San Fernando Dam moved even more and deformed in a complicated three-dimensional pattern. The responses of the Lower and Upper San Fernando Dams during the 1994 Northridge Earthquake, although less significant than in 1971, provide the geotechnical engineering community with two useful case histories.

  13. Seismogeodetic monitoring techniques for tsunami and earthquake early warning and rapid assessment of structural damage

    NASA Astrophysics Data System (ADS)

    Haase, J. S.; Bock, Y.; Saunders, J. K.; Goldberg, D.; Restrepo, J. I.

    2016-12-01

As part of an effort to promote the use of NASA-sponsored Earth science information for disaster risk reduction, real-time high-rate seismogeodetic data are being incorporated into early warning and structural monitoring systems. Seismogeodesy combines seismic acceleration and GPS displacement measurements using a tightly-coupled Kalman filter to provide absolute estimates of seismic acceleration, velocity and displacement. Traditionally, the monitoring of earthquakes and tsunamis has been based on seismic networks for estimating earthquake magnitude and slip, and on tide gauges and deep-ocean buoys for direct measurement of tsunami waves. Real-time seismogeodetic observations at subduction zones allow for more robust and rapid magnitude and slip estimation that increases warning time in the near-source region. A NASA-funded effort to utilize GPS and seismogeodesy in NOAA's Tsunami Warning Centers in Alaska and Hawaii integrates new modules for picking, locating, and estimating magnitudes and moment tensors for earthquakes into the USGS earthworm environment at the TWCs. In a related project, NASA supports the transition of this research to seismogeodetic tools for disaster preparedness, specifically by implementing GPS and low-cost MEMS accelerometers for structural monitoring in partnership with earthquake engineers. Real-time high-rate seismogeodetic structural monitoring has been implemented on two structures. The first is a parking garage at the Autonomous University of Baja California Faculty of Medicine in Mexicali, not far from the rupture of the 2010 Mw 7.2 El Mayor-Cucapah earthquake, enabled through a UCMexus collaboration. The second is the 8-story Geisel Library at the University of California, San Diego (UCSD). The system has also been installed for several proof-of-concept experiments at the UCSD Network for Earthquake Engineering Simulation (NEES) Large High Performance Outdoor Shake Table.
We present MEMS-based seismogeodetic observations from the 10 June
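The deployed tightly-coupled seismogeodetic filter is more elaborate than can be shown here, but its core idea, an accelerometer-driven prediction step corrected by GPS displacement updates, can be sketched in one dimension. The function name, noise parameters, and initial-state handling below are illustrative assumptions, not the operational implementation.

```python
import numpy as np

def seismogeodetic_kf(acc, gps_disp, dt, q=1e-3, r=1e-4, x0=None):
    """Minimal 1-D seismogeodetic Kalman filter (illustrative sketch).
    Accelerometer samples drive the prediction step; GPS displacement
    samples (None when unavailable) correct integration drift.
    State x = [displacement, velocity]."""
    F = np.array([[1.0, dt], [0.0, 1.0]])
    B = np.array([0.5 * dt**2, dt])            # acceleration as control input
    H = np.array([[1.0, 0.0]])                 # GPS observes displacement
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],  # white-jerk process noise
                      [dt**2 / 2, dt]])
    x = np.zeros(2) if x0 is None else np.array(x0, dtype=float)
    P = np.eye(2)
    disp = []
    for a, z in zip(acc, gps_disp):
        x = F @ x + B * a                      # predict from accelerometer
        P = F @ P @ F.T + Q
        if z is not None:                      # GPS update (may be lower rate)
            y = z - (H @ x)[0]
            S = (H @ P @ H.T)[0, 0] + r
            K = (P @ H.T)[:, 0] / S
            x = x + K * y
            P = (np.eye(2) - np.outer(K, H[0])) @ P
        disp.append(x[0])
    return np.array(disp)
```

Integrating the accelerometer alone would drift; the periodic GPS displacement updates bound that drift, which is what makes the combined estimate "absolute" in the sense used above.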

  14. Permeability, storage and hydraulic diffusivity controlled by earthquakes

    NASA Astrophysics Data System (ADS)

    Brodsky, E. E.; Fulton, P. M.; Xue, L.

    2016-12-01

Earthquakes can increase permeability in fractured rocks. In the far field, such permeability increases are attributed to seismic waves and can last for months after the initial earthquake. Laboratory studies suggest that unclogging of fractures by the transient flow driven by seismic waves is a viable mechanism. These dynamic permeability increases may contribute to permeability enhancement in the seismic clouds accompanying hydraulic fracking. Permeability enhancement by seismic waves could potentially be engineered, and the experiments suggest the process will be most effective at a preferred frequency. We have recently observed similar processes inside active fault zones after major earthquakes. A borehole observatory in the fault that generated the M9.0 2011 Tohoku earthquake reveals a sequence of temperature pulses during the secondary aftershock sequence of an M7.3 aftershock. The pulses are attributed to fluid advection by a flow through a zone of transiently increased permeability. Directly after the M7.3 earthquake, the newly damaged fault zone is highly susceptible to further permeability enhancement, but ultimately heals within a month and becomes no longer as sensitive. The observation suggests that the newly damaged fault zone is more prone to fluid pulsing than would be expected based on the long-term permeability structure. Even longer term healing is seen inside the fault zone of the 2008 M7.9 Wenchuan earthquake. The competition between damage and healing (or clogging and unclogging) results in dynamically controlled permeability, storage and hydraulic diffusivity. Recent measurements of in situ fault zone architecture at the 1-10 meter scale suggest that active fault zones often have hydraulic diffusivities near 10^-2 m^2/s. This uniformity is true even within the damage zone of the San Andreas fault, where permeability and storage increases balance each other to achieve this value of diffusivity over a 400 m wide region.
We speculate that fault zones
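The balance between permeability and storage described above can be checked with the back-of-the-envelope relation D = k / (mu * S). The helper name and the illustrative permeability, viscosity, and storage values below are assumptions, chosen only to show how a diffusivity near 10^-2 m^2/s arises:

```python
def hydraulic_diffusivity(k, mu=1e-3, storage=1e-11):
    """Hydraulic diffusivity D = k / (mu * S), with permeability k (m^2),
    fluid viscosity mu (Pa*s), and specific storage capacity S (1/Pa).
    Defaults are illustrative values for water in fractured rock."""
    return k / (mu * storage)

# e.g. k = 1e-16 m^2 with the default mu and S gives D = 1e-2 m^2/s,
# the order of magnitude quoted for active fault zones
```

If an earthquake raises both k and S by similar factors (damage opening fractures adds both flow paths and storage), D stays near this value, consistent with the observed uniformity.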

  15. Earthquakes; March-April 1975

    USGS Publications Warehouse

    Person, W.J.

    1975-01-01

There were no major earthquakes (magnitude 7.0-7.9) in March or April; however, there were earthquake fatalities in Chile, Iran, and Venezuela, and approximately 35 earthquake-related injuries were reported around the world. In the United States, a magnitude 6.0 earthquake struck the Idaho-Utah border region. Damage was estimated at about a million dollars. The shock was felt over a wide area and was the largest to hit the continental United States since the San Fernando earthquake of February 1971.

  16. Assessment of Structural Resistance of building 4862 to Earthquake and Tornado Forces [SEC 1 and 2]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    METCALF, I.L.

    1999-12-06

This report presents the results of work done for the Hanford Engineering Laboratory under contract Y213-544-12662. LATA performed an assessment of the resistance of building 4862 to earthquake and tornado forces.

  17. Training Program for Practical Engineering Design through the Collaboration with Regional Companies

    NASA Astrophysics Data System (ADS)

    Gofuku, Akio; Tabata, Nobuhisa; Tomita, Eiji; Funabiki, Nobuo

An education program to build engineering design capabilities through long-term internships in collaboration with regional companies has been in practice for five years. The program comprises two types of long-term internships and several lectures on patent systems and engineering ethics. This paper describes the outline of the program, its educational effects, and our experiences. The program was reorganized into two educational programs in 2011: one is a special course to educate engineers and scientists who can lead the technologies of their domains; the other is a long-term internship program for master's students in the engineering divisions of the graduate school. This paper also describes the current activities of the latter program.

  18. Events | Pacific Earthquake Engineering Research Center

    Science.gov Websites

Calendar of PEER and Other Events; PEER Events Archive; PEER Annual Meeting 2009; Experimental Structural Engineering; PEER Summative Meeting

  19. Protecting your family from earthquakes: The seven steps to earthquake safety

    USGS Publications Warehouse

    Developed by American Red Cross, Asian Pacific Fund

    2007-01-01

    This book is provided here because of the importance of preparing for earthquakes before they happen. Experts say it is very likely there will be a damaging San Francisco Bay Area earthquake in the next 30 years and that it will strike without warning. It may be hard to find the supplies and services we need after this earthquake. For example, hospitals may have more patients than they can treat, and grocery stores may be closed for weeks. You will need to provide for your family until help arrives. To keep our loved ones and our community safe, we must prepare now. Some of us come from places where earthquakes are also common. However, the dangers of earthquakes in our homelands may be very different than in the Bay Area. For example, many people in Asian countries die in major earthquakes when buildings collapse or from big sea waves called tsunami. In the Bay Area, the main danger is from objects inside buildings falling on people. Take action now to make sure your family will be safe in an earthquake. The first step is to read this book carefully and follow its advice. By making your home safer, you help make our community safer. Preparing for earthquakes is important, and together we can make sure our families and community are ready. English version p. 3-13 Chinese version p. 14-24 Vietnamese version p. 25-36 Korean version p. 37-48

  20. Directivity in NGA earthquake ground motions: Analysis using isochrone theory

    USGS Publications Warehouse

    Spudich, P.; Chiou, B.S.J.

    2008-01-01

We present correction factors that may be applied to the ground motion prediction relations of Abrahamson and Silva, Boore and Atkinson, Campbell and Bozorgnia, and Chiou and Youngs (all in this volume) to model the azimuthally varying distribution of the GMRotI50 component of ground motion (commonly called 'directivity') around earthquakes. Our correction factors may be used for planar or nonplanar faults having any dip or slip rake (faulting mechanism). Our correction factors predict directivity-induced variations of spectral acceleration that are roughly half of the strike-slip variations predicted by Somerville et al. (1997), and use of our factors reduces record-to-record sigma by about 2-20% at 5 sec or greater period. © 2008, Earthquake Engineering Research Institute.

  1. Earthquake Triggering in the September 2017 Mexican Earthquake Sequence

    NASA Astrophysics Data System (ADS)

    Fielding, E. J.; Gombert, B.; Duputel, Z.; Huang, M. H.; Liang, C.; Bekaert, D. P.; Moore, A. W.; Liu, Z.; Ampuero, J. P.

    2017-12-01

Southern Mexico was struck by four earthquakes with Mw > 6 and numerous smaller earthquakes in September 2017, starting with the 8 September Mw 8.2 Tehuantepec earthquake beneath the Gulf of Tehuantepec offshore Chiapas and Oaxaca. We study whether this M8.2 earthquake triggered the three subsequent large M>6 quakes in southern Mexico to improve understanding of earthquake interactions and time-dependent risk. All four large earthquakes were extensional despite the subduction of the Cocos plate. By the traditional definition, an event is likely an aftershock if it occurs within two rupture lengths of the main shock soon afterwards. Two Mw 6.1 earthquakes, one half an hour after the M8.2 beneath the Gulf of Tehuantepec and one on 23 September near Ixtepec in Oaxaca, both fit as traditional aftershocks, within 200 km of the main rupture. The 19 September Mw 7.1 Puebla earthquake was 600 km away from the M8.2 shock, outside the standard aftershock zone. Geodetic measurements from interferometric analysis of synthetic aperture radar (InSAR) and time-series analysis of GPS station data constrain finite-fault total slip models for the M8.2, M7.1, and M6.1 Ixtepec earthquakes. The early M6.1 aftershock was too close in time and space to the M8.2 to measure with InSAR or GPS. We analyzed InSAR data from the Copernicus Sentinel-1A and -1B satellites and the JAXA ALOS-2 satellite. Our preliminary geodetic slip model for the M8.2 quake shows significant slip extending > 150 km NW from the hypocenter, longer than the slip in the v1 finite-fault model (FFM) from teleseismic waveforms posted by G. Hayes at USGS NEIC. Our slip model for the M7.1 earthquake is similar to the v2 NEIC FFM. Interferograms for the M6.1 Ixtepec quake confirm the shallow depth in the upper-plate crust and show the centroid is about 30 km SW of the NEIC epicenter, a significant NEIC location bias, but consistent with cluster relocations (E. Bergman, pers. comm.) and with the Mexican SSN location. Coulomb static stress

  2. Earthquake catalog for estimation of maximum earthquake magnitude, Central and Eastern United States: Part A, Prehistoric earthquakes

    USGS Publications Warehouse

    Wheeler, Russell L.

    2014-01-01

    Computation of probabilistic earthquake hazard requires an estimate of Mmax, the maximum earthquake magnitude thought to be possible within a specified geographic region. This report is Part A of an Open-File Report that describes the construction of a global catalog of moderate to large earthquakes, from which one can estimate Mmax for most of the Central and Eastern United States and adjacent Canada. The catalog and Mmax estimates derived from it were used in the 2014 edition of the U.S. Geological Survey national seismic-hazard maps. This Part A discusses prehistoric earthquakes that occurred in eastern North America, northwestern Europe, and Australia, whereas a separate Part B deals with historical events.

  3. The Critical Incident Technique: An Effective Tool for Gathering Experience from Practicing Engineers

    ERIC Educational Resources Information Center

    Hanson, James H.; Brophy, Patrick D.

    2012-01-01

Not all the knowledge and skills that educators want to pass to students exist yet in textbooks. Some still reside only in the experiences of practicing engineers (e.g., how engineers create new products, how designers identify errors in calculations). The critical incident technique, CIT, is an established method for cognitive task analysis. It is…

  4. The initial subevent of the 1994 Northridge, California, earthquake: Is earthquake size predictable?

    USGS Publications Warehouse

    Kilb, Debi; Gomberg, J.

    1999-01-01

We examine the initial subevent (ISE) of the Mw 6.7, 1994 Northridge, California, earthquake in order to discriminate between two end-member rupture initiation models: the 'preslip' and 'cascade' models. Final earthquake size may be predictable from an ISE's seismic signature in the preslip model but not in the cascade model. In the cascade model ISEs are simply small earthquakes that can be described as purely dynamic ruptures. In this model a large earthquake is triggered by smaller earthquakes; there is no size scaling between triggering and triggered events and a variety of stress transfer mechanisms are possible. Alternatively, in the preslip model, a large earthquake nucleates as an aseismically slipping patch in which the patch dimension grows and scales with the earthquake's ultimate size; the byproduct of this loading process is the ISE. In this model, the duration of the ISE signal scales with the ultimate size of the earthquake, suggesting that nucleation and earthquake size are determined by a more predictable, measurable, and organized process. To distinguish between these two end-member models we use short-period seismograms recorded by the Southern California Seismic Network. We address questions regarding the similarity in hypocenter locations and focal mechanisms of the ISE and the mainshock. We also compare the ISE's waveform characteristics to those of small earthquakes and to the beginnings of earthquakes with a range of magnitudes. We find that the focal mechanisms of the ISE and mainshock are indistinguishable, and both events may have nucleated on and ruptured the same fault plane. These results satisfy the requirements for both models and thus do not discriminate between them. However, further tests show the ISE's waveform characteristics are similar to those of typical small earthquakes in the vicinity and more importantly, do not scale with the mainshock magnitude. These results are more consistent with the cascade model.
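The scaling test that separates the two models can be phrased as a simple correlation check. The helper below is a hypothetical illustration: the preslip model predicts ISE duration growing with mainshock magnitude (correlation near 1 for log duration vs magnitude), while the cascade model predicts no correlation.

```python
import numpy as np

def ise_scaling_correlation(ise_durations, mainshock_mags):
    """Pearson correlation between log10(ISE duration) and mainshock
    magnitude. A value near 1 favors the preslip model (duration scales
    with final size); a value near 0 favors the cascade model."""
    logd = np.log10(np.asarray(ise_durations, dtype=float))
    mags = np.asarray(mainshock_mags, dtype=float)
    return np.corrcoef(logd, mags)[0, 1]
```

Applied to real picks, the Northridge result above corresponds to the low-correlation (cascade) outcome.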

  5. The Canterbury Tales: Lessons from the Canterbury Earthquake Sequence to Inform Better Public Communication Models

    NASA Astrophysics Data System (ADS)

    McBride, S.; Tilley, E. N.; Johnston, D. M.; Becker, J.; Orchiston, C.

    2015-12-01

This research evaluates the public earthquake education information produced prior to the Canterbury earthquake sequence (2010-present) and examines communication lessons to create recommendations for improving the implementation of such campaigns in the future. The research comes from the practitioner perspective of someone who worked on these campaigns in Canterbury before the earthquake sequence and who was also the Public Information Manager Second in Command during the earthquake response in February 2011. Documents addressing seismic risk that were created prior to the earthquake sequence were analyzed, using a "best practice matrix" created by the researcher, for how closely they aligned with best-practice academic research. Readability tests and word counts were also employed, along with practitioner involvement, to assist with triangulation of the data. This research also outlines the lessons learned by practitioners and explores their experiences in creating these materials, and how they perceive the materials now, given all that has happened since the inception of the booklets. The findings show these documents lacked many of the attributes of best practice. The overly long, jargon-filled text contained few positive outcome-expectancy messages and probably failed to persuade anyone that earthquakes were a real threat in Canterbury. Paradoxically, the booklets may have created fatalism in the publics who read them. While the overall intention was positive, for scientists to explain earthquakes, tsunami, landslides and other risks and to encourage the public to prepare for these events, the implementation could be greatly improved. The final component of the research highlights points of improvement for more successful campaigns in the future.
The importance of preparedness and science information campaigns can be not only in preparing the population but also into development of

  6. Engineering and socioeconomic impacts of earthquakes: An analysis of electricity lifeline disruptions in the New Madrid area

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shinozuka, M.; Rose, A.; Eguchi, R.T.

    1998-12-31

This monograph examines the potential effects of a repeat of the New Madrid earthquake on the metropolitan Memphis area. The authors developed a case study of the impact of such an event on the electric power system and analyzed how this disruption would affect society. In nine chapters and 189 pages, the book traces the impacts of catastrophic earthquakes through a curtailment of utility lifeline services to the host regional economy and beyond. The monograph's chapters include: modeling the Memphis economy; seismic performance of electric power systems; spatial analysis techniques for linking physical damage to economic functions; earthquake vulnerability and emergency preparedness among businesses; direct economic impacts; regional economic impacts; socioeconomic and interregional impacts; lifeline risk reduction; and public policy formulation and implementation.

  7. Identification of Deep Earthquakes

    DTIC Science & Technology

    2010-09-01

    discriminants that will reliably separate small, crustal earthquakes (magnitudes less than about 4 and depths less than about 40 to 50 km) from small...characteristics on discrimination plots designed to separate nuclear explosions from crustal earthquakes. Thus, reliably flagging these small, deep events is...Further, reliably identifying subcrustal earthquakes will allow us to eliminate deep events (previously misidentified as crustal earthquakes) from

  8. Oregon Hazard Explorer for Lifelines Program (OHELP): A web-based geographic information system tool for assessing potential Cascadia earthquake hazard

    NASA Astrophysics Data System (ADS)

    Sharifi Mood, M.; Olsen, M. J.; Gillins, D. T.; Javadnejad, F.

    2016-12-01

The Cascadia Subduction Zone (CSZ) can generate earthquakes as powerful as moment magnitude 9, causing great damage to structures and facilities in Oregon. A series of deterministic earthquake analyses was performed for M9.0, M8.7, M8.4, and M8.1 scenarios, which present persistent, long-lasting shaking along with other geological threats such as landslides, liquefaction-induced ground deformation, fault-rupture vertical displacement, and tsunamis. These ground deformations endanger urban structures, foundations, bridges, roadways, pipelines, and other lifelines. Lifeline providers in Oregon, including the private and public organizations responsible for transportation, electric and gas utilities, water and wastewater, fuel, airports, and harbors, face an aging infrastructure that was built prior to a full understanding of this extreme seismic risk. As recently experienced in Chile and Japan, the three-to-five-minute-long earthquake scenario expected in Oregon necessitates a different method of risk mitigation for these major lifelines than those created for shorter shaking from crustal earthquakes. A web-based geographic information system tool was developed to assess the potential hazard from the multiple threats posed by Cascadia subduction zone earthquakes in the region. The purpose of the website is to provide easy access over the web to the latest and best available hazard information, including work completed in the recent Oregon Resilience Plan (ORP) (OSSPAC, 2013) and other work completed by the Department of Geology and Mineral Industries (DOGAMI) and the United States Geological Survey (USGS). As a result, the tool is designed for engineers, planners, geologists, and others who need this information to help make appropriate decisions, and it requires only minimal knowledge of GIS.

  9. Practice-Relevant Pedagogy for Mining Software Engineering Curricula Assets

    DTIC Science & Technology

    2007-06-20

    permits the application of the Lean methods by virtually grouping shared services into eWorkcenters to which only non-routine requests are routed...engineering can be applied to IT shared services improvement and provide precise system improvement methods to complement the ITIL best practice. This...Vertical� or internal service- chain of primary business functions and enabling shared services Framework results - Mined patterns that relate

  10. The 1985 central chile earthquake: a repeat of previous great earthquakes in the region?

    PubMed

    Comte, D; Eisenberg, A; Lorca, E; Pardo, M; Ponce, L; Saragoni, R; Singh, S K; Suárez, G

    1986-07-25

    A great earthquake (surface-wave magnitude, 7.8) occurred along the coast of central Chile on 3 March 1985, causing heavy damage to coastal towns. Intense foreshock activity near the epicenter of the main shock occurred for 11 days before the earthquake. The aftershocks of the 1985 earthquake define a rupture area of 170 by 110 square kilometers. The earthquake was forecast on the basis of the nearly constant repeat time (83 +/- 9 years) of great earthquakes in this region. An analysis of previous earthquakes suggests that the rupture lengths of great shocks in the region vary by a factor of about 3. The nearly constant repeat time and variable rupture lengths cannot be reconciled with time- or slip-predictable models of earthquake recurrence. The great earthquakes in the region seem to involve a variable rupture mode and yet, for unknown reasons, remain periodic. Historical data suggest that the region south of the 1985 rupture zone should now be considered a gap of high seismic potential that may rupture in a great earthquake in the next few tens of years.

  11. Compendium of Abstracts on Statistical Applications in Geotechnical Engineering.

    DTIC Science & Technology

    1983-09-01

    research in the application of probabilistic and statistical methods to soil mechanics, rock mechanics, and engineering geology problems have grown markedly...probability, statistics, soil mechanics, rock mechanics, and engineering geology. 2. The purpose of this report is to make available to the U. S...Deformation Dynamic Response Analysis Seepage, Soil Permeability and Piping Earthquake Engineering, Seismology, Settlement and Heave Seismic Risk Analysis

  12. The ShakeOut scenario: A hypothetical Mw7.8 earthquake on the Southern San Andreas Fault

    USGS Publications Warehouse

    Porter, K.; Jones, L.; Cox, D.; Goltz, J.; Hudnut, K.; Mileti, D.; Perry, S.; Ponti, D.; Reichle, M.; Rose, A.Z.; Scawthorn, C.R.; Seligson, H.A.; Shoaf, K.I.; Treiman, J.; Wein, A.

    2011-01-01

    In 2008, an earthquake-planning scenario document was released by the U.S. Geological Survey (USGS) and California Geological Survey that hypothesizes the occurrence and effects of a Mw7.8 earthquake on the southern San Andreas Fault. It was created by more than 300 scientists and engineers. Fault offsets reach 13 m and up to 8 m at lifeline crossings. Physics-based modeling was used to generate maps of shaking intensity, with peak ground velocities of 3 m/sec near the fault and exceeding 0.5 m/sec over 10,000 km2. A custom HAZUS-MH analysis and 18 special studies were performed to characterize the effects of the earthquake on the built environment. The scenario posits 1,800 deaths and 53,000 injuries requiring emergency room care. Approximately 1,600 fires are ignited, resulting in the destruction of 200 million square feet of the building stock, the equivalent of 133,000 single-family homes. Fire contributes $87 billion in property and business interruption loss, out of the total $191 billion in economic loss, with most of the rest coming from shake-related building and content damage ($46 billion) and business interruption loss from water outages ($24 billion). Emergency response activities are depicted in detail, in an innovative grid showing activities versus time, a new format introduced in this study. © 2011, Earthquake Engineering Research Institute.

  13. Practice on Upbringing Young Engineers Collaborated with Local Enterprises

    NASA Astrophysics Data System (ADS)

    Hiraki, Yutaka; Uno, Naotsugu; Tanaka, Yuichi; Iyama, Hirofumi; Yamashita, Toru; Miyamoto, Noritaka

    The Ministry of Economics and Industry started a project with the National Colleges of Technology titled “Upbringing Young Engineers in small and medium-sized enterprises” in 2006. In our college, the authors planned a training program for die-casting engineers in the automobile industry, in collaboration with several enterprises in the neighboring area, and applied for the project. The program was adopted, and a concrete curriculum was worked out for the first year. The curriculum contains training in the fundamentals of mechanical design with 3D-CAD/CAE/CAM systems and practical training in manufacturing, by means of the Problem Based Learning method. The program started in September and finished successfully in December. This paper reports the outline of the curriculum and the results of the program.

  14. Effect of water content on stability of landslides triggered by earthquakes

    NASA Astrophysics Data System (ADS)

    Beyabanaki, S.; Bagtzoglou, A. C.; Anagnostou, E. N.

    2013-12-01

    during rainfall is investigated. In this study, after different durations of rainfall, an earthquake is applied to the model, and the elapsed time at which the factor of safety (FS) drops below one is obtained by trial and error. The results for different initial water contents and earthquake acceleration coefficients (EACs) show that landslides can occur after shorter rainfall durations when the water content is greater. If the water content is high enough, the landslide occurs even without rainfall. References [1] Ray RL, Jacobs JM, de Alba P. Impact of unsaturated zone soil moisture and groundwater table on slope instability. J. Geotech. Geoenviron. Eng., 2010, 136(10):1448-1458. [2] Das B. Principles of Foundation Engineering. Stanford, Cengage Learning, 2011. Fig. 1. Effect of initial water content on FS for different EACs
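    The trial-and-error search for the point at which FS drops below one can be sketched with a standard pseudostatic infinite-slope model of the kind found in foundation-engineering texts such as Das [2]. This is a generic textbook formulation, not the particular model used in the study, and pore-water pressure `u` is used here as a simple stand-in for increasing water content.

```python
import math

def pseudostatic_fs(c, phi_deg, gamma, z, beta_deg, k_h, u):
    """Pseudostatic factor of safety for an infinite slope.

    c        effective cohesion (kPa)
    phi_deg  effective friction angle (deg)
    gamma    soil unit weight (kN/m^3)
    z        depth of the slip surface (m)
    beta_deg slope angle (deg)
    k_h      horizontal earthquake acceleration coefficient (EAC)
    u        pore-water pressure on the slip surface (kPa)
    """
    beta, phi = math.radians(beta_deg), math.radians(phi_deg)
    # Normal and shear stresses on the slip plane, with the horizontal
    # inertial force k_h * W resolved onto the plane
    sigma = gamma * z * (math.cos(beta) ** 2 - k_h * math.sin(beta) * math.cos(beta))
    tau = gamma * z * (math.sin(beta) * math.cos(beta) + k_h * math.cos(beta) ** 2)
    return (c + (sigma - u) * math.tan(phi)) / tau

# Trial-and-error search: raise pore pressure (wetter soil) until FS < 1
u = 0.0
while pseudostatic_fs(c=5, phi_deg=30, gamma=18, z=2, beta_deg=25, k_h=0.15, u=u) >= 1.0:
    u += 1.0
```

    For dry cohesionless soil with no shaking, the expression reduces to the familiar FS = tan φ / tan β, and FS falls as either the pore pressure or the earthquake coefficient rises, consistent with the trend the abstract reports.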

  15. An investigation into the socioeconomic aspects of two major earthquakes in Iran.

    PubMed

    Amini Hosseini, Kambod; Hosseinioon, Solmaz; Pooyan, Zhila

    2013-07-01

    An evaluation of the socioeconomic consequences of earthquakes is an essential part of the development of risk reduction and disaster management plans. However, these variables are not normally addressed sufficiently after strong earthquakes; researchers and relevant stakeholders focus primarily on the physical damage and casualties. The importance of the socioeconomic consequences of seismic events became clearer in Iran after the Bam earthquake on 26 December 2003, as demonstrated by the formulation and approval of various laws and ordinances. This paper reviews the country's regulatory framework in the light of the socioeconomic aspects of two major and destructive earthquakes: in Manjil-Rudbar in 1990, and in Bam in 2003. The results take the form of recommendations and practical strategies for incorporating the socioeconomic dimensions of earthquakes in disaster risk management planning. The results presented here can be applied in other countries with similar conditions to those of Iran in order to improve public preparedness and risk reduction. © 2013 The Author(s). Journal compilation © Overseas Development Institute, 2013.

  16. Using Modified Mercalli Intensities to estimate acceleration response spectra for the 1906 San Francisco earthquake

    USGS Publications Warehouse

    Boatwright, J.; Bundock, H.; Seekins, L.C.

    2006-01-01

    We derive and test relations between the Modified Mercalli Intensity (MMI) and the pseudo-acceleration response spectra at 1.0 and 0.3 s - SA(1.0 s) and SA(0.3 s) - in order to map response spectral ordinates for the 1906 San Francisco earthquake. Recent analyses of intensity have shown that MMI ≥ 6 correlates both with peak ground velocity and with response spectra for periods from 0.5 to 3.0 s. We use these recent results to derive a linear relation between MMI and log SA(1.0 s), and we refine this relation by comparing the SA(1.0 s) estimated from Boatwright and Bundock's (2005) MMI map for the 1906 earthquake to the SA(1.0 s) calculated from recordings of the 1989 Loma Prieta earthquake. South of San Jose, the intensity distributions for the 1906 and 1989 earthquakes are remarkably similar, despite the difference in magnitude and rupture extent between the two events. We use recent strong motion regressions to derive a relation between SA(1.0 s) and SA(0.3 s) for a M7.8 strike-slip earthquake that depends on soil type, acceleration level, and source distance. We test this relation by comparing SA(0.3 s) estimated for the 1906 earthquake to SA(0.3 s) calculated from recordings of both the 1989 Loma Prieta and 1994 Northridge earthquakes, as functions of distance from the fault. © 2006, Earthquake Engineering Research Institute.
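    A linear MMI-log SA relation of the kind derived in this study can be inverted directly to map intensities to spectral ordinates. The coefficients below are placeholders for illustration only, not the values fitted by the authors.

```python
import math

# Placeholder coefficients for MMI = a + b * log10(SA); the study's
# fitted values are not reproduced here.
A, B = 2.0, 2.0

def sa_from_mmi(mmi, a=A, b=B):
    """Spectral ordinate SA(1.0 s) implied by an MMI value under a
    linear MMI-log SA relation."""
    return 10.0 ** ((mmi - a) / b)

def mmi_from_sa(sa, a=A, b=B):
    """MMI implied by a spectral ordinate SA(1.0 s)."""
    return a + b * math.log10(sa)
```

    Applying `sa_from_mmi` cell by cell to an MMI map is what turns an intensity map like Boatwright and Bundock's (2005) into a map of response spectral ordinates.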

  17. Global earthquake casualties due to secondary effects: A quantitative analysis for improving rapid loss analyses

    USGS Publications Warehouse

    Marano, K.D.; Wald, D.J.; Allen, T.I.

    2010-01-01

    This study presents a quantitative and geospatial description of global losses due to earthquake-induced secondary effects, including landslide, liquefaction, tsunami, and fire, for events during the past 40 years. These processes are of great importance to the US Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system, which is currently being developed to deliver rapid earthquake impact and loss assessments following large, significant global earthquakes. An important question is how dominant losses due to secondary effects are, under what conditions, and in which regions, and thus which of these effects should receive higher-priority research efforts in order to enhance PAGER's overall assessment of earthquake losses and its alerting for the likelihood of secondary impacts. We find that while 21.5% of fatal earthquakes have deaths due to secondary (non-shaking) causes, only rarely are secondary effects the main cause of fatalities. The recent 2004 Great Sumatra-Andaman Islands earthquake is a notable exception, with extraordinary losses due to tsunami. The potential for secondary hazards varies greatly, and systematically, due to regional geologic and geomorphic conditions. Based on our findings, we have built country-specific disclaimers for PAGER that address the potential for each hazard (Earle et al., Proceedings of the 14th World Conference on Earthquake Engineering, Beijing, China, 2008). We will now focus on ways to model casualties from secondary effects based on their relative importance as well as their general predictability. © Springer Science+Business Media B.V. 2009.

  18. Are Earth System model software engineering practices fit for purpose? A case study.

    NASA Astrophysics Data System (ADS)

    Easterbrook, S. M.; Johns, T. C.

    2009-04-01

    We present some analysis and conclusions from a case study of the culture and practices of scientists at the Met Office and Hadley Centre working on the development of software for climate and Earth System models using the MetUM infrastructure. The study examined how scientists think about software correctness, prioritize their requirements in making changes, and develop a shared understanding of the resulting models. We conclude that highly customized techniques driven strongly by scientific research goals have evolved for verification and validation of such models. In a formal software engineering context these represent costly, but invaluable, software integration tests with considerable benefits. The software engineering practices seen also exhibit recognisable features of both agile and open source software development projects - self-organisation of teams consistent with a meritocracy rather than top-down organisation, extensive use of informal communication channels, and software developers who are generally also users and science domain experts. We draw some general conclusions on whether these practices work well, and what new software engineering challenges may lie ahead as Earth System models become ever more complex and petascale computing becomes the norm.

  19. Loss Estimations due to Earthquakes and Secondary Technological Hazards

    NASA Astrophysics Data System (ADS)

    Frolova, N.; Larionov, V.; Bonnin, J.

    2009-04-01

    Expected loss and damage assessment due to natural and technological disasters is of primary importance for emergency management just after a disaster, as well as for the development and implementation of preventive-measure plans. The paper addresses the procedures and simulation models for loss estimation due to strong earthquakes and secondary technological accidents. The mathematical models for shaking intensity distribution, damage to buildings and structures, debris volume, and the number of fatalities and injuries due to earthquakes and technological accidents at fire- and chemical-hazardous facilities are considered; these models are used in geographic information systems designed for these purposes. The criteria for the occurrence of technological accidents are developed on the basis of engineering analysis of the consequences of past events. The paper provides the results of estimating the consequences of scenario earthquakes and of individual seismic risk assessment, taking into account the secondary technological hazards at the regional and urban levels. Individual risk is understood as the probability of death (or injury) due to a possible hazardous event within one year in a given territory. It is determined through the mathematical expectation of social losses, taking into account the number of inhabitants in the considered settlement and the probability of the natural and/or technological disaster.
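    The individual-risk definition given in the abstract (annual probability of death as the mathematical expectation of fatalities over possible hazardous events, normalized by the population) can be written down directly. The scenario list below is hypothetical, for illustration only.

```python
def individual_risk(scenarios, population):
    """Annual individual risk of death in a settlement: the mathematical
    expectation of fatalities over hazard scenarios (natural and/or
    technological), divided by the number of inhabitants.

    scenarios: iterable of (annual_probability, expected_fatalities) pairs
    """
    expected_fatalities = sum(p * n for p, n in scenarios)
    return expected_fatalities / population

# Hypothetical scenarios: a frequent moderate earthquake and a rare
# earthquake-triggered chemical accident, in a town of 100,000
risk = individual_risk([(0.01, 200), (0.001, 5000)], population=100_000)
```

    Summing over both natural and secondary technological scenarios in this way is what lets the risk maps described in the paper account for accidents at fire- and chemical-hazardous facilities alongside the shaking itself.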

  20. Satellite-instrument system engineering best practices and lessons

    NASA Astrophysics Data System (ADS)

    Schueler, Carl F.

    2009-08-01

    This paper focuses on system engineering development issues driving satellite remote sensing instrumentation cost and schedule. A key best practice is early assessment of mission and instrumentation requirements priorities driving performance trades among major instrumentation measurements: Radiometry, spatial field of view and image quality, and spectral performance. Key lessons include attention to technology availability and applicability to prioritized requirements, care in applying heritage, approaching fixed-price and cost-plus contracts with appropriate attention to risk, and assessing design options with attention to customer preference as well as design performance, and development cost and schedule. A key element of success either in contract competition or execution is team experience. Perhaps the most crucial aspect of success, however, is thorough requirements analysis and flowdown to specifications driving design performance with sufficient parameter margin to allow for mistakes or oversights - the province of system engineering from design inception to development, test and delivery.

  1. The relationship between earthquake exposure and posttraumatic stress disorder in 2013 Lushan earthquake

    NASA Astrophysics Data System (ADS)

    Wang, Yan; Lu, Yi

    2018-01-01

    The objective of this study is to explore the relationship between earthquake exposure and the incidence of PTSD. A stratified random sample survey was conducted to collect data in the Longmenshan thrust fault area three years after the Lushan earthquake. We used the Children's Revised Impact of Event Scale (CRIES-13) and the Earthquake Experience Scale. Subjects in this study included 3,944 student survivors in eleven local schools. The prevalence of probable PTSD was relatively higher among those who were trapped in the earthquake, were injured in the earthquake, or had relatives who died in the earthquake. It is concluded that researchers need to pay more attention to children and adolescents. The government should pay more attention to these people and provide more economic support.

  2. Crowdsourced earthquake early warning.

    PubMed

    Minson, Sarah E; Brooks, Benjamin A; Glennie, Craig L; Murray, Jessica R; Langbein, John O; Owen, Susan E; Heaton, Thomas H; Iannucci, Robert A; Hauser, Darren L

    2015-04-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California's Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing.

  3. Crowdsourced earthquake early warning

    PubMed Central

    Minson, Sarah E.; Brooks, Benjamin A.; Glennie, Craig L.; Murray, Jessica R.; Langbein, John O.; Owen, Susan E.; Heaton, Thomas H.; Iannucci, Robert A.; Hauser, Darren L.

    2015-01-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California’s Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing. PMID:26601167

  4. Earthquakes, November-December 1992

    USGS Publications Warehouse

    Person, W.J.

    1993-01-01

    There were two major earthquakes (7.0≤M<8.0) during the last two months of the year: a magnitude 7.5 earthquake on December 12 in the Flores region, Indonesia, and a magnitude 7.0 earthquake on December 20 in the Banda Sea. Earthquakes caused fatalities in China and Indonesia. The greatest number of deaths (2,500) for the year occurred in Indonesia. In Switzerland, six people were killed by an accidental explosion recorded by seismographs. In the United States, a magnitude 5.3 earthquake caused slight damage at Big Bear in southern California. 

  5. Crowdsourced earthquake early warning

    USGS Publications Warehouse

    Minson, Sarah E.; Brooks, Benjamin A.; Glennie, Craig L.; Murray, Jessica R.; Langbein, John O.; Owen, Susan E.; Heaton, Thomas H.; Iannucci, Robert A.; Hauser, Darren L.

    2015-01-01

    Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California’s Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing.

  6. Natural Hazard Public Policy Implications of the May 12, 2008 M7.9 Wenchuan Earthquake, Sichuan, China

    NASA Astrophysics Data System (ADS)

    Cydzik, K.; Hamilton, D.; Stenner, H. D.; Cattarossi, A.; Shrestha, P. L.

    2009-12-01

    The May 12, 2008 M7.9 Wenchuan Earthquake in Sichuan Province, China killed almost 90,000 people and affected a population of over 45.5 million throughout western China. Shaking caused the destruction of five million buildings, many of them homes and schools, and damaged 21 million other structures, inflicting devastating impacts to communities. Landslides, a secondary effect of the shaking, caused much of the devastation. Debris flows buried schools and homes, rock falls crushed cars, and rockslides, landslides, and rock avalanches blocked streams and rivers creating massive, unstable landslide dams, which formed “quake lakes” upstream of the blockages. Impassable roads made emergency access slow and extremely difficult. Collapses of buildings and structures large and small took the lives of many. Damage to infrastructure impaired communication, cut off water supplies and electricity, and put authorities on high alert as the integrity of large engineered dams was reviewed. During our field reconnaissance three months after the disaster, evidence of the extent of the tragedy was undeniably apparent. Observing the damage throughout Sichuan reminded us that earthquakes in the United States and throughout the world routinely cause widespread damage and destruction to lives, property, and infrastructure. The focus of this poster is to present observations and findings based on our field reconnaissance regarding the scale of earthquake destruction with respect to slope failures, landslide dams, damage to infrastructure (e.g., schools, engineered dams, buildings, roads, rail lines, and water resources facilities), human habitation within the region, and the mitigation and response effort to this catastrophe. This is presented in the context of the policy measures that could be developed to reduce risks of similar catastrophes. The rapid response of the Chinese government and the mobilization of the Chinese People’s Liberation Army to help the communities affected

  7. Effects of Fault Segmentation, Mechanical Interaction, and Structural Complexity on Earthquake-Generated Deformation

    ERIC Educational Resources Information Center

    Haddad, David Elias

    2014-01-01

    Earth's topographic surface forms an interface across which the geodynamic and geomorphic engines interact. This interaction is best observed along crustal margins where topography is created by active faulting and sculpted by geomorphic processes. Crustal deformation manifests as earthquakes at centennial to millennial timescales. Given that…

  8. Potentially induced earthquakes during the early twentieth century in the Los Angeles Basin

    USGS Publications Warehouse

    Hough, Susan E.; Page, Morgan T.

    2016-01-01

    Recent studies have presented evidence that early to mid‐twentieth‐century earthquakes in Oklahoma and Texas were likely induced by fossil fuel production and/or injection of wastewater (Hough and Page, 2015; Frohlich et al., 2016). Considering seismicity from 1935 onward, Hauksson et al. (2015) concluded that there is no evidence for significant induced activity in the greater Los Angeles region between 1935 and the present. To explore a possible association between earthquakes prior to 1935 and oil and gas production, we first revisit the historical catalog and then review contemporary oil industry activities. Although early industry activities did not induce large numbers of earthquakes, we present evidence for an association between the initial oil boom in the greater Los Angeles area and earthquakes between 1915 and 1932, including the damaging 22 June 1920 Inglewood and 8 July 1929 Whittier earthquakes. We further consider whether the 1933 Mw 6.4 Long Beach earthquake might have been induced, and show some evidence that points to a causative relationship between the earthquake and activities in the Huntington Beach oil field. The hypothesis that the Long Beach earthquake was either induced or triggered by an induced foreshock cannot be ruled out. Our results suggest that significant earthquakes in southern California during the early twentieth century might have been associated with industry practices that are no longer employed (i.e., production without water reinjection), and do not necessarily imply a high likelihood of induced earthquakes at the present time.

  9. Earthquake Clusters and Spatio-temporal Migration of earthquakes in Northeastern Tibetan Plateau: a Finite Element Modeling

    NASA Astrophysics Data System (ADS)

    Sun, Y.; Luo, G.

    2017-12-01

    Seismicity in a region is usually characterized by earthquake clusters and earthquake migration along its major fault zones. However, we do not fully understand why and how earthquake clusters and spatio-temporal migration of earthquakes occur. The northeastern Tibetan Plateau is a good example for us to investigate these problems. In this study, we construct and use a three-dimensional viscoelastoplastic finite-element model to simulate earthquake cycles and spatio-temporal migration of earthquakes along major fault zones in northeastern Tibetan Plateau. We calculate stress evolution and fault interactions, and explore effects of topographic loading and viscosity of middle-lower crust and upper mantle on model results. Model results show that earthquakes and fault interactions increase Coulomb stress on the neighboring faults or segments, accelerating the future earthquakes in this region. Thus, earthquakes occur sequentially in a short time, leading to regional earthquake clusters. Through long-term evolution, stresses on some seismogenic faults, which are far apart, may almost simultaneously reach the critical state of fault failure, probably also leading to regional earthquake clusters and earthquake migration. Based on our model synthetic seismic catalog and paleoseismic data, we analyze probability of earthquake migration between major faults in northeastern Tibetan Plateau. We find that following the 1920 M 8.5 Haiyuan earthquake and the 1927 M 8.0 Gulang earthquake, the next big event (M≥7) in northeastern Tibetan Plateau would be most likely to occur on the Haiyuan fault.

  10. Chapter F. The Loma Prieta, California, Earthquake of October 17, 1989 - Marina District

    USGS Publications Warehouse

    O'Rourke, Thomas D.

    1992-01-01

    During the earthquake, a total land area of about 4,300 km2 was shaken with seismic intensities that can cause significant damage to structures. The area of the Marina District of San Francisco is only 4.0 km2--less than 0.1 percent of the area most strongly affected by the earthquake--but its significance with respect to engineering, seismology, and planning far outstrips its proportion of shaken terrain and makes it a centerpiece for lessons learned from the earthquake. The Marina District provides perhaps the most comprehensive case history of seismic effects at a specific site developed for any earthquake. The reports assembled in this chapter, which provide an account of these seismic effects, constitute a unique collection of studies on site, as well as infrastructure and societal, response that cover virtually all aspects of the earthquake, ranging from incoming ground waves to the outgoing airwaves used for emergency communication. The Marina District encompasses the area bounded by San Francisco Bay on the north, the Presidio on the west, and Lombard Street and Van Ness Avenue on the south and east, respectively. Nearly all of the earthquake damage in the Marina District, however, occurred within a considerably smaller area of about 0.75 km2, bounded by San Francisco Bay and Baker, Chestnut, and Buchanan Streets. At least five major aspects of earthquake response in the Marina District are covered by the reports in this chapter: (1) dynamic site response, (2) soil liquefaction, (3) lifeline performance, (4) building performance, and (5) emergency services.

  11. A Triangular Approach to Integrate Research, Education and Practice in Higher Engineering Education

    ERIC Educational Resources Information Center

    Heikkinen, Eetu-Pekka; Jaako, Juha; Hiltunen, Jukka

    2017-01-01

    Separate approaches in engineering education, research and practice are not very useful when preparing students for working life; instead, integration of education, research and industrial practices is needed. A triangular approach (TA) as a method to accomplish this integration and as a method to provide students with integrated expertise is…

  12. Prototype operational earthquake prediction system

    USGS Publications Warehouse

    Spall, Henry

    1986-01-01

    An objective of the U.S. Earthquake Hazards Reduction Act of 1977 is to introduce, into all regions of the country that are subject to large and moderate earthquakes, systems for predicting earthquakes and assessing earthquake risk. In 1985, the USGS developed for the Secretary of the Interior a program for implementation of a prototype operational earthquake prediction system in southern California.

  13. Prevention of strong earthquakes: Goal or utopia?

    NASA Astrophysics Data System (ADS)

    Mukhamediev, Sh. A.

    2010-11-01

    earthquake because of dynamic fault propagation in the intact region. Some additional aspects of prevention of PSE are discussed. We conclude that in the near future, it is too early to consider the problem of prevention of a forthcoming strong earthquake as a practical task; otherwise, the results can prove to be very different from the desired ones. Nevertheless, it makes sense to continue studying this problem. The theoretical research and experimental investigation of the structure and properties of the regions where the prevention of a forthcoming strong earthquake is planned in the future are of primary importance.

  14. A practical model for economic evaluation of tissue-engineered therapies.

    PubMed

    Tan, Tien-En; Peh, Gary S L; Finkelstein, Eric A; Mehta, Jodhbir S

    2015-01-01

    Tissue-engineered therapies are being developed across virtually all fields of medicine. Some of these therapies are already in clinical use, while others are still in clinical trials or the experimental phase. Most initial studies in the evaluation of new therapies focus on demonstration of clinical efficacy. However, cost considerations or economic viability are just as important. Many tissue-engineered therapies have failed to be impactful because of shortcomings in economic competitiveness, rather than clinical efficacy. Furthermore, such economic viability studies should be performed early in the process of development, before significant investment has been made. Cost-minimization analysis combined with sensitivity analysis is a useful model for the economic evaluation of new tissue-engineered therapies. The analysis can be performed early in the development process, and can provide valuable information to guide further investment and research. The utility of the model is illustrated with the practical real-world example of tissue-engineered constructs for corneal endothelial transplantation. The authors have declared no conflicts of interest for this article. © 2015 Wiley Periodicals, Inc.

  15. Perception of earthquake risk in Taiwan: effects of gender and past earthquake experience.

    PubMed

    Kung, Yi-Wen; Chen, Sue-Huei

    2012-09-01

    This study explored how individuals in Taiwan perceive the risk of earthquake and the relationship of past earthquake experience and gender to risk perception. Participants (n= 1,405), including earthquake survivors and those in the general population without prior direct earthquake exposure, were selected and interviewed through a computer-assisted telephone interviewing procedure using a random sampling and stratification method covering all 24 regions of Taiwan. A factor analysis of the interview data yielded a two-factor structure of risk perception in regard to earthquake. The first factor, "personal impact," encompassed perception of threat and fear related to earthquakes. The second factor, "controllability," encompassed a sense of efficacy of self-protection in regard to earthquakes. The findings indicated prior earthquake survivors and females reported higher scores on the personal impact factor than males and those with no prior direct earthquake experience, although there were no group differences on the controllability factor. The findings support that risk perception has multiple components, and suggest that past experience (survivor status) and gender (female) affect the perception of risk. Exploration of potential contributions of other demographic factors such as age, education, and marital status to personal impact, especially for females and survivors, is discussed. Future research on and intervention program with regard to risk perception are suggested accordingly. © 2012 Society for Risk Analysis.

  16. Best Practices in Physics-Based Fault Rupture Models for Seismic Hazard Assessment of Nuclear Installations

    NASA Astrophysics Data System (ADS)

    Dalguer, Luis A.; Fukushima, Yoshimitsu; Irikura, Kojiro; Wu, Changjiang

    2017-09-01

    Inspired by the first workshop on Best Practices in Physics-Based Fault Rupture Models for Seismic Hazard Assessment of Nuclear Installations (BestPSHANI) conducted by the International Atomic Energy Agency (IAEA) on 18-20 November, 2015 in Vienna (http://www-pub.iaea.org/iaeameetings/50896/BestPSHANI), this PAGEOPH topical volume collects several extended articles from this workshop as well as several new contributions. A total of 17 papers have been selected on topics ranging from the seismological aspects of earthquake cycle simulations for source-scaling evaluation, seismic source characterization, source inversion and ground motion modeling (based on finite fault rupture using dynamic, kinematic, stochastic and empirical Green's functions approaches) to the engineering application of simulated ground motion for the analysis of seismic response of structures. These contributions include applications to real earthquakes and description of current practice to assess seismic hazard in terms of nuclear safety in low seismicity areas, as well as proposals for physics-based hazard assessment for critical structures near large earthquakes. Collectively, the papers of this volume highlight the usefulness of physics-based models to evaluate and understand the physical causes of observed and empirical data, as well as to predict ground motion beyond the range of recorded data. Particular importance is given to the validation and verification of the models by comparing synthetic results with observed data and empirical models.

  17. Earthquakes in Alaska

    USGS Publications Warehouse

    Haeussler, Peter J.; Plafker, George

    1995-01-01

    Earthquake risk is high in much of the southern half of Alaska, but it is not the same everywhere. This map shows the overall geologic setting in Alaska that produces earthquakes. The Pacific plate (darker blue) is sliding northwestward past southeastern Alaska and then dives beneath the North American plate (light blue, green, and brown) in southern Alaska, the Alaska Peninsula, and the Aleutian Islands. Most earthquakes are produced where these two plates come into contact and slide past each other. Major earthquakes also occur throughout much of interior Alaska as a result of collision of a piece of crust with the southern margin.

  18. Leveraging geodetic data to reduce losses from earthquakes

    USGS Publications Warehouse

    Murray, Jessica R.; Roeloffs, Evelyn A.; Brooks, Benjamin A.; Langbein, John O.; Leith, William S.; Minson, Sarah E.; Svarc, Jerry L.; Thatcher, Wayne R.

    2018-04-23

    analysis systems. Collaborate on research, development, and operation of affordable, high-precision seafloor geodetic methods that improve earthquake forecasting and event response. Advance computational techniques and instrumentation to enable use of strategies like repeat-pass imagery and low-cost geodetic sensors for earthquake response, monitoring, and research. Engage stakeholders and collaborate with partner institutions to foster operational and research objectives and to safeguard the continued health of geodetic infrastructure upon which we mutually depend. Maintaining a vibrant internal research program provides the foundation by which the EHP can remain an effective and trusted source for earthquake science. Exploiting abundant new data sources, evaluating and assimilating the latest science, and pursuing novel avenues of investigation are means to fulfilling the EHP’s core responsibilities and realizing the important scientific advances envisioned by its scientists. Central to the success of such a research program is engaging personnel with a breadth of competencies and a willingness and ability to adapt these to the program’s evolving priorities, enabling current staff to expand their skills and responsibilities, and planning holistically to meet shared workforce needs. In parallel, collaboration with external partners to support scientific investigations that complement ongoing internal research enables the EHP to strengthen earthquake information products by incorporating alternative perspectives and approaches and to study topics and geographic regions that cannot be adequately covered internally. With commensurate support from technical staff who possess diverse skills, including engineering, information technology, and proficiency in quantitative analysis combined with basic geophysical knowledge, the EHP can achieve the geodetic outcomes identified in this document.

  19. An Investigation on the Crustal Deformations in Istanbul after Eastern Marmara Earthquakes in 1999

    NASA Astrophysics Data System (ADS)

    Ozludemir, M.; Ozyasar, M.

    2008-12-01

    Since the introduction of the GPS technique in the mid-1970s there have been great advances in positioning activities. Today such Global Navigation Satellite System (GNSS) based positioning techniques are widely used in daily geodetic applications. High-order geodetic network measurements are one such application. Such networks are established to provide reliable infrastructure for all kinds of geodetic work, from the production of cadastral plans to surveying during the construction of engineering structures. The positional information obtained in such engineering surveys can be useful for other studies as well. One such field is geodynamics, where this positional information is valuable for understanding the characteristics of tectonic movements. In Turkey, which lies in a tectonically active zone and experiences major earthquakes quite frequently, the positional information obtained in engineering surveys can be very useful for earthquake-related studies. In this paper an example of such engineering surveys is discussed: the Istanbul GPS (Global Positioning System) Network, first established in 1997 and remeasured in 2005. Between these two measurement campaigns, two major earthquakes took place, on August 17 and November 12, 1999, with magnitudes of 7.4 and 7.2, respectively. In the first campaign in 1997, a network of about 700 points was measured, while in the second campaign in 2005 more than 1800 points were positioned; the two campaigns share a number of common points. The network covers the whole Istanbul area of about 6000 km2. All network points are located on the Eurasian plate, to the north of the North Anatolian Fault Zone. In this study, the horizontal and vertical movements are presented and compared with the results obtained in geodynamic studies.

  20. The behaviour of reinforced concrete structure due to earthquake load using Time History analysis Method

    NASA Astrophysics Data System (ADS)

    Afifuddin, M.; Panjaitan, M. A. R.; Ayuna, D.

    2017-02-01

    Earthquakes are among the most dangerous, destructive and unpredictable natural hazards, capable of leaving everything within a few hundred kilometres in complete destruction in seconds. Indonesia occupies a unique position as an earthquake-prone country: it lies at the interaction of three tectonic plates, namely the Indo-Australian, Eurasian and Pacific plates. Banda Aceh is one of the cities located in earthquake-prone areas. Because of the city's vulnerability, some efforts have been made to reduce these unfavourable conditions. Many aspects have been addressed, from community awareness up to engineering solutions; one of them is that all buildings constructed in the city should be designed as earthquake-resistant buildings. The objectives of this research are to observe the response of a reinforced concrete structure to several types of earthquake load, and to assess the performance of the structure after the earthquake loads are applied. After the tsunami in 2004 many buildings have been built; one of them is a hotel located at Simpang Lima. The hotel is a reinforced concrete structure with a height of 34.95 meters and a total building area of 8872.5 m2. At the time of this study it was the tallest building in Banda Aceh.

  1. LIDAR Investigation Of The 2004 Niigata Ken Chuetsu, Japan, Earthquake

    NASA Astrophysics Data System (ADS)

    Kayen, R.; Pack, R. T.; Sugimoto, S.; Tanaka, H.

    2005-12-01

    The 23 October 2004 Niigata Ken Chuetsu, Japan, Mw 6.6 earthquake was the most significant earthquake to affect Japan since the 1995 Kobe earthquake. Forty people were killed, almost 3,000 injured, and numerous landslides destroyed entire upland villages. Landslides and permanent ground deformation caused extensive damage to roads, rail lines and other lifelines, resulting in major economic disruption. The cities and towns most significantly affected by the earthquake were Nagaoka, Ojiya, and the mountainous rural areas of Yamakoshi village and Kawaguchi town. Our EERI team traveled with a tripod-mounted LIDAR (Light Detection and Ranging) unit, a scanning laser that creates ultra-high-resolution 3-D digital terrain models of earthquake-damaged surfaces: the ground, structures, and lifelines. This new technology allows for rapid and remote sensing of damaged terrain. Ground-based LIDAR has an accuracy range of 0.5-2.5 cm, and can illuminate targets up to 400 m away from the sensor. During a single tripod-mounted LIDAR scan of 10 minutes, several million survey points are collected and processed into an ultra-high-resolution terrain model of the damaged ground or structure. There are several benefits in acquiring these LIDAR data in the initial reconnaissance effort after the earthquake. First, we record the detailed failure morphologies of damaged ground and structures in order to make measurements that are either impractical or impossible by conventional survey means. The digital terrain models allow us to enlarge, enhance and rotate data in order to visualize damage in orientations and scales not previously possible. This ability to visualize damage allows us to better understand failure modes. Finally, LIDAR allows us to archive 3-D terrain models so that the engineering community can evaluate analytical and numerical models of deformation potential against detailed field measurements.
Here, we discuss the findings of this 2004 Niigata Chuetsu Earthquake (M6

  2. Quantifying Earthquake Collapse Risk of Tall Steel Braced Frame Buildings Using Rupture-to-Rafters Simulations

    NASA Astrophysics Data System (ADS)

    Mourhatch, Ramses

    the 30-year probability of occurrence of the San Andreas scenario earthquakes using the PEER performance based earthquake engineering framework to determine the probability of exceedance of these limit states over the next 30 years.

  3. Earthquakes, November-December 1973

    USGS Publications Warehouse

    Person, W.J.

    1974-01-01

    Other parts of the world suffered fatalities and significant damage from earthquakes. In Iran, an earthquake killed one person, injured many, and destroyed a number of homes. Earthquake fatalities also occurred in the Azores and in Algeria. 

  4. Health education and promotion at the site of an emergency: experience from the Chinese Wenchuan earthquake response.

    PubMed

    Tian, Xiangyang; Zhao, Genming; Cao, Dequan; Wang, Duoquan; Wang, Liang

    2016-03-01

    Theories and strategies of social mobilization, capacity building, mass and interpersonal communication, as well as risk communication and behavioral change were used to develop health education and promotion campaigns to decrease and prevent injuries and infectious diseases among the survivors of the Wenchuan earthquake in May 2008. We evaluated the effectiveness of the campaigns and short-term interventions using mixed methods. The earthquake survivors' health knowledge, skills, and practice improved significantly with respect to injury protection, food and water safety, environmental and personal hygiene, and disease prevention. No infectious disease outbreaks were reported after the earthquake, and the epidemic level was lower than before the earthquake. After a short-term intervention among the students of Leigu Township Primary and Junior School, the proportion of those with good personal hygiene increased from 59.7% to 98.3% (p < 0.01). Of the sampled survivors from Wenchuan County, 92.3% reported having improved their health knowledge and 54.9% improved their health practice (p < 0.01). Thus, health education and promotion during public health emergencies such as earthquakes play an important role in preventing injuries and infectious diseases among survivors. © The Author(s) 2014.

  5. Developing ShakeCast statistical fragility analysis framework for rapid post-earthquake assessment

    USGS Publications Warehouse

    Lin, K.-W.; Wald, D.J.

    2012-01-01

    When an earthquake occurs, the U. S. Geological Survey (USGS) ShakeMap estimates the extent of potentially damaging shaking and provides overall information regarding the affected areas. The USGS ShakeCast system is a freely-available, post-earthquake situational awareness application that automatically retrieves earthquake shaking data from ShakeMap, compares intensity measures against users’ facilities, sends notifications of potential damage to responsible parties, and generates facility damage assessment maps and other web-based products for emergency managers and responders. We describe notable improvements of the ShakeMap and the ShakeCast applications. We present a design for comprehensive fragility implementation, integrating spatially-varying ground-motion uncertainties into fragility curves for ShakeCast operations. For each facility, an overall inspection priority (or damage assessment) is assigned on the basis of combined component-based fragility curves using pre-defined logic. While regular ShakeCast users receive overall inspection priority designations for each facility, engineers can access the full fragility analyses for further evaluation.
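    The component-based fragility analysis described above can be sketched in a few lines. This is a minimal illustration, not ShakeCast's actual implementation: the lognormal fragility form and the folding of ground-motion dispersion into the fragility dispersion are standard assumptions in this literature, and the function names are hypothetical.

    ```python
    import math

    def lognormal_fragility(im, median, beta):
        """P(damage state reached or exceeded | intensity measure im)
        for a lognormal fragility curve with given median and dispersion."""
        return 0.5 * (1.0 + math.erf(math.log(im / median) / (beta * math.sqrt(2.0))))

    def fragility_with_gm_uncertainty(im_hat, median, beta, sigma_gm):
        """Fold lognormal ground-motion uncertainty (sigma_gm on ln IM) into
        the fragility dispersion: convolving the two lognormals widens the
        curve to sqrt(beta^2 + sigma_gm^2) without shifting the median."""
        return lognormal_fragility(im_hat, median, math.sqrt(beta**2 + sigma_gm**2))
    ```

    At an intensity equal to the median the exceedance probability is 0.5; adding ground-motion uncertainty flattens the curve but leaves that median crossing unchanged, which is why spatially varying ShakeMap uncertainty can be integrated per facility.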

  6. Comparative study of earthquake-related and non-earthquake-related head traumas using multidetector computed tomography

    PubMed Central

    Chu, Zhi-gang; Yang, Zhi-gang; Dong, Zhi-hui; Chen, Tian-wu; Zhu, Zhi-yu; Shao, Heng

    2011-01-01

    OBJECTIVE: The features of earthquake-related head injuries may be different from those of injuries obtained in daily life because of differences in circumstances. We aim to compare the features of head traumas caused by the Sichuan earthquake with those of other common head traumas using multidetector computed tomography. METHODS: In total, 221 patients with earthquake-related head traumas (the earthquake group) and 221 patients with other common head traumas (the non-earthquake group) were enrolled in our study, and their computed tomographic findings were compared. We focused on the differences between fractures and intracranial injuries and the relationships between extracranial and intracranial injuries. RESULTS: More earthquake-related cases had only extracranial soft tissue injuries (50.7% vs. 26.2%, RR = 1.9), and fewer cases had intracranial injuries (17.2% vs. 50.7%, RR = 0.3) compared with the non-earthquake group. For patients with fractures and intracranial injuries, there were fewer cases with craniocerebral injuries in the earthquake group (60.6% vs. 77.9%, RR = 0.8), and the earthquake-injured patients had fewer fractures and intracranial injuries overall (1.5±0.9 vs. 2.5±1.8; 1.3±0.5 vs. 2.1±1.1). Compared with the non-earthquake group, the incidences of soft tissue injuries and cranial fractures combined with intracranial injuries in the earthquake group were significantly lower (9.8% vs. 43.7%, RR = 0.2; 35.1% vs. 82.2%, RR = 0.4). CONCLUSION: As depicted with computed tomography, earthquake-related head traumas in survivors were milder, and isolated extracranial injuries were more common in earthquake-related head traumas than in non-earthquake-related injuries, which may have been the result of different injury causes, mechanisms and settings. PMID:22012045
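    The RR values quoted above are simple risk ratios. A minimal sketch, where the example counts are merely back-calculated from the reported percentages for illustration:

    ```python
    def relative_risk(exposed_cases, exposed_total, control_cases, control_total):
        """Risk ratio: incidence proportion in one group divided by the
        incidence proportion in the comparison group."""
        return (exposed_cases / exposed_total) / (control_cases / control_total)

    # e.g. 50.7% vs. 26.2% in two groups of 221 patients each
    # corresponds to roughly 112/221 vs. 58/221, giving RR about 1.9
    ```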

  7. Development of optimization-based probabilistic earthquake scenarios for the city of Tehran

    NASA Astrophysics Data System (ADS)

    Zolfaghari, M. R.; Peyghaleh, E.

    2016-01-01

    This paper presents the methodology and a practical example for the application of an optimization process to select earthquake scenarios that best represent probabilistic earthquake hazard in a given region. The method is based on simulation of a large dataset of potential earthquakes representing the long-term seismotectonic characteristics of the region. The simulation process uses Monte-Carlo simulation and regional seismogenic source parameters to generate a synthetic earthquake catalogue consisting of a large number of earthquakes, each characterized by magnitude, location, focal depth and fault characteristics. Such a catalogue provides full distributions of events in time, space and size; however, it demands large computational power when used for risk assessment, particularly when other sources of uncertainty are involved in the process. To reduce the number of selected earthquake scenarios, a mixed-integer linear programming formulation is developed in this study. This approach yields a reduced set of optimization-based probabilistic earthquake scenarios, while maintaining the shape of the hazard curves and the full probabilistic picture by minimizing the error between hazard curves derived from the full and reduced sets of synthetic earthquake scenarios. To test the model, the regional seismotectonic and seismogenic characteristics of northern Iran are used to simulate a set of 10,000 years' worth of events consisting of some 84,000 earthquakes. The optimization model is then run multiple times with various input data, taking the probabilistic seismic hazard for the city of Tehran as the main constraint. The sensitivity of the selected scenarios to the user-specified site/return-period error weight is also assessed. The methodology could reduce the run time of full probabilistic earthquake studies such as seismic hazard and risk assessment. The reduced set is representative of the contributions of all possible earthquakes; however, it requires far less
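    The reduction idea — pick a small subset of synthetic scenarios whose reweighted hazard curve tracks the full-catalog curve — can be sketched with a greedy heuristic standing in for the paper's mixed-integer linear program. All names, the single intensity measure per scenario, and the greedy selection itself are illustrative assumptions:

    ```python
    import numpy as np

    def hazard_curve(rates, ims, im_grid):
        """Annual rate of exceeding each ground-motion level in im_grid,
        given per-scenario annual rates and intensity measures."""
        return np.array([rates[ims >= x].sum() for x in im_grid])

    def greedy_reduce(rates, ims, im_grid, k):
        """Pick k scenarios, with rates rescaled to preserve the total,
        minimizing the misfit to the full-catalog hazard curve."""
        target = hazard_curve(rates, ims, im_grid)
        chosen, remaining = [], list(range(len(ims)))
        for _ in range(k):
            best, best_err = None, np.inf
            for j in remaining:
                idx = chosen + [j]
                w = rates[idx] * rates.sum() / rates[idx].sum()  # reweight subset
                err = np.abs(hazard_curve(w, ims[idx], im_grid) - target).sum()
                if err < best_err:
                    best, best_err = j, err
            chosen.append(best)
            remaining.remove(best)
        return chosen
    ```

    The actual formulation optimizes selection and weights jointly, with user-specified error weights per site and return period; the greedy loop only conveys the objective of matching the full-catalog hazard curve with far fewer scenarios.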

  8. Learning Styles in Engineering Education: The Quest to Improve Didactic Practices

    ERIC Educational Resources Information Center

    Holvikivi, Jaana

    2007-01-01

    This article discusses a dilemma that engineering educators encounter when attempting to develop pedagogical methods: that of finding efficient and scientifically valid didactic practices. The multitude of methods offered by educational consultants is perplexing. Moreover, the popularity of commercially offered solutions such as learning styles…

  9. Earthquakes, May-June 1991

    USGS Publications Warehouse

    Person, W.J.

    1992-01-01

    In the United States, a magnitude 5.8 earthquake in southern California on June 28 killed two people and caused considerable damage. Strong earthquakes hit Alaska on May 1 and May 30; the May 1 earthquake caused some minor damage. 

  10. Nowcasting Earthquakes: A Comparison of Induced Earthquakes in Oklahoma and at the Geysers, California

    NASA Astrophysics Data System (ADS)

    Luginbuhl, Molly; Rundle, John B.; Hawkins, Angela; Turcotte, Donald L.

    2018-01-01

    Nowcasting is a new method of statistically classifying seismicity and seismic risk (Rundle et al. 2016). In this paper, the method is applied to the induced seismicity at the Geysers geothermal region in California and the induced seismicity due to fluid injection in Oklahoma. Nowcasting utilizes the catalogs of seismicity in these regions. Two earthquake magnitudes are selected, one large, say Mλ ≥ 4, and one small, say Mσ ≥ 2. The method utilizes the number of small earthquakes that occur between pairs of large earthquakes, and the cumulative probability distribution of these interevent counts is obtained. The earthquake potential score (EPS) is defined by the number of small earthquakes that have occurred since the last large earthquake: the point where this number falls on the cumulative probability distribution of interevent counts defines the EPS. A major advantage of nowcasting is that it utilizes "natural time", earthquake counts between events, rather than clock time. Thus, it is not necessary to decluster aftershocks, and the results remain applicable if the level of induced seismicity varies in time. The application of natural time to the accumulation of seismic hazard depends on the applicability of Gutenberg-Richter (GR) scaling: the increasing number of small earthquakes that occur after a large earthquake can be scaled to give the risk of a large earthquake occurring. To illustrate our approach, we utilize the number of Mσ ≥ 2.75 earthquakes in Oklahoma to nowcast the number of Mλ ≥ 4.0 earthquakes in Oklahoma. The applicability of the scaling is illustrated during the rapid build-up of injection-induced seismicity between 2012 and 2016, and the subsequent reduction in seismicity associated with a reduction in fluid injection. The same method is applied to the geothermal-induced seismicity at the Geysers, California, for comparison.
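    The EPS construction described above can be sketched directly: count small events between consecutive large events, then rank the current count within that empirical distribution. This is a simplified reading of the published method; the function name, thresholds, and flat catalog layout are illustrative assumptions.

    ```python
    def earthquake_potential_score(mags, m_small, m_large):
        """Nowcasting EPS: the fraction of past interevent small-event counts
        that are <= the current count since the last large event.
        `mags` is the catalog magnitude sequence in time order."""
        counts, n = [], 0
        for m in mags:
            if m >= m_large:
                counts.append(n)  # close the interevent window
                n = 0
            elif m >= m_small:
                n += 1
        if not counts:
            return None  # no large event yet, EPS undefined
        return sum(c <= n for c in counts) / len(counts)
    ```

    Because the score is a rank in counts ("natural time") rather than clock time, no aftershock declustering is needed and a time-varying rate of induced seismicity does not invalidate it.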

  11. Lessons of L'Aquila for Operational Earthquake Forecasting

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.

    2012-12-01

    The L'Aquila earthquake of 6 Apr 2009 (magnitude 6.3) killed 309 people and left tens of thousands homeless. The mainshock was preceded by a vigorous seismic sequence that prompted informal earthquake predictions and evacuations. In an attempt to calm the population, the Italian Department of Civil Protection (DPC) convened its Commission on the Forecasting and Prevention of Major Risk (MRC) in L'Aquila on 31 March 2009 and issued statements about the hazard that were widely received as an "anti-alarm"; i.e., a deterministic prediction that there would not be a major earthquake. On October 23, 2012, a court in L'Aquila convicted the vice-director of DPC and six scientists and engineers who attended the MRC meeting on charges of criminal manslaughter, and it sentenced each to six years in prison. A few weeks after the L'Aquila disaster, the Italian government convened an International Commission on Earthquake Forecasting for Civil Protection (ICEF) with the mandate to assess the status of short-term forecasting methods and to recommend how they should be used in civil protection. The ICEF, which I chaired, issued its findings and recommendations on 2 Oct 2009 and published its final report, "Operational Earthquake Forecasting: Status of Knowledge and Guidelines for Implementation," in Aug 2011 (www.annalsofgeophysics.eu/index.php/annals/article/view/5350). As defined by the Commission, operational earthquake forecasting (OEF) involves two key activities: the continual updating of authoritative information about the future occurrence of potentially damaging earthquakes, and the officially sanctioned dissemination of this information to enhance earthquake preparedness in threatened communities. 
Among the main lessons of L'Aquila is the need to separate the role of science advisors, whose job is to provide objective information about natural hazards, from that of civil decision-makers who must weigh the benefits of protective actions against the costs of false alarms.

  12. Earthquakes; January-February, 1979

    USGS Publications Warehouse

    Person, W.J.

    1979-01-01

    The first major earthquake (magnitude 7.0 to 7.9) of the year struck in southeastern Alaska in a sparsely populated area on February 28. On January 16, Iran experienced the first destructive earthquake of the year causing a number of casualties and considerable damage. Peru was hit by a destructive earthquake on February 16 that left casualties and damage. A number of earthquakes were experienced in parts of the United States, but only minor damage was reported.

  13. Preliminary results on earthquake triggered landslides for the Haiti earthquake (January 2010)

    NASA Astrophysics Data System (ADS)

    van Westen, Cees; Gorum, Tolga

    2010-05-01

    This study presents the first results of an analysis of the landslides triggered by the Ms 7.0 Haiti earthquake that occurred on January 12, 2010 in the boundary region of the Caribbean plate and the North American plate. The fault is a left-lateral strike-slip fault with a clear surface expression. According to the USGS earthquake information, the Enriquillo-Plantain Garden fault system has not produced any major earthquake in the last 100 years; historical earthquakes are known from 1860, 1770, 1761, 1751, 1684, 1673, and 1618, though none of these has been confirmed in the field as associated with this fault. We used high-resolution satellite imagery available for the pre- and post-earthquake situations, which was made freely available for the response and rescue operations, and we made an interpretation of all co-seismic landslides in the epicentral area. We conclude that the earthquake mainly triggered landslides on the northern slopes of the fault-related valley and in a number of isolated areas. The earthquake apparently did not trigger many visible landslides within the slum areas on the slopes in the southern part of Port-au-Prince and Carrefour. We also used ASTER DEM information to relate the landslide occurrences to DEM derivatives.

  14. Limitation of the Predominant-Period Estimator for Earthquake Early Warning and the Initial Rupture of Earthquakes

    NASA Astrophysics Data System (ADS)

    Yamada, T.; Ide, S.

    2007-12-01

    Earthquake early warning is an important and challenging issue for the reduction of seismic damage, especially for the mitigation of human suffering. One of the most important problems in earthquake early warning systems is how quickly we can estimate the final size of an earthquake after we observe the ground motion. This is related to the question of whether the initial rupture of an earthquake carries information about its final size. Nakamura (1988) developed the Urgent Earthquake Detection and Alarm System (UrEDAS), which calculates the predominant period of the P wave (τp) and estimates the magnitude of an earthquake immediately after the P-wave arrival from the value of τpmax, the maximum value of τp. A similar approach has been adopted by other earthquake alarm systems (e.g., Allen and Kanamori (2003)). To investigate the characteristics of the parameter τp and the effect of the length of the time window (TW) in the τpmax calculation, we analyze high-frequency recordings of earthquakes at very close distances in the Mponeng mine in South Africa. We find that values of τpmax have upper and lower limits. For larger earthquakes whose source durations are longer than TW, the values of τpmax have an upper limit which depends on TW. On the other hand, the values for smaller earthquakes have a lower limit which is proportional to the sampling interval. For intermediate earthquakes, the values of τpmax are close to their typical source durations. These two limits and the slope for intermediate earthquakes yield an artificial final-size dependence of τpmax over a wide size range. The parameter τpmax is useful for detecting large earthquakes and broadcasting earthquake early warnings. However, its dependence on the final size of earthquakes does not imply that the earthquake rupture is deterministic, because τpmax does not always have a direct relation to the physical quantities of an earthquake.
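    The τp estimator referenced above has a well-known recursive form (Nakamura 1988; Allen and Kanamori 2003): τp,i = 2π√(Xi/Di), where Xi and Di are exponentially smoothed sums of the squared signal and its squared time derivative. A sketch under those assumptions; the smoothing constant α and the function signature are illustrative:

    ```python
    import math

    def predominant_period(velocity, dt, alpha=0.99):
        """Recursive predominant-period estimate tau_p for a velocity trace:
        tau_p,i = 2*pi*sqrt(X_i / D_i), with X and D exponentially smoothed
        squared amplitude and squared derivative, respectively."""
        x_s, d_s, taus = 0.0, 0.0, []
        prev = velocity[0]
        for v in velocity[1:]:
            deriv = (v - prev) / dt          # finite-difference derivative
            x_s = alpha * x_s + v * v        # smoothed squared amplitude
            d_s = alpha * d_s + deriv * deriv
            taus.append(2.0 * math.pi * math.sqrt(x_s / d_s) if d_s > 0 else 0.0)
            prev = v
        return taus
    ```

    For a pure sinusoid of period T the estimate converges toward T; τpmax over a time window TW is then the running maximum of this sequence, which is where the TW-dependent upper limit discussed in the abstract arises.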

  15. Sun, Moon and Earthquakes

    NASA Astrophysics Data System (ADS)

    Kolvankar, V. G.

    2013-12-01

    During a study conducted to find the effect of Earth tides on the occurrence of earthquakes, for small areas [typically 1000 km × 1000 km] of high-seismicity regions, it was noticed that the Sun's position in terms of universal time [GMT] shows links to the sum of EMD [longitude of earthquake location - longitude of Moon's footprint on Earth] and SEM [Sun-Earth-Moon angle]. This paper provides the details of this relationship after studying earthquake data for over forty high-seismicity regions of the world. It was found that over 98% of the earthquakes for these different regions, examined for the period 1973-2008, show a direct relationship between the Sun's position [GMT] and [EMD+SEM]. As the time changes from 00-24 hours, the factor [EMD+SEM] changes through 360 degrees, and plotting these two variables for earthquakes from different small regions reveals a simple 45-degree straight-line relationship between them. This relationship was tested for all earthquakes and earthquake sequences of magnitude 2.0 and above. This study conclusively proves how the Sun and the Moon govern all earthquakes. Fig. 12 [A+B]: the left-hand figure provides a 24-hour plot for forty consecutive days including the main event (00:58:23 on 26.12.2004, Lat. +3.30, Long. +95.98, Mb 9.0, EQ count 376); the right-hand figure provides an earthquake plot of (EMD+SEM) vs GMT timings for the same data. All 376 events including the main event faithfully follow the straight-line curve.

  16. Are Earthquakes Predictable? A Study on Magnitude Correlations in Earthquake Catalog and Experimental Data

    NASA Astrophysics Data System (ADS)

    Stavrianaki, K.; Ross, G.; Sammonds, P. R.

    2015-12-01

    The clustering of earthquakes in time and space is widely accepted; however, the existence of correlations in earthquake magnitudes is more questionable. In standard models of seismic activity, it is usually assumed that magnitudes are independent and therefore in principle unpredictable. Our work seeks to test this assumption by analysing magnitude correlations between earthquakes and their aftershocks. To separate mainshocks from aftershocks, we perform stochastic declustering based on the widely used Epidemic Type Aftershock Sequence (ETAS) model, which allows us to compare the average magnitudes of aftershock sequences to those of their mainshocks. The results of the earthquake magnitude correlations were compared with acoustic emissions (AE) from laboratory analog experiments, as fracturing generates both AE at the laboratory scale and earthquakes on a crustal scale. Constant stress and constant strain rate experiments were performed on Darley Dale sandstone under confining pressure to simulate depth of burial. Microcracking activity inside the rock volume was analyzed by the AE technique as a proxy for earthquakes. Applying the ETAS model to experimental data allowed us to validate our results and provide for the first time a holistic view of the correlation of earthquake magnitudes. Additionally, we investigate the relationship between the conditional intensity estimates of the ETAS model and the earthquake magnitudes; a positive relation would suggest the existence of magnitude correlations. The aim of this study is to detect any dependency between the magnitudes of aftershocks and the earthquakes that trigger them.
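    The conditional intensity on which ETAS declustering rests has the standard form λ(t) = μ + Σ_{ti<t} K·exp(α(mi − m0))·(t − ti + c)^(−p): a background rate plus Omori-Utsu aftershock contributions from all prior events. A direct sketch, with illustrative parameter values:

    ```python
    import math

    def etas_intensity(t, events, mu, K, alpha, c, p, m0):
        """ETAS conditional intensity lambda(t): background rate mu plus
        magnitude-scaled Omori-Utsu decay terms from every prior event.
        `events` is a list of (time, magnitude) pairs."""
        rate = mu
        for ti, mi in events:
            if ti < t:
                rate += K * math.exp(alpha * (mi - m0)) * (t - ti + c) ** (-p)
        return rate
    ```

    In stochastic declustering, each event is assigned to the background or to a parent event with probability proportional to the corresponding term of this sum, which is how mainshocks are separated from aftershocks before comparing magnitudes.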

  17. Statistical validation of earthquake related observations

    NASA Astrophysics Data System (ADS)

    Kossobokov, V. G.

    2011-12-01

    The confirmed fractal nature of earthquakes and their distribution in space and time implies that many traditional estimations of seismic hazard (from term-less to short-term ones) are usually based on erroneous assumptions of easily tractable or, conversely, delicately designed models. The widespread practice of deceptive modeling, considered a "reasonable proxy" of the natural seismic process, leads to seismic hazard assessment of unknown quality, whose errors propagate non-linearly into the resulting estimates of risk and, eventually, into unexpected societal losses of unacceptable level. Studies aimed at the forecast/prediction of earthquakes must include validation in retrospective (at least) and, eventually, prospective tests. In the absence of such control a suggested "precursor/signal" remains a "candidate" whose link to the target seismic event is a model assumption. Predicting in advance is the only decisive test of forecasts/predictions and, therefore, the score-card of any "established precursor/signal", represented by the empirical probabilities of alarms and failures-to-predict achieved in prospective testing, must prove statistical significance by rejecting the null hypothesis of random coincidental occurrence in advance of target earthquakes. We reiterate the suggestion of the so-called "Seismic Roulette" null hypothesis as the most adequate undisturbed random alternative accounting for the empirical spatial distribution of earthquakes: (i) consider a roulette wheel with as many sectors as the number of earthquake locations from a sample catalog representing the seismic locus, one sector per location; (ii) make your bet according to the prediction (i.e., determine which locations are inside the area of alarm, and put one chip in each of the corresponding sectors); (iii) Nature turns the wheel; (iv) accumulate statistics of wins and losses along with the number of chips spent. If a precursor in charge of prediction exposes an imperfection of Seismic Roulette then, having in mind
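    Steps (i)-(iv) of the roulette amount to a Monte-Carlo significance test: under the null, each target event lands on a catalog location chosen at random, so a hit occurs with probability equal to the alarmed fraction of the catalog. A minimal sketch under those simplifying assumptions (the function name and the independent-spins approximation are illustrative):

    ```python
    import random

    def seismic_roulette_pvalue(catalog, alarm, n_targets, n_hits,
                                trials=20000, seed=1):
        """Null-hypothesis p-value of observing >= n_hits alarmed-area hits
        among n_targets target events, when each event independently lands
        on a uniformly random catalog location ("Seismic Roulette")."""
        rng = random.Random(seed)
        p_hit = sum(loc in alarm for loc in catalog) / len(catalog)
        exceed = 0
        for _ in range(trials):
            hits = sum(rng.random() < p_hit for _ in range(n_targets))
            if hits >= n_hits:
                exceed += 1
        return exceed / trials
    ```

    A small p-value rejects random coincidental occurrence, which is the statistical significance the abstract demands of any "established precursor/signal".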

  18. Earthquake Early Warning and Public Policy: Opportunities and Challenges

    NASA Astrophysics Data System (ADS)

    Goltz, J. D.; Bourque, L.; Tierney, K.; Riopelle, D.; Shoaf, K.; Seligson, H.; Flores, P.

    2003-12-01

    Development of an earthquake early warning capability and pilot project were objectives of TriNet, a 5-year (1997-2001) FEMA-funded project to develop a state-of-the-art digital seismic network in southern California. In parallel with research to assemble a protocol for rapid analysis of earthquake data and transmission of a signal by TriNet scientists and engineers, the public policy, communication and educational issues inherent in implementation of an earthquake early warning system were addressed by TriNet's outreach component. These studies included: 1) a survey that identified potential users of an earthquake early warning system and how an earthquake early warning might be used in responding to an event, 2) a review of warning systems and communication issues associated with other natural hazards and how lessons learned might be applied to an alerting system for earthquakes, 3) an analysis of organization, management and public policy issues that must be addressed if a broad-based warning system is to be developed and 4) a plan to provide earthquake early warnings to a small number of organizations in southern California as an experimental prototype. These studies provided needed insights into the social and cultural environment in which this new technology will be introduced, an environment with opportunities to enhance our response capabilities but also an environment with significant barriers to overcome to achieve a system that can be sustained and supported. In this presentation we will address the main public policy issues that were subjects of analysis in these studies. They include a discussion of the possible division of functions among organizations likely to be the principal partners in the management of an earthquake early warning system. Drawing on lessons learned from warning systems for other hazards, we will review the potential impacts of false alarms and missed events on warning system credibility, the acceptability of fully automated

  19. Putting down roots in earthquake country-Your handbook for earthquakes in the Central United States

    USGS Publications Warehouse

    Contributors: Dart, Richard; McCarthy, Jill; McCallister, Natasha; Williams, Robert A.

    2011-01-01

    This handbook provides information to residents of the Central United States about the threat of earthquakes in that area, particularly along the New Madrid seismic zone, and explains how to prepare for, survive, and recover from such events. It explains why those residents should be concerned about earthquakes and describes what one can expect during and after an earthquake. Much is known about the threat of earthquakes in the Central United States, including where they are likely to occur and what can be done to reduce losses from future earthquakes, but not enough has been done to prepare for them. The handbook describes preparations that individual residents can take before an earthquake to stay safe and protect property.

  20. International Collaboration for Strengthening Capacity to Assess Earthquake Hazard in Indonesia

    NASA Astrophysics Data System (ADS)

    Cummins, P. R.; Hidayati, S.; Suhardjono, S.; Meilano, I.; Natawidjaja, D.

    2012-12-01

    Indonesia has experienced a dramatic increase in earthquake risk due to rapid population growth in the 20th century, much of it occurring in areas near the subduction zone plate boundaries that are prone to earthquake occurrence. While recent seismic hazard assessments have resulted in better building codes that can inform safer building practices, many of the fundamental parameters controlling earthquake occurrence and ground shaking - e.g., fault slip rates, earthquake scaling relations, ground motion prediction equations, and site response - could still be better constrained. In recognition of the need to improve the level of information on which seismic hazard assessments are based, the Australian Agency for International Development (AusAID) and Indonesia's National Agency for Disaster Management (BNPB), through the Australia-Indonesia Facility for Disaster Reduction, have initiated a 4-year project designed to strengthen the Government of Indonesia's capacity to reliably assess earthquake hazard. This project is a collaboration of Australian institutions including Geoscience Australia and the Australian National University, with Indonesian government agencies and universities including the Agency for Meteorology, Climatology and Geophysics, the Geological Agency, the Indonesian Institute of Sciences, and Bandung Institute of Technology. Effective earthquake hazard assessment requires input from many different types of research, ranging from geological studies of active faults and seismological studies of crustal structure, earthquake sources, and ground motion, to PSHA methodology and geodetic studies of crustal strain rates. The project is a large and diverse one that spans all these components, which will be briefly reviewed in this presentation.

  1. Maximum magnitude of injection-induced earthquakes: A criterion to assess the influence of pressure migration along faults

    NASA Astrophysics Data System (ADS)

    Norbeck, Jack H.; Horne, Roland N.

    2018-05-01

    The maximum expected earthquake magnitude is an important parameter in seismic hazard and risk analysis because of its strong influence on ground motion. In the context of injection-induced seismicity, the processes that control how large an earthquake will grow may be influenced by operational factors under engineering control as well as natural tectonic factors. Determining the relative influence of these effects on maximum magnitude will impact the design and implementation of induced seismicity management strategies. In this work, we apply a numerical model that considers the coupled interactions of fluid flow in faulted porous media and quasidynamic elasticity to investigate the earthquake nucleation, rupture, and arrest processes for cases of induced seismicity. We find that under certain conditions, earthquake ruptures are confined to a pressurized region along the fault with a length-scale that is set by injection operations. However, earthquakes are sometimes able to propagate as sustained ruptures outside of the zone that experienced a pressure perturbation. We propose a faulting criterion that depends primarily on the state of stress and the earthquake stress drop to characterize the transition between pressure-constrained and runaway rupture behavior.
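    As a rough, illustrative reading of this pressure-constrained vs. runaway distinction, the toy classifier below compares the ambient shear stress on a fault with its residual dynamic strength. This is a sketch under stated assumptions, not the paper's actual faulting criterion; the function name and all numbers are hypothetical.

```python
def rupture_mode(tau_ambient, sigma_eff, f_dynamic):
    """Toy classifier for induced-rupture behavior (illustrative only).

    Assumption: if the ambient tectonic shear stress exceeds the residual
    dynamic strength of the fault (dynamic friction times effective normal
    stress), rupture can be sustained outside the pressurized zone
    ("runaway"); otherwise it arrests near the edge of the pressure
    perturbation ("pressure-constrained").
    """
    residual_strength = f_dynamic * sigma_eff
    return "runaway" if tau_ambient > residual_strength else "pressure-constrained"


# Illustrative numbers only (MPa): 40 MPa ambient shear stress,
# 100 MPa effective normal stress, two possible dynamic friction values
print(rupture_mode(40.0, 100.0, 0.3))  # → runaway
print(rupture_mode(40.0, 100.0, 0.5))  # → pressure-constrained
```

    Under this toy reading, injection operations set the size of the pressurized patch, but whether rupture stops there depends on the ambient stress state rather than on the injection itself.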

  2. Repeating Earthquakes Following an Mw 4.4 Earthquake Near Luther, Oklahoma

    NASA Astrophysics Data System (ADS)

    Clements, T.; Keranen, K. M.; Savage, H. M.

    2015-12-01

    An Mw 4.4 earthquake on April 16, 2013 near Luther, OK was one of the earliest M4+ earthquakes in central Oklahoma, following the Prague sequence in 2011. A network of four local broadband seismometers deployed within a day of the Mw 4.4 event, along with six Oklahoma NetQuakes stations, recorded more than 500 aftershocks in the two weeks following the Luther earthquake. Here we use HypoDD (Waldhauser & Ellsworth, 2000) and waveform cross-correlation to obtain precise aftershock locations. The location uncertainty, calculated using the SVD method in HypoDD, is ~15 m horizontally and ~35 m vertically. The earthquakes define a near-vertical, NE-SW striking fault plane. Events occur at depths from 2 km to 3.5 km within the granitic basement, with a small fraction of events shallower, near the sediment-basement interface. Earthquakes occur within a zone of ~200 m thickness on either side of the best-fitting fault surface. We use an equivalency class algorithm to identify clusters of repeating events, defined as event pairs with median three-component correlation > 0.97 across common stations (Aster & Scott, 1993). Repeating events occur as doublets of only two events in over 50% of cases; overall, 41% of the earthquakes recorded occur as repeating events. The recurrence intervals for the repeating events range from minutes to days, with common recurrence intervals of less than two minutes. While clusters occur in tight dimensions, commonly of 80 m x 200 m, aftershocks occur in 3 distinct ~2 km x 2 km patches along the fault. Our analysis suggests that with rapidly deployed local arrays, the plethora of ~Mw 4 earthquakes occurring in Oklahoma and southern Kansas can be used to investigate the earthquake rupture process and the role of damage zones.
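    The equivalency-class grouping described in this record can be sketched as a small union-find routine: event pairs whose median correlation exceeds the threshold are merged into the same class. The 0.97 threshold follows the abstract; the event indices and correlation values below are hypothetical illustration data, not the study's catalog.

```python
def find(parent, i):
    """Find the root of event i, with path compression."""
    while parent[i] != i:
        parent[i] = parent[parent[i]]
        i = parent[i]
    return i

def cluster_repeaters(n_events, pairs, threshold=0.97):
    """Group events into repeating-event clusters (equivalency classes).

    pairs: (event_a, event_b, median_cc) tuples; events whose median
    cross-correlation exceeds the threshold are merged into one class.
    Returns only multi-event clusters, i.e. the repeaters.
    """
    parent = list(range(n_events))
    for i, j, cc in pairs:
        if cc > threshold:
            ri, rj = find(parent, i), find(parent, j)
            parent[ri] = rj  # union the two classes
    groups = {}
    for i in range(n_events):
        groups.setdefault(find(parent, i), []).append(i)
    return [g for g in groups.values() if len(g) > 1]


# Hypothetical correlation pairs (event_a, event_b, median_cc):
pairs = [(0, 1, 0.98), (1, 2, 0.975), (3, 4, 0.99), (2, 5, 0.50)]
print(cluster_repeaters(6, pairs))  # → [[0, 1, 2], [3, 4]]
```

    Note that the transitivity built into union-find is what turns pairwise similarity (0-1 and 1-2) into a three-event cluster even when events 0 and 2 were never directly compared.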

  3. Reflections on Communicating Science during the Canterbury Earthquake Sequence of 2010-2011, New Zealand

    NASA Astrophysics Data System (ADS)

    Wein, A. M.; Berryman, K. R.; Jolly, G. E.; Brackley, H. L.; Gledhill, K. R.

    2015-12-01

    The 2010-2011 Canterbury Earthquake Sequence began with the 4th September 2010 Darfield earthquake (Mw 7.1). Perhaps because there were no deaths, the mood of the city and the government was that high standards of earthquake engineering in New Zealand protected us, and there was a confident attitude to response and recovery. The demand for science and engineering information was of interest but not seen as crucial to policy, business or the public. The 22nd February 2011 Christchurch earthquake (Mw 6.2) changed all that; there was a significant death toll and many injuries. There was widespread collapse of older unreinforced masonry buildings and two relatively modern multi-storey buildings, and major disruption to infrastructure. The contrast in the interest and relevance of the science could not have been greater compared to 5 months previously. Magnitude 5+ aftershocks over a 20-month period resulted in confusion, stress, an inability to define a recovery trajectory, major concerns about whether insurers and reinsurers would continue to provide cover, very high levels of media interest from New Zealand and around the world, and high levels of political risk. As the aftershocks continued there was widespread speculation as to what the future held. During the sequence, the science and engineering sector sought to coordinate and offer timely and integrated advice. However, other than GeoNet, the national geophysical monitoring network, there were few resources devoted to communication, with the result that it was almost always reactive. With hindsight we have identified the need to resource information gathering and synthesis, execute strategic assessments of stakeholder needs, undertake proactive communication, and develop specific information packages for the diversity of users. Overall this means substantially increased resources. Planning is now underway for the science sector to adopt the New Zealand standardised CIMS (Coordinated Incident Management System) structure for

  4. Earthquake Hazards.

    ERIC Educational Resources Information Center

    Donovan, Neville

    1979-01-01

    Provides a survey and a review of earthquake activity and global tectonics from the advancement of the theory of continental drift to the present. Topics include: an identification of the major seismic regions of the earth, seismic measurement techniques, seismic design criteria for buildings, and the prediction of earthquakes. (BT)

  5. Insights from industry: a quantitative analysis of engineers' perceptions of empathy and care within their practice

    NASA Astrophysics Data System (ADS)

    Hess, Justin L.; Strobel, Johannes; Pan, Rui Celia; Wachter Morris, Carrie A.

    2017-11-01

    This study focuses on two seldom-investigated skills or dispositions aligned with engineering habits of mind - empathy and care. In order to conduct quantitative research, we designed, explored the underlying structure of, validated, and tested the reliability of the Empathy and Care Questionnaire (ECQ), a new psychometric instrument. In the second part, we used the ECQ to explore the perceptions of empathy and care of alumni/ae of an internationally ranked US institution, along with how perceptions differed by work experience and gender. Results show that participants perceived empathy and care to be important in multiple respects, most notably in relational aspects of engineering practice. Engineers with more engineering experience were more likely to perceive empathy and care as existing in engineering practice and as important to their work. While these phenomena are sometimes depicted as feminine qualities, we found no gender differences among our respondents.

  6. Earthquakes, May-June 1981

    USGS Publications Warehouse

    Person, W.J.

    1981-01-01

    The months of May and June were somewhat quiet, seismically speaking. There was one major earthquake (7.0-7.9) off the west coast of South Island, New Zealand. The most destructive earthquake during this reporting period was in southern Iran on June 11 which caused fatalities and extensive damage. Peru also experienced a destructive earthquake on June 22 which caused fatalities and damage. In the United States, a number of earthquakes were experienced, but none caused significant damage. 

  7. ViscoSim Earthquake Simulator

    USGS Publications Warehouse

    Pollitz, Fred

    2012-01-01

    Synthetic seismicity simulations have been explored by the Southern California Earthquake Center (SCEC) Earthquake Simulators Group in order to guide long‐term forecasting efforts related to the Unified California Earthquake Rupture Forecast (Tullis et al., 2012a). In this study I describe the viscoelastic earthquake simulator (ViscoSim) of Pollitz (2009). Recapitulating to a large extent material previously presented by Pollitz (2009, 2011), I describe its implementation of synthetic ruptures and how it differs from other simulators being used by the group.

  8. Earthquake damage orientation to infer seismic parameters in archaeological sites and historical earthquakes

    NASA Astrophysics Data System (ADS)

    Martín-González, Fidel

    2018-01-01

    Studies that provide information on the seismic parameters and seismic sources of historical and archaeological seismic events are used to better evaluate the seismic hazard of a region. This is of special interest when no surface rupture is recorded or the seismogenic fault cannot be identified. The orientation pattern of the earthquake damage (ED) (e.g., fallen columns, dropped keystones) that affected architectural elements of cities after earthquakes has traditionally been used in historical and archaeoseismological studies to infer seismic parameters. However, depending on the authors, the parameters said to be obtainable from it are contradictory (proposals include the epicenter location, the orientation of the P-waves, the orientation of the compressional strain, and the fault kinematics), and some authors even question these relations with the earthquake damage. The earthquakes of Lorca in 2011, Christchurch in 2011 and Emilia Romagna in 2012 presented an opportunity to measure systematically a large number and wide variety of earthquake damage in historical buildings (the same structures that are used in historical and archaeological studies). The damage pattern orientation has been compared with modern instrumental data, which is not possible in historical and archaeoseismological studies. From measurements and quantification of the orientation patterns in the studied earthquakes, it is observed that there is a systematic pattern of the earthquake damage orientation (EDO) in the proximity of the seismic source (fault trace) (<10 km). The EDO in these earthquakes is normal to the fault trend (±15°). This orientation can be generated by a pulse of motion that, in the near-fault region, has a distinguishable acceleration normal to the fault due to the polarization of the S-waves. Therefore, the earthquake damage orientation could be used to estimate the seismogenic fault trend in historical earthquake studies where no instrumental data are available.
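    A minimal sketch of how a fault trend might be estimated from measured damage orientations, treating EDO as axial data (0-180°, so angles are doubled before averaging, the standard circular-statistics trick) and applying the "normal to the fault trend" relation from this record. The function name and sample angles are hypothetical.

```python
import math

def fault_trend_from_edo(damage_orientations_deg):
    """Estimate a fault trend (deg) from earthquake-damage orientations.

    Orientations are axial (0-180 deg): 10 deg and 190 deg are the same
    orientation, so each angle is doubled before taking the circular
    mean, then halved. Per the abstract, EDO is roughly normal to the
    fault trend, so 90 deg is added to the mean EDO.
    """
    s = sum(math.sin(math.radians(2 * a)) for a in damage_orientations_deg)
    c = sum(math.cos(math.radians(2 * a)) for a in damage_orientations_deg)
    mean_edo = (math.degrees(math.atan2(s, c)) / 2.0) % 180.0
    return (mean_edo + 90.0) % 180.0


# Hypothetical measurements: damage oriented roughly N35E
print(round(fault_trend_from_edo([30, 40, 35]), 1))  # → 125.0
```

    Doubling the angles is essential: a naive arithmetic mean of, say, 5° and 175° gives 90°, whereas these two nearly identical orientations should average to ~0°/180°.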

  9. On Earthquake Prediction in Japan

    PubMed Central

    UYEDA, Seiya

    2013-01-01

    Japan’s National Project for Earthquake Prediction has been conducted since 1965 without success. An earthquake prediction should be a short-term prediction based on observable physical phenomena or precursors. The main reason for the lack of success is the failure to capture precursors. Most of the financial resources and manpower of the National Project have been devoted to strengthening the seismograph networks, which are not generally effective for detecting precursors since many precursors are non-seismic. Precursor research has never been supported appropriately because the project has always been run by a group of seismologists who, in the present author’s view, are mainly interested in securing funds for seismology on the pretense of prediction. After the 1995 Kobe disaster, the project decided to give up short-term prediction, and this decision was further fortified by the 2011 M9 Tohoku mega-quake. On top of the National Project, there are other government projects, not formally but vaguely related to earthquake prediction, that consume many orders of magnitude more funds. They are also uninterested in short-term prediction. Financially, they are giants and the National Project is a dwarf. Thus, in Japan now, there is practically no support for short-term prediction research. Recently, however, substantial progress has been made in real short-term prediction by scientists of diverse disciplines. Some promising signs are also arising even from cooperation with the private sector. PMID:24213204

  10. Earthquakes, March-April, 1993

    USGS Publications Warehouse

    Person, Waverly J.

    1993-01-01

    Worldwide, only one major earthquake (7.0-7.9) occurred during this reporting period. This earthquake, a magnitude 7.2 shock, struck the Santa Cruz Islands region in the South Pacific on March 6. Earthquake-related deaths occurred in the Fiji Islands, China, and Peru.

  11. 2016 National Earthquake Conference

    Science.gov Websites

    Thank you to our Presenting Sponsor, California Earthquake Authority. What's New? What's Next? What's Your Role in Building a National Strategy? The National Earthquake Conference (NEC) brings together state government leaders, social science practitioners, and U.S. State and Territorial Earthquake Managers.

  12. Teaching medical management and operations engineering for systems-based practice to radiology residents.

    PubMed

    Brandon, Catherine J; Mullan, Patricia B

    2013-03-01

    To better prepare radiology residents for providing care within the context of the larger health care system, this study evaluated the feasibility and impact of a curriculum to enhance radiology residents' understanding and ability to apply concepts from medical management and industrial and operational engineering to systems-based practice problems in radiology practice. A multiprofessional team including radiology, medical education, and industrial and operational engineering professionals collaborated in developing a seven-module curriculum, including didactic lectures, interactive large-group analysis, and small-group discussions with case-based radiology examples, which illustrated real-life management issues and the roles physicians held. Residents and faculty participated in topic selection. Pre- and post-instruction formative assessments were administered, and results were shared with residents during teaching sessions. Attendance and participation in case-based scenario resolutions indicate the feasibility and impact of the interactive curriculum on residents' interest and ability to apply curricular concepts to systems-based practice in radiology. Paired t test analyses (P < .05) and effect sizes showed residents significantly increased their knowledge and ability to apply concepts to systems-based practice issues in radiology. Our iterative curriculum development and implementation process demonstrated need and support for a multiprofessional team approach to teach management and operational engineering concepts. Curriculum topics are congruent with Accreditation Council for Graduate Medical Education requirements for systems-based practice. The case-based curriculum using a mixed educational format of didactic lectures and small-group discussion and problem analysis could be adopted for other radiology programs, for both residents and continuing medical education applications. Copyright © 2013 AUR. Published by Elsevier Inc. All rights reserved.

  13. The music of earthquakes and Earthquake Quartet #1

    USGS Publications Warehouse

    Michael, Andrew J.

    2013-01-01

    Earthquake Quartet #1, my composition for voice, trombone, cello, and seismograms, is the intersection of listening to earthquakes as a seismologist and performing music as a trombonist. Along the way, I realized there is a close relationship between what I do as a scientist and what I do as a musician. A musician controls the source of the sound and the path it travels through their instrument in order to make sound waves that we hear as music. An earthquake is the source of waves that travel along a path through the earth until reaching us as shaking. It is almost as if the earth is a musician and people, including seismologists, are metaphorically listening and trying to understand what the music means.

  14. Earthquake triggering in southeast Africa following the 2012 Indian Ocean earthquake

    NASA Astrophysics Data System (ADS)

    Neves, Miguel; Custódio, Susana; Peng, Zhigang; Ayorinde, Adebayo

    2018-02-01

    In this paper we present evidence of earthquake dynamic triggering in southeast Africa. We analysed seismic waveforms recorded at 53 broad-band and short-period stations in order to identify possible increases in the rate of microearthquakes and tremor due to the passage of teleseismic waves generated by the Mw 8.6 2012 Indian Ocean earthquake. We found evidence of triggered local earthquakes and no evidence of triggered tremor in the region. We assessed the statistical significance of the increase in the number of local earthquakes using β-statistics. Statistically significant dynamic triggering of local earthquakes was observed at 7 of the 53 analysed stations. Two of these stations are located on the northeast coast of Madagascar and the other five are located in the Kaapvaal Craton, southern Africa. We found no evidence of dynamically triggered seismic activity at stations located near the structures of the East African Rift System. Hydrothermal activity exists close to the stations that recorded dynamic triggering; however, it also exists near the East African Rift System structures, where no triggering was observed. Our results suggest that factors other than tectonic regime and geothermalism alone are needed to explain the mechanisms that underlie earthquake triggering.
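    The β-statistic mentioned in this record is commonly computed following Matthews & Reasenberg (1988): it compares the event count in a post-trigger window against the count expected by chance under a binomial null model. A minimal sketch; the event counts and window lengths below are hypothetical, not the study's data.

```python
import math

def beta_statistic(n_total, n_window, t_total, t_window):
    """Beta statistic for a seismicity-rate change (Matthews & Reasenberg, 1988).

    n_total:  events observed over the whole period t_total
    n_window: events observed in the post-trigger window t_window
    Under the null hypothesis each event falls in the window with
    probability p = t_window / t_total, so the window count is binomial
    with mean n_total * p and variance n_total * p * (1 - p).
    """
    p = t_window / t_total
    expected = n_total * p
    variance = n_total * p * (1.0 - p)
    return (n_window - expected) / math.sqrt(variance)


# Hypothetical counts: 60 events in 30 days, 12 of them in the 1-day
# window after the teleseismic wave arrival
print(round(beta_statistic(60, 12, 30.0, 1.0), 2))  # → 7.19
```

    A common (though station-dependent) rule of thumb treats β ≥ 2 as a statistically significant rate increase; the hypothetical value above would therefore count as strong triggering.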

  15. Toward real-time regional earthquake simulation of Taiwan earthquakes

    NASA Astrophysics Data System (ADS)

    Lee, S.; Liu, Q.; Tromp, J.; Komatitsch, D.; Liang, W.; Huang, B.

    2013-12-01

    We developed a Real-time Online earthquake Simulation system (ROS) to simulate regional earthquakes in Taiwan. The ROS uses a centroid moment tensor solution of seismic events from a Real-time Moment Tensor monitoring system (RMT), which provides all the point source parameters including the event origin time, hypocentral location, moment magnitude and focal mechanism within 2 minutes after the occurrence of an earthquake. Then, all of the source parameters are automatically forwarded to the ROS to perform an earthquake simulation, which is based on a spectral-element method (SEM). We have improved SEM mesh quality by introducing a thin high-resolution mesh layer near the surface to accommodate steep and rapidly varying topography. The mesh for the shallow sedimentary basin is adjusted to reflect its complex geometry and sharp lateral velocity contrasts. The grid resolution at the surface is about 545 m, which is sufficient to resolve topography and tomography data for simulations accurate up to 1.0 Hz. The ROS is also an infrastructural service, making online earthquake simulation feasible. Users can conduct their own earthquake simulation by providing a set of source parameters through the ROS webpage. For visualization, a ShakeMovie and ShakeMap are produced during the simulation. The time needed for one event is roughly 3 minutes for a 70 sec ground motion simulation. The ROS is operated online at the Institute of Earth Sciences, Academia Sinica (http://ros.earth.sinica.edu.tw/). Our long-term goal for the ROS system is to contribute to public earth science outreach and to realize seismic ground motion prediction in real-time.

  16. Using Teaching Portfolios to Revise Curriculum and Explore Instructional Practices of Technology and Engineering Education Teachers

    ERIC Educational Resources Information Center

    Lomask, Michal; Crismond, David; Hacker, Michael

    2018-01-01

    This paper reports on the use of teaching portfolios to assist in curriculum revision and the exploration of instructional practices used by middle school technology and engineering education teachers. Two new middle school technology and engineering education units were developed through the Engineering for All (EfA) project. One EfA unit focused…

  17. Geophysical Anomalies and Earthquake Prediction

    NASA Astrophysics Data System (ADS)

    Jackson, D. D.

    2008-12-01

    Finding anomalies is easy. Predicting earthquakes convincingly from such anomalies is far from easy. Why? Why have so many beautiful geophysical abnormalities not led to successful prediction strategies? What is earthquake prediction? By my definition it is convincing information that an earthquake of specified size is temporarily much more likely than usual in a specific region for a specified time interval. We know a lot about normal earthquake behavior, including locations where earthquake rates are higher than elsewhere, with estimable rates and size distributions. We know that earthquakes have power law size distributions over large areas, that they cluster in time and space, and that aftershocks follow with power-law dependence on time. These relationships justify prudent protective measures and scientific investigation. Earthquake prediction would justify exceptional temporary measures well beyond those normal prudent actions. Convincing earthquake prediction would result from methods that have demonstrated many successes with few false alarms. Predicting earthquakes convincingly is difficult for several profound reasons. First, earthquakes start in tiny volumes at inaccessible depth. The power law size dependence means that tiny unobservable ones are frequent almost everywhere and occasionally grow to larger size. Thus prediction of important earthquakes is not about nucleation, but about identifying the conditions for growth. Second, earthquakes are complex. They derive their energy from stress, which is perniciously hard to estimate or model because it is nearly singular at the margins of cracks and faults. Physical properties vary from place to place, so the preparatory processes certainly vary as well. Thus establishing the needed track record for validation is very difficult, especially for large events with immense interval times in any one location. Third, the anomalies are generally complex as well. Electromagnetic anomalies in particular require

  18. The ShakeOut Earthquake Scenario - A Story That Southern Californians Are Writing

    USGS Publications Warehouse

    Perry, Suzanne; Cox, Dale; Jones, Lucile; Bernknopf, Richard; Goltz, James; Hudnut, Kenneth; Mileti, Dennis; Ponti, Daniel; Porter, Keith; Reichle, Michael; Seligson, Hope; Shoaf, Kimberley; Treiman, Jerry; Wein, Anne

    2008-01-01

    The question is not if but when southern California will be hit by a major earthquake - one so damaging that it will permanently change lives and livelihoods in the region. How severe the changes will be depends on the actions that individuals, schools, businesses, organizations, communities, and governments take to get ready. To help prepare for this event, scientists of the U.S. Geological Survey (USGS) have changed the way that earthquake scenarios are done, uniting a multidisciplinary team that spans an unprecedented number of specialties. The team includes the California Geological Survey, Southern California Earthquake Center, and nearly 200 other partners in government, academia, emergency response, and industry, working to understand the long-term impacts of an enormous earthquake on the complicated social and economic interactions that sustain southern California society. This project, the ShakeOut Scenario, has applied the best current scientific understanding to identify what can be done now to avoid an earthquake catastrophe. More information on the science behind this project will be available in The ShakeOut Scenario (USGS Open-File Report 2008-1150; http://pubs.usgs.gov/of/2008/1150/). The 'what if?' earthquake modeled in the ShakeOut Scenario is a magnitude 7.8 on the southern San Andreas Fault. Geologists selected the details of this hypothetical earthquake by considering the amount of stored strain on that part of the fault with the greatest risk of imminent rupture. From this, seismologists and computer scientists modeled the ground shaking that would occur in this earthquake. Engineers and other professionals used the shaking to produce a realistic picture of this earthquake's damage to buildings, roads, pipelines, and other infrastructure. From these damages, social scientists projected casualties, emergency response, and the impact of the scenario earthquake on southern California's economy and society. The earthquake, its damages, and

  19. Data Management Practices and Perspectives of Atmospheric Scientists and Engineering Faculty

    ERIC Educational Resources Information Center

    Wiley, Christie; Mischo, William H.

    2016-01-01

    This article analyzes 21 in-depth interviews of engineering and atmospheric science faculty at the University of Illinois Urbana-Champaign (UIUC) to determine faculty data management practices and needs within the context of their research activities. A detailed literature review of previous large-scale and institutional surveys and interviews…

  20. Contextual Shaping of Student Design Practices: The Role of Constraint in First-Year Engineering Design

    NASA Astrophysics Data System (ADS)

    Goncher, Andrea M.

    Research on engineering design is a core area of concern within engineering education, and a fundamental understanding of how engineering students approach and undertake design is necessary in order to develop effective design models and pedagogies. This dissertation contributes to scholarship on engineering design by addressing a critical, but as yet underexplored, problem: how does the context in which students design shape their design practices? Using a qualitative study comprising video data of design sessions, focus group interviews with students, and archives of their design work, this research explored how design decisions and actions are shaped by context, specifically the context of higher education. To develop a theoretical explanation for observed behavior, this study used the nested structuration framework proposed by Perlow, Gittell, & Katz (2004). This framework explicated how teamwork is shaped by mutually reinforcing relationships at the individual, organizational, and institutional levels. I appropriated this framework to look specifically at how engineering students working on a course-related design project identify constraints that guide their design and how these constraints emerge as students interact while working on the project. I first identified and characterized the parameters associated with the design project from the student perspective and then, through multi-case studies of four design teams, I looked at the role these parameters play in student design practices. This qualitative investigation of first-year engineering student design teams revealed mutual and interconnected relationships between students and the organizations and institutions that they are a part of. In addition to contributing to research on engineering design, this work provides guidelines and practices to help design educators develop more effective design projects by incorporating constraints that enable effective design and learning. Moreover, I found

  1. Surface rupture of the 2002 Denali fault, Alaska, earthquake and comparison with other strike-slip ruptures

    USGS Publications Warehouse

    Haeussler, Peter J.; Schwartz, D.P.; Dawson, T.E.; Stenner, Heidi D.; Lienkaemper, J.J.; Cinti, F.; Montone, Paola; Sherrod, B.; Craw, P.

    2004-01-01

    On 3 November 2002, an M7.9 earthquake produced 340 km of surface rupture on the Denali and two related faults in Alaska. The rupture proceeded from west to east and began with a 40-km-long break on a previously unknown thrust fault. Estimates of surface slip on this thrust are 3-6 m. Next came the principal surface break, along ~218 km of the Denali fault. Right-lateral offsets averaged around 5 m and increased eastward to a maximum of nearly 9 m. The fault also ruptured beneath the trans-Alaska oil pipeline, which withstood almost 6 m of lateral offset. Finally, slip turned southeastward onto the Totschunda fault. Right-lateral offsets are up to 3 m, and the surface rupture is about 76 km long. This three-part rupture ranks among the longest strike-slip events of the past two centuries. The earthquake is typical when compared to other large earthquakes on major intracontinental strike-slip faults. © 2004, Earthquake Engineering Research Institute.

  2. Report of Earthquake Drills with Experiences of Ground Motion in Childcare for Young Children, Japan

    NASA Astrophysics Data System (ADS)

    Yamada, N.

    2013-12-01

    The Great East Japan Earthquake of 2011 became an opportunity to raise awareness of earthquake and tsunami disaster prevention, and improving disaster-prevention education has since been emphasized. This influence extends spatially across Japan, and it is equally important to sustain the education over time. Although fire and earthquake drills are common forms of disaster-prevention education in Japan, they often consist only of children and teachers moving from the school building to the outside, and attention tends to focus on how little time the drill takes. Complementary education in cooperation with experts such as firefighters is practiced, but its effects have not been sufficiently verified, and research on it has not advanced. Improvement and development of disaster-prevention education are expected in the future, but many problems remain. Our target is the construction and use of teaching material about what to do during strong ground motion, which may be experienced anywhere in Japan. One of our products is a handmade shaking table used as a teaching tool for learning how to protect one's body from injury during strong motion; it emphasizes simplicity over high-fidelity reproduction of earthquake ground motions. We aim to support disaster-prevention education not only for young children but also for school staff and parents. In this report, focusing on avoiding injury during earthquake ground motion and adopting play activities, we show an example framework for earthquake disaster-prevention childcare through virtual experience.
This presentation has a discussion as a practice study with

  3. Earthquake triggering by seismic waves following the landers and hector mine earthquakes

    USGS Publications Warehouse

    Gomberg, J.; Reasenberg, P.A.; Bodin, P.; Harris, R.A.

    2001-01-01

    The proximity and similarity of the 1992, magnitude 7.3 Landers and 1999, magnitude 7.1 Hector Mine earthquakes in California permit testing of earthquake triggering hypotheses not previously possible. The Hector Mine earthquake confirmed inferences that transient, oscillatory 'dynamic' deformations radiated as seismic waves can trigger seismicity rate increases, as proposed for the Landers earthquake [1-6]. Here we quantify the spatial and temporal patterns of the seismicity rate changes [7]. The seismicity rate increase was to the north for the Landers earthquake and primarily to the south for the Hector Mine earthquake. We suggest that rupture directivity results in elevated dynamic deformations north and south of the Landers and Hector Mine faults, respectively, as evident in the asymmetry of the recorded seismic velocity fields. Both dynamic and static stress changes seem important for triggering in the near field with dynamic stress changes dominating at greater distances. Peak seismic velocities recorded for each earthquake suggest the existence of, and place bounds on, dynamic triggering thresholds. These thresholds vary from a few tenths to a few MPa in most places, depend on local conditions, and exceed inferred static thresholds by more than an order of magnitude. At some sites, the onset of triggering was delayed until after the dynamic deformations subsided. Physical mechanisms consistent with all these observations may be similar to those that give rise to liquefaction or cyclic fatigue.

  4. Analysis of pre-earthquake ionospheric anomalies before the global M = 7.0+ earthquakes in 2010

    NASA Astrophysics Data System (ADS)

    Yao, Y. B.; Chen, P.; Zhang, S.; Chen, J. J.; Yan, F.; Peng, W. F.

    2012-03-01

    The pre-earthquake ionospheric anomalies that occurred before the global M = 7.0+ earthquakes in 2010 are investigated using the total electron content (TEC) from the global ionosphere map (GIM). We analyze the possible causes of the ionospheric anomalies based on the space environment and magnetic field status. Results show that some anomalies are related to the earthquakes. By analyzing the time of occurrence, duration, and spatial distribution of these ionospheric anomalies, a number of new conclusions are drawn, as follows: earthquake-related ionospheric anomalies are not bound to appear; both positive and negative anomalies are likely to occur; and the earthquake-related ionospheric anomalies discussed in the current study occurred 0-2 days before the associated earthquakes and in the afternoon to sunset (i.e. between 12:00 and 20:00 local time). Pre-earthquake ionospheric anomalies occur mainly in areas near the epicenter. However, the maximum affected area in the ionosphere does not coincide with the vertical projection of the epicenter of the subsequent earthquake. The directions deviating from the epicenters do not follow a fixed rule. The corresponding ionospheric effects can also be observed in the magnetically conjugated region. However, the probability of occurrence and the extent of the anomalies in the magnetically conjugated region are smaller than those near the epicenter. Deep-focus earthquakes may also exhibit very significant pre-earthquake ionospheric anomalies.

  5. Modeling, Forecasting and Mitigating Extreme Earthquakes

    NASA Astrophysics Data System (ADS)

    Ismail-Zadeh, A.; Le Mouel, J.; Soloviev, A.

    2012-12-01

    Recent earthquake disasters highlighted the importance of multi- and trans-disciplinary studies of earthquake risk. A major component of earthquake disaster risk analysis is hazards research, which should cover not only a traditional assessment of ground shaking, but also studies of geodetic, paleoseismic, geomagnetic, hydrological, deep drilling and other geophysical and geological observations together with comprehensive modeling of earthquakes and forecasting extreme events. Extreme earthquakes (large magnitude and rare events) are manifestations of complex behavior of the lithosphere structured as a hierarchical system of blocks of different sizes. Understanding of physics and dynamics of the extreme events comes from observations, measurements and modeling. A quantitative approach to simulate earthquakes in models of fault dynamics will be presented. The models reproduce basic features of the observed seismicity (e.g., the frequency-magnitude relationship, clustering of earthquakes, occurrence of extreme seismic events). They provide a link between geodynamic processes and seismicity, allow studying extreme events, influence of fault network properties on seismic patterns and seismic cycles, and assist, in a broader sense, in earthquake forecast modeling. Some aspects of predictability of large earthquakes (how well can large earthquakes be predicted today?) will be also discussed along with possibilities in mitigation of earthquake disasters (e.g., on 'inverse' forensic investigations of earthquake disasters).

  6. Extending earthquakes' reach through cascading.

    PubMed

    Marsan, David; Lengliné, Olivier

    2008-02-22

    Earthquakes, whatever their size, can trigger other earthquakes. Mainshocks cause aftershocks to occur, which in turn activate their own local aftershock sequences, resulting in a cascade of triggering that extends the reach of the initial mainshock. A long-lasting difficulty is to determine which earthquakes are connected, either directly or indirectly. Here we show that this causal structure can be found probabilistically, with no a priori model nor parameterization. Large regional earthquakes are found to have a short direct influence in comparison to the overall aftershock sequence duration. Relative to these large mainshocks, small earthquakes collectively have a greater effect on triggering. Hence, cascade triggering is a key component in earthquake interactions.

  7. Fractal and chaotic laws on seismic dissipated energy in an energy system of engineering structures

    NASA Astrophysics Data System (ADS)

    Cui, Yu-Hong; Nie, Yong-An; Yan, Zong-Da; Wu, Guo-You

    1998-09-01

    Fractal and chaotic laws of engineering structures are discussed in this paper; that is, the intrinsic character of the dynamic systems built from the seismic dissipated energy intensity E_d and the intensity of the seismic dissipated energy moment I_e is analyzed. Based on the chaotic and fractal character of the E_d and I_e dynamic systems, three kinds of approximate dynamic models are rebuilt in turn: an index autoregressive model, a threshold autoregressive model, and a local-approximate autoregressive model. The laws, essential features, and systematic error of the evolutionary behavior of I_e are explained, and its short-term predictability and long-term probabilistic behavior are analyzed. These results may be valuable for earthquake-resistant theory and analysis methods for practical engineering structures.

  8. Earthquake Catalogue of the Caucasus

    NASA Astrophysics Data System (ADS)

    Godoladze, T.; Gok, R.; Tvaradze, N.; Tumanova, N.; Gunia, I.; Onur, T.

    2016-12-01

    The Caucasus has a documented historical catalog stretching back to the beginning of the Christian era. Most of the largest historical earthquakes prior to the 19th century are assumed to have occurred on active faults of the Greater Caucasus. Important earthquakes include the Samtskhe earthquake of 1283 (Ms˜7.0, Io=9); the Lechkhumi-Svaneti earthquake of 1350 (Ms˜7.0, Io=9); and the Alaverdi earthquake of 1742 (Ms˜6.8, Io=9). Two significant historical earthquakes that may have occurred within the Javakheti plateau in the Lesser Caucasus are the Tmogvi earthquake of 1088 (Ms˜6.5, Io=9) and the Akhalkalaki earthquake of 1899 (Ms˜6.3, Io=8-9). Large earthquakes that occurred in the Caucasus within the period of instrumental observation are: Gori 1920; Tabatskuri 1940; Chkhalta 1963; the Racha earthquake of 1991 (Ms=7.0), the largest event ever recorded in the region; the Barisakho earthquake of 1992 (M=6.5); and the Spitak earthquake of 1988 (Ms=6.9, 100 km south of Tbilisi), which killed over 50,000 people in Armenia. Recently, permanent broadband stations have been deployed across the region as part of the various national networks (Georgia (˜25 stations), Azerbaijan (˜35 stations), Armenia (˜14 stations)). The data from the last 10 years of observation provide an opportunity to perform modern, fundamental scientific investigations. In order to improve seismic data quality, a catalog of all instrumentally recorded earthquakes has been compiled by the IES (Institute of Earth Sciences/NSMC, Ilia State University) in the framework of the regional joint project (Armenia, Azerbaijan, Georgia, Turkey, USA) "Probabilistic Seismic Hazard Assessment (PSHA) in the Caucasus". The catalogue consists of more than 80,000 events. First arrivals of each earthquake of Mw >= 4.0 have been carefully examined. To reduce calculation errors, we corrected arrivals from the seismic records. We improved locations of the events and recalculated moment magnitudes in order to obtain unified magnitude

  9. Earthquakes, July-August 1992

    USGS Publications Warehouse

    Person, W.J.

    1992-01-01

    There were two major earthquakes (7.0≤M<8.0) during this reporting period. A magnitude 7.5 earthquake occurred in Kyrgyzstan on August 19 and a magnitude 7.0 quake struck the Ascension Island region on August 28. In southern California, aftershocks of the magnitude 7.6 earthquake on June 28, 1992, continued. One of these aftershocks caused damage and injuries, and at least one other aftershock caused additional damage. Earthquake-related fatalities were reported in Kyrgyzstan and Pakistan.

  10. The 7.9 Denali Fault Earthquake: Damage to Structures and Lifelines

    NASA Astrophysics Data System (ADS)

    Cox, T.; Hreinsdöttir, S.; Larsen, C.; Estes, S.

    2002-12-01

    In the early afternoon of Sunday, November 3rd, the residents of many interior Alaska towns were shaken up by a magnitude 7.9 earthquake. The shaking lasted an average of three minutes, and when it stopped, nearly 300 km of the Denali Fault had ruptured. In the hours that followed, the Alaska Earthquake Information Center (AEIC) fielded reports of structural damage from Cantwell to Tok and other earthquake effects as far away as Louisiana. Upon investigation, the most severe effects were found in the village of Mentasta, where basic utilities were interrupted and the school and several houses suffered major damage. Almost 3000 reports submitted to a community internet intensity map show a maximum Mercalli intensity of VIII along the eastern end of the rupture area. The Richardson and Parks Highways, two main north-south thoroughfares in Alaska, both buckled and split as a result of the fault rupture. Traffic was stopped for a few hours while repairs were made. Between the Richardson Highway and the Tok Cutoff, a section of the Glenn Highway that connects Tok and Glennallen, the maximum offsets on the Denali Fault were observed. Designed to withstand a magnitude 8.5 earthquake at the Denali Fault crossing, the 800-mile-long Trans-Alaska Pipeline suffered relatively minor damage. According to Alyeska Pipeline Service Company press releases, the pipeline was shut down shortly after the earthquake occurred. Repairs to pipeline supports and engineering evaluations began immediately thereafter, and oil began flowing through the pipeline on Thursday, November 7th. Through it all, the AEIC has collected and archived many photographs, emails, and eyewitness accounts of those who experienced the destruction firsthand. We will detail the effects that the M7.9 Denali Fault earthquake had from near and far.

  11. Emergency mapping and information management during Nepal Earthquake 2015 - Challenges and lesson learned

    NASA Astrophysics Data System (ADS)

    Joshi, G.; Gurung, D. R.

    2016-12-01

    A powerful 7.8-magnitude earthquake struck Nepal at 06:11 UTC on 25 April 2015. The mainshock and several subsequent aftershocks made it the deadliest earthquake in the recent history of Nepal. In total, about 9,000 people died, 22,300 people were injured, and the lives of eight million people, almost one-third of the population of Nepal, were affected. The event led to a massive campaign to gather data and information on damage and loss using remote sensing, field inspection, and community surveys. Information on the distribution of relief materials was another important domain of information, necessary for equitable relief distribution. Pre- and post-earthquake high-resolution satellite images helped in damage-area assessment and mapping. Many national and international agencies became active in generating information to fill the information vacuum. The challenges included data-access bottlenecks due to the lack of good IT infrastructure; inconsistent products due to the absence of standard mapping guidelines; and dissemination challenges due to the absence of standard operating protocols and a single information gateway. These challenges negated opportunities offered by improved earth-observation data availability, the increasing engagement of volunteers in emergency mapping, and centralized emergency coordination practice. This paper highlights critical practical challenges encountered during emergency mapping and information management during the earthquake in Nepal. There is a great need to address such challenges in order to effectively use the technological leverage that recent advancements in space science, IT, and the mapping domain provide.

  12. Earthquakes: Recurrence and Interoccurrence Times

    NASA Astrophysics Data System (ADS)

    Abaimov, S. G.; Turcotte, D. L.; Shcherbakov, R.; Rundle, J. B.; Yakovlev, G.; Goltz, C.; Newman, W. I.

    2008-04-01

    The purpose of this paper is to discuss the statistical distributions of recurrence times of earthquakes. Recurrence times are the time intervals between successive earthquakes at a specified location on a specified fault. Although a number of statistical distributions have been proposed for recurrence times, we argue in favor of the Weibull distribution. The Weibull distribution is the only distribution that has a scale-invariant hazard function. We consider three sets of characteristic earthquakes on the San Andreas fault: (1) The Parkfield earthquakes, (2) the sequence of earthquakes identified by paleoseismic studies at the Wrightwood site, and (3) an example of a sequence of micro-repeating earthquakes at a site near San Juan Bautista. In each case we make a comparison with the applicable Weibull distribution. The number of earthquakes in each of these sequences is too small to make definitive conclusions. To overcome this difficulty we consider a sequence of earthquakes obtained from a one million year “Virtual California” simulation of San Andreas earthquakes. Very good agreement with a Weibull distribution is found. We also obtain recurrence statistics for two other model studies. The first is a modified forest-fire model and the second is a slider-block model. In both cases good agreements with Weibull distributions are obtained. Our conclusion is that the Weibull distribution is the preferred distribution for estimating the risk of future earthquakes on the San Andreas fault and elsewhere.
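As a quick numerical companion to the Weibull discussion above, the sketch below draws synthetic recurrence intervals and recovers the shape parameter β by a standard fixed-point maximum-likelihood iteration. This is an illustration only: the shape and scale values are arbitrary placeholders, not the Parkfield, Wrightwood, or Virtual California estimates.

```python
import math
import random

def weibull_mle_shape(samples, iters=200):
    """Estimate the Weibull shape parameter (beta) by fixed-point MLE iteration."""
    logs = [math.log(x) for x in samples]
    mean_log = sum(logs) / len(logs)
    k = 1.0
    for _ in range(iters):
        xk = [x ** k for x in samples]
        weighted_log = sum(xi * li for xi, li in zip(xk, logs)) / sum(xk)
        k = 1.0 / (weighted_log - mean_log)
    return k

def weibull_mle_scale(samples, k):
    """MLE scale parameter (eta) given the shape k."""
    return (sum(x ** k for x in samples) / len(samples)) ** (1.0 / k)

random.seed(42)
true_beta, true_eta = 1.8, 150.0   # hypothetical quasi-periodic recurrence, in years
intervals = [random.weibullvariate(true_eta, true_beta) for _ in range(2000)]

beta = weibull_mle_shape(intervals)
eta = weibull_mle_scale(intervals, beta)
print(f"estimated beta = {beta:.2f}, eta = {eta:.1f} years")

# The Weibull hazard rate h(t) = (beta/eta) * (t/eta)**(beta-1) is a pure power
# law in t -- the scale-invariant hazard property the abstract refers to.
def hazard(t, beta, eta):
    return (beta / eta) * (t / eta) ** (beta - 1)
```

A shape estimate above 1 indicates quasi-periodic (hazard grows with elapsed time); β = 1 recovers the memoryless exponential case.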

  13. Teachers' Integration of Scientific and Engineering Practices in Primary Classrooms

    NASA Astrophysics Data System (ADS)

    Merritt, Eileen G.; Chiu, Jennie; Peters-Burton, Erin; Bell, Randy

    2017-06-01

    The Next-Generation Science Standards (NGSS) challenge primary teachers and students to work and think like scientists and engineers as they strive to understand complex concepts. Teachers and teacher educators can leverage what is already known about inquiry teaching as they plan instruction to help students meet the new standards. This cross-case analysis of a multiple case study examined teacher practices in the context of a semester-long professional development course for elementary teachers. We reviewed lessons and teacher reflections, examining how kindergarten and first grade teachers incorporated NGSS scientific and engineering practices during inquiry-based instruction. We found that most of the teachers worked with their students on asking questions; planning and carrying out investigations; analyzing and interpreting data, using mathematics and computational thinking; and obtaining, evaluating and communicating information. Teachers faced challenges in supporting students in developing their own questions that could be investigated and using data collection strategies that aligned with students' development of number sense concepts. Also, some teachers overemphasized the scientific method and lacked clarity in how they elicited and responded to student predictions. Discussion focuses on teacher supports that will be needed as states transition to NGSS.

  14. Natural Time and Nowcasting Earthquakes: Are Large Global Earthquakes Temporally Clustered?

    NASA Astrophysics Data System (ADS)

    Luginbuhl, Molly; Rundle, John B.; Turcotte, Donald L.

    2018-02-01

    The objective of this paper is to analyze the temporal clustering of large global earthquakes with respect to natural time, or interevent count, as opposed to regular clock time. To do this, we use two techniques: (1) nowcasting, a new method of statistically classifying seismicity and seismic risk, and (2) time series analysis of interevent counts. We chose the sequences of Mλ ≥ 7.0 and Mλ ≥ 8.0 earthquakes from the global centroid moment tensor (CMT) catalog from 2004 to 2016 for analysis. A significant number of these earthquakes will be aftershocks of the largest events, but no satisfactory method of declustering the aftershocks in clock time is available. A major advantage of using natural time is that it eliminates the need for declustering aftershocks. The event count we utilize is the number of small earthquakes that occur between large earthquakes. The small earthquake magnitude is chosen to be as small as possible, such that the catalog is still complete based on the Gutenberg-Richter statistics. For the CMT catalog, starting in 2004, we found the completeness magnitude to be Mσ ≥ 5.1. For the nowcasting method, the cumulative probability distribution of these interevent counts is obtained. We quantify the distribution using the exponent, β, of the best-fitting Weibull distribution; β = 1 for a random (exponential) distribution. We considered 197 earthquakes with Mλ ≥ 7.0 and found β = 0.83 ± 0.08. We considered 15 earthquakes with Mλ ≥ 8.0, but this number was considered too small to generate a meaningful distribution. For comparison, we generated synthetic catalogs of earthquakes that occur randomly with the Gutenberg-Richter frequency-magnitude statistics. We considered a synthetic catalog of 1.97 × 10^5 Mλ ≥ 7.0 earthquakes and found β = 0.99 ± 0.01. The random catalog converted to natural time was also random. We then generated 1.5 × 10^4 synthetic catalogs with 197 Mλ ≥ 7.0 in each catalog and
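The natural-time bookkeeping described above, counting small events between successive large events, can be sketched with a toy catalog. The magnitudes below are invented for illustration; only the thresholds mirror the abstract's Mλ ≥ 7.0 and Mσ ≥ 5.1.

```python
# Hypothetical magnitude list in time order (toy data, not the CMT catalog)
catalog = [5.3, 5.1, 7.2, 5.5, 5.2, 5.8, 7.4, 5.1, 5.1, 5.3, 5.6, 8.1, 5.2]

def natural_time_counts(mags, m_large=7.0, m_small=5.1):
    """Number of small (m >= m_small) events between successive large events."""
    counts, n, started = [], 0, False
    for m in mags:
        if m >= m_large:
            if started:            # close the interval opened by the last large event
                counts.append(n)
            started, n = True, 0   # first large event just opens the counting window
        elif m >= m_small:
            n += 1                 # below-completeness events would be ignored here
    return counts

print(natural_time_counts(catalog))
```

A Weibull fit to the resulting counts (rather than to clock-time intervals) would then give the β exponent the authors report; counts before the first and after the last large event are discarded because they belong to open intervals.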

  15. Detection of change points in underlying earthquake rates, with application to global mega-earthquakes

    NASA Astrophysics Data System (ADS)

    Touati, Sarah; Naylor, Mark; Main, Ian

    2016-02-01

    The recent spate of mega-earthquakes since 2004 has led to speculation of an underlying change in the global `background' rate of large events. At a regional scale, detecting changes in background rate is also an important practical problem for operational forecasting and risk calculation, for example due to volcanic processes, seismicity induced by fluid injection or withdrawal, or due to redistribution of Coulomb stress after natural large events. Here we examine the general problem of detecting changes in background rate in earthquake catalogues with and without correlated events, for the first time using the Bayes factor as a discriminant for models of varying complexity. First we use synthetic Poisson (purely random) and Epidemic-Type Aftershock Sequence (ETAS) models (which also allow for earthquake triggering) to test the effectiveness of many standard methods of addressing this question. These fall into two classes: those that evaluate the relative likelihood of different models, for example using Information Criteria or the Bayes Factor; and those that evaluate the probability of the observations (including extreme events or clusters of events) under a single null hypothesis, for example by applying the Kolmogorov-Smirnov and `runs' tests, and a variety of Z-score tests. The results demonstrate that the effectiveness among these tests varies widely. Information Criteria worked at least as well as the more computationally expensive Bayes factor method, and the Kolmogorov-Smirnov and runs tests proved to be relatively ineffective in reliably detecting a change point. We then apply the methods tested to events at different thresholds above magnitude M ≥ 7 in the global earthquake catalogue since 1918, after first declustering the catalogue. This is most effectively done by removing likely correlated events using a much lower magnitude threshold (M ≥ 5), where triggering is much more obvious. We find no strong evidence that the background rate of large
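One of the model-comparison ideas above, weighing a one-rate Poisson model against a two-rate model with a change point using an Information Criterion, can be sketched on synthetic data. This is a minimal illustration under assumed rates, not the authors' implementation, and it uses AIC rather than the Bayes factor.

```python
import math
import random

def poisson_sample(lam):
    """Poisson draw via Knuth's product-of-uniforms method (stdlib-only)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

def poisson_loglik(counts, rate):
    """Log-likelihood of i.i.d. Poisson counts under a common rate."""
    return sum(c * math.log(rate) - rate - math.lgamma(c + 1) for c in counts)

def aic(loglik, n_params):
    return 2 * n_params - 2 * loglik

def best_changepoint(counts):
    """Scan interior split points; return (split, AIC) of the best two-rate model."""
    best = None
    for s in range(2, len(counts) - 2):
        left, right = counts[:s], counts[s:]
        if sum(left) == 0 or sum(right) == 0:   # avoid log(0) on empty segments
            continue
        ll = (poisson_loglik(left, sum(left) / len(left))
              + poisson_loglik(right, sum(right) / len(right)))
        if best is None or aic(ll, 2) < best[1]:
            best = (s, aic(ll, 2))
    return best

random.seed(1)
# Hypothetical yearly counts: background rate jumps from 2 to 5 events/yr at year 60
counts = ([poisson_sample(2.0) for _ in range(60)]
          + [poisson_sample(5.0) for _ in range(60)])

aic_one = aic(poisson_loglik(counts, sum(counts) / len(counts)), 1)
split, aic_two = best_changepoint(counts)
print(f"best split at year {split}; two-rate AIC {aic_two:.1f} vs one-rate AIC {aic_one:.1f}")
```

A lower AIC for the two-rate model favors the change-point hypothesis, with the factor of 2 per extra parameter penalizing the added complexity, which is the same trade-off the Bayes factor formalizes.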

  16. Structure-specific scalar intensity measures for near-source and ordinary earthquake ground motions

    USGS Publications Warehouse

    Luco, N.; Cornell, C.A.

    2007-01-01

    Introduced in this paper are several alternative ground-motion intensity measures (IMs) that are intended for use in assessing the seismic performance of a structure at a site susceptible to near-source and/or ordinary ground motions. A comparison of such IMs is facilitated by defining the "efficiency" and "sufficiency" of an IM, both of which are criteria necessary for ensuring the accuracy of the structural performance assessment. The efficiency and sufficiency of each alternative IM, which are quantified via (i) nonlinear dynamic analyses of the structure under a suite of earthquake records and (ii) linear regression analysis, are demonstrated for the drift response of three different moderate- to long-period buildings subjected to suites of ordinary and of near-source earthquake records. One of the alternative IMs in particular is found to be relatively efficient and sufficient for the range of buildings considered and for both the near-source and ordinary ground motions. © 2007, Earthquake Engineering Research Institute.

  17. Earthquake and Tsunami booklet based on two Indonesia earthquakes

    NASA Astrophysics Data System (ADS)

    Hayashi, Y.; Aci, M.

    2014-12-01

    Many destructive earthquakes occurred in Indonesia during the last decade. These experiences are important lessons for people around the world who live in earthquake and tsunami countries. We are collecting the testimonies of tsunami survivors to clarify successful evacuation processes and the characteristic physical behavior of tsunamis near the coast. We studied two tsunami events: the 2004 Indian Ocean tsunami and the 2010 Mentawai slow-earthquake tsunami. Many videos and photographs were taken during the 2004 Indian Ocean tsunami disaster; nevertheless, these covered only a few restricted points, and the tsunami behavior elsewhere was unknown. In this study, we tried to collect extensive information about tsunami behavior, not only at many places but also over a wide time range after the strong shaking. In the Mentawai case, the earthquake occurred at night, so there are no impressive photographs. To collect detailed information about evacuation from the tsunamis, we devised an interview method that involves drawing pictures of the tsunami experience from the scenes of the victims' stories. In the 2004 Aceh case, none of the survivors knew about tsunami phenomena: because there had been no large tsunamigenic earthquakes in the Sumatra region for one hundred years, the public had no knowledge of tsunamis. This situation was much improved in the 2010 Mentawai case. TV programs and NGO or governmental public-education programs about tsunami evacuation are now widespread in Indonesia, and many people have fundamental knowledge of earthquake and tsunami disasters. We made a drill book based on the victims' stories, with paintings of impressive scenes from the two events, and used it in a disaster-education event for a school committee in West Java. About 80% of the students and teachers evaluated the contents of the drill book as useful for correct understanding.

  18. The MW 7.0 Haiti Earthquake of January 12, 2010: USGS/EERI Advance Reconnaissance Team Report

    USGS Publications Warehouse

    Eberhard, Marc O.; Baldridge, Steven; Marshall, Justin; Mooney, Walter; Rix, Glenn J.

    2010-01-01

    Executive Summary: A field reconnaissance in Haiti by a five-member team with expertise in seismology and earthquake engineering has revealed a number of factors that led to catastrophic losses of life and property during the January 12, 2010, Mw 7.0 earthquake. The field study was conducted from January 26 to February 3, 2010, and included investigations in Port-au-Prince and the heavily damaged communities to the west, including Leogane, Grand Goave, Petite Goave, and Oliver. Seismology: Despite recent seismic quiescence, Haiti has suffered similar devastating earthquakes in the historical past (1701, 1751, 1770 and 1860). Even with this knowledge of historical seismicity, Haiti had no seismograph stations operating during the main earthquake, so it is impossible to estimate accurately the intensity of ground motions. Nonetheless, the wide range of buildings damaged by the January 12, 2010, earthquake suggests that the ground motions contained seismic energy over a wide range of frequencies. Another earthquake of similar magnitude could strike at any time on the eastern end of the Enriquillo Fault, directly to the south of Port-au-Prince. Reconstruction must take this hazard into account. The four portable seismographs installed by the team recorded a series of small aftershocks. As expected, the ground motions recorded at a hard-rock site contained a greater proportion of high frequencies than the motions recorded at a soil site. Two of the stations continue to monitor seismic activity. A thorough field investigation of the mapped Enriquillo Fault south of the city of Leogane failed to find any evidence of surface faulting. This led the team to conclude that the earthquake was unlikely to have produced any surface rupture in the study area. Geotechnical Aspects: Soil liquefaction, landslides and rockslides in cut slopes, and road embankment failures contributed to extensive damage in Port-au-Prince and elsewhere.
A lack of detailed knowledge of the physical conditions of the

  19. Earthquakes: hydrogeochemical precursors

    USGS Publications Warehouse

    Ingebritsen, Steven E.; Manga, Michael

    2014-01-01

    Earthquake prediction is a long-sought goal. Changes in groundwater chemistry before earthquakes in Iceland highlight a potential hydrogeochemical precursor, but such signals must be evaluated in the context of long-term, multiparametric data sets.

  20. Earthquakes, September-October 1993

    USGS Publications Warehouse

    Person, W.J.

    1993-01-01

    The fatalities in the United States were caused by two earthquakes in southern Oregon on September 21. These earthquakes, both with magnitude 6.0 and separated in time by about 2 hrs, led to the deaths of two people. One of these deaths was apparently due to a heart attack induced by the earthquake.

  1. Children's Ideas about Earthquakes

    ERIC Educational Resources Information Center

    Simsek, Canan Lacin

    2007-01-01

    Earthquake, a natural disaster, is among the fundamental problems of many countries. If people know how to protect themselves from earthquake and arrange their life styles in compliance with this, damage they will suffer will reduce to that extent. In particular, a good training regarding earthquake to be received in primary schools is considered…

  2. Excel, Earthquakes, and Moneyball: exploring Cascadia earthquake probabilities using spreadsheets and baseball analogies

    NASA Astrophysics Data System (ADS)

    Campbell, M. R.; Salditch, L.; Brooks, E. M.; Stein, S.; Spencer, B. D.

    2017-12-01

    Much recent media attention focuses on Cascadia's earthquake hazard. A widely cited magazine article starts: "An earthquake will destroy a sizable portion of the coastal Northwest. The question is when." Stories include statements like "a massive earthquake is overdue," "in the next 50 years, there is a 1-in-10 chance a 'really big one' will erupt," or "the odds of the big Cascadia earthquake happening in the next fifty years are roughly one in three." These lead students to ask where the quoted probabilities come from and what they mean. These probability estimates involve two primary choices: what data are used to describe when past earthquakes happened and what models are used to forecast when future earthquakes will happen. The data come from a 10,000-year record of large paleoearthquakes compiled from subsidence data on land and turbidites, offshore deposits recording submarine slope failure. Earthquakes seem to have happened in clusters of four or five events, separated by gaps. Earthquakes within a cluster occur more frequently and regularly than in the full record. Hence the next earthquake is more likely if we assume that we are in the recent cluster that started about 1700 years ago than if we assume the cluster is over. Students can explore how changing assumptions drastically changes probability estimates using easy-to-write and display spreadsheets, like those shown below. Insight can also come from baseball analogies. The cluster issue is like deciding whether to assume that a hitter's performance in the next game is better described by his lifetime record or by the past few games, since he may be hitting unusually well or in a slump. The other big choice is whether to assume that the probability of an earthquake is constant with time, or is small immediately after one occurs and then grows with time. This is like whether to assume that a player's performance is the same from year to year, or changes over their career. Thus saying "the chance of
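The two modeling choices described above can be made concrete in a few lines, mirroring what the spreadsheets compute. The sketch below contrasts a time-independent (Poisson) estimate with a time-dependent lognormal renewal estimate; all recurrence parameters are illustrative placeholders, not the paleoseismic values behind the quoted probabilities.

```python
import math

def poisson_prob(mean_recurrence, window):
    """Time-independent (Poisson) probability of at least one event in `window` years."""
    return 1.0 - math.exp(-window / mean_recurrence)

def lognormal_cdf(t, mu, sigma):
    return 0.5 * (1.0 + math.erf((math.log(t) - mu) / (sigma * math.sqrt(2.0))))

def conditional_prob(elapsed, window, mu, sigma):
    """Renewal-model P(event within `window` | no event for `elapsed` years)."""
    f_now = lognormal_cdf(elapsed, mu, sigma)
    return (lognormal_cdf(elapsed + window, mu, sigma) - f_now) / (1.0 - f_now)

# Choice 1: which data? Illustrative mean recurrence intervals (placeholders):
full_record_mean = 500.0   # mean interval over the whole paleoearthquake record
cluster_mean = 300.0       # shorter mean interval if we assume we are in a cluster

for label, mean_r in [("full record", full_record_mean), ("within cluster", cluster_mean)]:
    print(f"{label}: P(event in 50 yr) = {poisson_prob(mean_r, 50):.3f}")

# Choice 2: which model? A lognormal renewal model makes the probability grow
# with time since the last event (parameters again hypothetical):
mu, sigma = math.log(450.0), 0.5
for elapsed in (100, 320):
    p = conditional_prob(elapsed, 50, mu, sigma)
    print(f"quiet for {elapsed} yr: P(event in next 50 yr) = {p:.3f}")
```

Swapping the data window or the model changes the answer severalfold, which is exactly the point students can explore by editing the cells of a spreadsheet version of the same arithmetic.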

  3. Practical and Efficient Searching in Proteomics: A Cross Engine Comparison.

    PubMed

    Paulo, Joao A

    2013-10-01

    Analysis of large datasets produced by mass spectrometry-based proteomics relies on database search algorithms to sequence peptides and identify proteins. Several such scoring methods are available, each based on different statistical foundations and thereby not producing identical results. Here, the aim is to compare peptide and protein identifications using multiple search engines and examine the additional proteins gained by increasing the number of technical replicate analyses. A HeLa whole cell lysate was analyzed on an Orbitrap mass spectrometer for 10 technical replicates. The data were combined and searched using Mascot, SEQUEST, and Andromeda. Comparisons were made of peptide and protein identifications among the search engines. In addition, searches using each engine were performed with an incrementing number of technical replicates. The number and identity of peptides and proteins differed across search engines. For all three search engines, the differences in protein identifications were greater than the differences in peptide identifications, indicating that the major source of the disparity may be at the protein-inference grouping level. The data also revealed that analysis of 2 technical replicates can increase protein identifications by up to 10-15%, while a third replicate results in an additional 4-5%. The data emphasize two practical methods of increasing the robustness of mass spectrometry data analysis: 1) using multiple search engines can expand the number of identified proteins (union) and validate protein identifications (intersection), and 2) analysis of 2 or 3 technical replicates can substantially expand protein identifications. Moreover, additional information can be extracted from a dataset by performing database searching with different engines and performing technical repeats, which requires no additional sample preparation and effectively utilizes research time and effort.
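
    The union/intersection idea above is plain set algebra. A minimal sketch with hypothetical UniProt-style accessions (real runs would use Mascot, SEQUEST, and Andromeda result files):

```python
# Hypothetical protein accessions; placeholders for real search-engine output.
mascot    = {"P04637", "P38398", "Q00653", "P01112"}
sequest   = {"P04637", "P38398", "P01112", "O15111"}
andromeda = {"P04637", "Q00653", "P01112", "O15111"}

union        = mascot | sequest | andromeda   # expanded list of identifications
intersection = mascot & sequest & andromeda   # cross-validated core set
print(len(union), len(intersection))          # -> 5 2
```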

  4. An Earthquake Shake Map Routine with Low Cost Accelerometers: Preliminary Results

    NASA Astrophysics Data System (ADS)

    Alcik, H. A.; Tanircan, G.; Kaya, Y.

    2015-12-01

    Vast amounts of high quality strong motion data are indispensable inputs for analyses in the field of geotechnical and earthquake engineering; however, the high cost of installing strong motion systems is the biggest obstacle to worldwide dissemination. In recent years, MEMS-based (micro-electro-mechanical systems) accelerometers have been used in seismological research-oriented studies as well as earthquake engineering oriented projects, basically due to the precision obtained in downsized instruments. In this research, our primary goal is to ensure the usage of these low-cost instruments in the creation of shake maps immediately after a strong earthquake. The second goal is to develop software that will automatically process the real-time data coming from the rapid response network and create a shake map. For those purposes, four MEMS sensors have been set up to deliver real-time data. Data transmission is done through 3G modems. A subroutine was coded in assembler language and embedded into the operating system of each instrument to create MiniSEED files with packages of 1 second instead of 512-byte packages. The Matlab-based software calculates the strong motion (SM) parameters every second, and they are compared with the user-defined thresholds. A voting system embedded in the software captures the event if the total vote exceeds the threshold. The user interface of the software enables users to monitor the calculated SM parameters either in a table or in a graph (Figure 1). A small scale and affordable rapid response network is created using four MEMS sensors, and the functionality of the software has been tested and validated using shake table tests. The entire system is tested together with a reference sensor under real strong ground motion recordings as well as series of sine waves with varying amplitude and frequency.
The successful realization of this software allowed us to set up a test network at Tekirdağ Province, the closest coastal point to
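
    The threshold-voting trigger described above might look like the following sketch. The parameter names, weights, and threshold values are assumptions for illustration; the abstract only states that user-defined thresholds feed a voting system:

```python
# Sketch of a threshold-voting trigger for strong-motion (SM) parameters.
# Names, weights, and thresholds below are illustrative assumptions.
THRESHOLDS = {"pga": 0.02, "pgv": 0.5, "intensity": 3.0}
WEIGHTS    = {"pga": 2,    "pgv": 2,   "intensity": 1}
VOTES_NEEDED = 3

def capture_event(sm_params):
    """Declare an event when the weighted votes of exceeded parameters suffice."""
    votes = sum(WEIGHTS[name] for name, value in sm_params.items()
                if value >= THRESHOLDS[name])
    return votes >= VOTES_NEEDED

print(capture_event({"pga": 0.05, "pgv": 0.6, "intensity": 2.0}))  # -> True
```

    Requiring several parameters to vote, rather than triggering on any single exceedance, makes the capture robust against a spike in one channel.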

  5. Do Earthquakes Shake Stock Markets?

    PubMed

    Ferreira, Susana; Karali, Berna

    2015-01-01

    This paper examines how major earthquakes affected the returns and volatility of aggregate stock market indices in thirty-five financial markets over the last twenty years. Results show that global financial markets are resilient to shocks caused by earthquakes even if these are domestic. Our analysis reveals that, in a few instances, some macroeconomic variables and earthquake characteristics (gross domestic product per capita, trade openness, bilateral trade flows, earthquake magnitude, a tsunami indicator, distance to the epicenter, and number of fatalities) mediate the impact of earthquakes on stock market returns, resulting in a zero net effect. However, the influence of these variables is market-specific, indicating no systematic pattern across global capital markets. Results also demonstrate that stock market volatility is unaffected by earthquakes, except for Japan.

  6. Toward real-time regional earthquake simulation II: Real-time Online earthquake Simulation (ROS) of Taiwan earthquakes

    NASA Astrophysics Data System (ADS)

    Lee, Shiann-Jong; Liu, Qinya; Tromp, Jeroen; Komatitsch, Dimitri; Liang, Wen-Tzong; Huang, Bor-Shouh

    2014-06-01

    We developed a Real-time Online earthquake Simulation system (ROS) to simulate regional earthquakes in Taiwan. The ROS uses a centroid moment tensor solution of seismic events from a Real-time Moment Tensor monitoring system (RMT), which provides all the point source parameters, including the event origin time, hypocentral location, moment magnitude and focal mechanism, within 2 min after the occurrence of an earthquake. All of the source parameters are then automatically forwarded to the ROS to perform an earthquake simulation, which is based on a spectral-element method (SEM). A new island-wide, high-resolution SEM mesh model was developed for the whole of Taiwan in this study. We have improved SEM mesh quality by introducing a thin high-resolution mesh layer near the surface to accommodate steep and rapidly varying topography. The mesh for the shallow sedimentary basin is adjusted to reflect its complex geometry and sharp lateral velocity contrasts. The grid resolution at the surface is about 545 m, which is sufficient to resolve topography and tomography data for simulations accurate up to 1.0 Hz. The ROS is also an infrastructural service, making online earthquake simulation feasible. Users can conduct their own earthquake simulation by providing a set of source parameters through the ROS webpage. For visualization, a ShakeMovie and ShakeMap are produced during the simulation. The time needed for one event is roughly 3 min for a 70 s ground motion simulation. The ROS is operated online at the Institute of Earth Sciences, Academia Sinica (http://ros.earth.sinica.edu.tw/). Our long-term goal for the ROS system is to contribute to public earth science outreach and to realize seismic ground motion prediction in real time.

  7. Sedimentary Signatures of Submarine Earthquakes: Deciphering the Extent of Sediment Remobilization from the 2011 Tohoku Earthquake and Tsunami and 2010 Haiti Earthquake

    NASA Astrophysics Data System (ADS)

    McHugh, C. M.; Seeber, L.; Moernaut, J.; Strasser, M.; Kanamatsu, T.; Ikehara, K.; Bopp, R.; Mustaque, S.; Usami, K.; Schwestermann, T.; Kioka, A.; Moore, L. M.

    2017-12-01

    The 2004 Sumatra-Andaman Mw9.3 and the 2011 Tohoku (Japan) Mw9.0 earthquakes and tsunamis were huge geological events with major societal consequences. Both were along subduction boundaries and ruptured portions of these boundaries that had been deemed incapable of such events. Submarine strike-slip earthquakes, such as the 2010 Mw7.0 in Haiti, are smaller but may be closer to population centers and can be similarly catastrophic. Both classes of earthquakes remobilize sediment and leave distinct signatures in the geologic record through a wide range of processes that depend on both environment and earthquake characteristics. Understanding them has the potential of greatly expanding the record of past earthquakes, which is critical for geohazard analysis. Recent events offer precious ground truth about the earthquakes, and short-lived radioisotopes offer invaluable tools to identify sediments they remobilized. For the 2011 Mw9 Japan earthquake, these tools document the spatial extent of remobilized sediment from water depths of 626 m on the forearc slope to trench depths of 8000 m. Subbottom profiles, multibeam bathymetry and 40 piston cores collected by the R/V Natsushima and R/V Sonne expeditions to the Japan Trench document multiple turbidites and high-density flows. Core tops enriched in excess 210Pb, 137Cs and 134Cs reveal sediment deposited by the 2011 Tohoku earthquake and tsunami. The thickest deposits (2 m) were documented on a mid-slope terrace and in the trench (4000-8000 m). Sediment was deposited on some terraces (600-3000 m), but shed from the steep forearc slope (3000-4000 m). The 2010 Haiti mainshock ruptured along the southern flank of Canal du Sud and triggered multiple nearshore sediment failures, generated turbidity currents and stirred fine sediment into suspension throughout this basin. A tsunami was modeled to stem from both sediment failures and tectonics.
Remobilized sediment was tracked with short-lived radioisotopes from the nearshore, slope, in fault basins including the

  8. WGCEP Historical California Earthquake Catalog

    USGS Publications Warehouse

    Felzer, Karen R.; Cao, Tianqing

    2008-01-01

    This appendix provides an earthquake catalog for California and the surrounding area. Our goal is to provide a listing for all known M > 5.5 earthquakes that occurred from 1850-1932 and all known M > 4.0 earthquakes that occurred from 1932-2006 within the region of 31.0 to 43.0 degrees North and -126.0 to -114.0 degrees West. Pre-1932 listings rest largely on non-instrumental data, from before the Northern California network was online. Some earthquakes from 1900-1932, and particularly from 1910-1932, are also based on instrumental readings, but the quality of the instrumental record and the resulting analysis are much less precise than for later listings. A partial exception is for some of the largest earthquakes, such as the San Francisco earthquake of April 18, 1906, for which global teleseismic records (Wald et al. 1993) and geodetic measurements (Thatcher et al. 1906) have been used to help determine magnitudes.
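
    The stated inclusion rules (magnitude thresholds by era, plus the geographic box) reduce to a simple filter; a sketch, with edge handling at the era boundaries assumed:

```python
def in_catalog(year, mag, lat, lon):
    """Apply the appendix's stated inclusion rules (sketch; boundary handling assumed)."""
    if not (31.0 <= lat <= 43.0 and -126.0 <= lon <= -114.0):
        return False          # outside the catalog region
    if 1850 <= year < 1932:
        return mag > 5.5      # pre-network era: larger events only
    if 1932 <= year <= 2006:
        return mag > 4.0      # instrumental era: lower threshold
    return False

print(in_catalog(1906, 7.8, 37.75, -122.55))  # 1906 San Francisco -> True
```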

  9. Examining the Extent to Which Select Teacher Preparation Experiences Inform Technology and Engineering Educators' Teaching of Science Content and Practices

    ERIC Educational Resources Information Center

    Love, Tyler S.

    2015-01-01

    With the recent release of the "Next Generation Science Standards" (NGSS) (NGSS Lead States, 2014b) science educators were expected to teach engineering content and practices within their curricula. However, technology and engineering (T&E) educators have been expected to teach content and practices from engineering and other…

  10. ShakeCast: Automating and improving the use of ShakeMap for post-earthquake decision-making and response

    USGS Publications Warehouse

    Wald, D.; Lin, K.-W.; Porter, K.; Turner, Loren

    2008-01-01

    When a potentially damaging earthquake occurs, utility and other lifeline managers, emergency responders, and other critical users have an urgent need for information about the impact on their particular facilities so they can make appropriate decisions and take quick actions to ensure safety and restore system functionality. ShakeMap, a tool used to portray the extent of potentially damaging shaking following an earthquake, on its own can be useful for emergency response, loss estimation, and public information. However, to take full advantage of the potential of ShakeMap, we introduce ShakeCast. ShakeCast facilitates the complicated assessment of potential damage to a user's widely distributed facilities by comparing the complex shaking distribution with the potentially highly variable damageability of their inventory to provide a simple, hierarchical list and maps of structures or facilities most likely impacted. ShakeCast is a freely available, post-earthquake situational awareness application that automatically retrieves earthquake shaking data from ShakeMap, compares intensity measures against users' facilities, sends notifications of potential damage to responsible parties, and generates facility damage maps and other Web-based products for both public and private emergency managers and responders. © 2008, Earthquake Engineering Research Institute.
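
    The compare-and-rank step described above can be sketched as follows. The facility inventory and per-facility damage-state thresholds are hypothetical illustrations, not ShakeCast's actual data model:

```python
# Hypothetical facilities with damage-state intensity thresholds (MMI).
facilities = [
    {"name": "Bridge A", "mmi": 7.2, "yellow": 6.0, "red": 7.0},
    {"name": "Depot B",  "mmi": 5.1, "yellow": 6.5, "red": 8.0},
    {"name": "Bridge C", "mmi": 6.4, "yellow": 6.0, "red": 7.5},
]

def damage_state(facility):
    """Map the ShakeMap intensity at a facility to a simple alert level."""
    if facility["mmi"] >= facility["red"]:
        return "red"
    if facility["mmi"] >= facility["yellow"]:
        return "yellow"
    return "green"

# Hierarchical list: most strongly shaken facilities first.
ranked = sorted(facilities, key=lambda f: f["mmi"], reverse=True)
report = [(f["name"], damage_state(f)) for f in ranked]
print(report)
```

    Keeping the thresholds per facility is what lets the same shaking map yield different alerts for structures of different damageability.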

  11. Earthquakes, September-October 1978

    USGS Publications Warehouse

    Person, W.J.

    1979-01-01

    The months of September and October were somewhat quiet, seismically speaking. One major earthquake, magnitude (M) 7.7, occurred in Iran on September 16. In Germany, a magnitude 5.0 earthquake caused damage and considerable alarm to many people in parts of that country. In the United States, the largest earthquake occurred along the California-Nevada border region.

  12. Earthquakes, March-April 1991

    USGS Publications Warehouse

    Person, W.J.

    1992-01-01

    Two major earthquakes (7.0-7.9) occurred during this reporting period: a magnitude 7.6 in Costa Rica on April 22 and a magnitude 7.0 in the USSR on April 29. Destructive earthquakes hit northern Peru on April 4 and 5. There were no destructive earthquakes in the United States during this period.

  13. Can We Predict Earthquakes?

    ScienceCinema

    Johnson, Paul

    2018-01-16

    The only thing we know for sure about earthquakes is that one will happen again very soon. Earthquakes pose a vital yet puzzling set of research questions that have confounded scientists for decades, but new ways of looking at seismic information and innovative laboratory experiments are offering tantalizing clues to what triggers earthquakes — and when.

  14. Earthquake and Schools. [Videotape].

    ERIC Educational Resources Information Center

    Federal Emergency Management Agency, Washington, DC.

    Designing schools to make them more earthquake resistant and protect children from the catastrophic collapse of the school building is discussed in this videotape. It reveals that 44 of the 50 U.S. states are vulnerable to earthquake, but most schools are structurally unprepared to take on the stresses that earthquakes exert. The cost to the…

  15. Earthquakes, September-October 1991

    USGS Publications Warehouse

    Person, W.J.

    1992-01-01

    There were two major earthquakes (7.0-7.9) during this reporting period. The first was in the Solomon Islands on October 14 and the second was in India on October 19. Earthquake-related deaths were reported in Guatemala and India. There were no significant earthquakes in the United States during the period covered in this report.

  16. Practical Elements in Danish Engineering Programmes, Including the European Project Semester

    ERIC Educational Resources Information Center

    Hansen, Jorgen

    2012-01-01

    In Denmark, all engineering programmes in HE have practical elements; for instance, at Bachelor's level, an internship is an integrated part of the programme. Furthermore, Denmark has a long-established tradition of problem-based and project-organized learning, and a large part of students' projects, including their final projects, is done in…

  17. Probabilistic tsunami inundation map based on stochastic earthquake source model: A demonstration case in Macau, the South China Sea

    NASA Astrophysics Data System (ADS)

    Li, Linlin; Switzer, Adam D.; Wang, Yu; Chan, Chung-Han; Qiu, Qiang; Weiss, Robert

    2017-04-01

    Current tsunami inundation maps are commonly generated using deterministic scenarios, either for real-time forecasting or based on hypothetical "worst-case" events. Such maps are mainly used for emergency response and evacuation planning and do not include return-period information. In practice, however, probabilistic tsunami inundation maps are required in a wide variety of applications, such as land-use planning, engineering design and insurance purposes. In this study, we present a method to develop probabilistic tsunami inundation maps using a stochastic earthquake source model. To demonstrate the methodology, we take Macau, a coastal city in the South China Sea, as an example. Two major advances of this method are that it incorporates the most updated information on seismic tsunamigenic sources along the Manila megathrust, and that it integrates a stochastic source model into a Monte Carlo-type simulation in which a broad range of slip distribution patterns is generated for a large number of synthetic earthquake events. By aggregating the large number of inundation simulation results, we analyze the uncertainties associated with variability in earthquake rupture location and slip distribution. We also explore how tsunami hazard evolves in Macau in the context of sea level rise. Our results suggest Macau faces moderate tsunami risk due to its low-lying elevation, extensive land reclamation, high coastal population and major infrastructure density. Macau consists of four districts: Macau Peninsula, Taipa Island, Coloane Island and the Cotai strip. Of these, Macau Peninsula is the most vulnerable to tsunami due to its low elevation and its exposure to direct waves and refracted waves from the offshore region and reflected waves from the mainland. Earthquakes with magnitude larger than Mw8.0 in the northern Manila trench would likely cause hazardous inundation in Macau. Using a stochastic source model, we are able to derive a spread of potential tsunami impacts for earthquakes
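
    The Monte Carlo structure of such a study can be sketched in miniature. Everything below, the magnitude distribution, the rupture-offset term, and the stand-in `synthetic_flow_depth()` function, is an illustrative assumption replacing the paper's stochastic slip models and full tsunami simulations:

```python
import random

random.seed(0)

def synthetic_flow_depth(mw, rupture_offset):
    """Toy flow depth (m) at one site for a synthetic event (NOT a physical model)."""
    return max(0.0, 0.8 * (mw - 7.5) + 0.3 * random.random()
               - 0.1 * abs(rupture_offset))

def exceedance_prob(threshold_m, n=2000):
    """Fraction of synthetic events whose flow depth exceeds the threshold."""
    hits = 0
    for _ in range(n):
        mw = random.uniform(8.0, 9.0)        # sampled event magnitude
        offset = random.uniform(-1.0, 1.0)   # rupture-location variability
        if synthetic_flow_depth(mw, offset) > threshold_m:
            hits += 1
    return hits / n

print(exceedance_prob(0.5))
```

    Aggregating many randomized sources is what converts a single deterministic scenario into a hazard curve with uncertainty attached.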

  18. Earthquake Forecasting System in Italy

    NASA Astrophysics Data System (ADS)

    Falcone, G.; Marzocchi, W.; Murru, M.; Taroni, M.; Faenza, L.

    2017-12-01

    In Italy, after the 2009 L'Aquila earthquake, a procedure was developed for gathering and disseminating authoritative information about the time dependence of seismic hazard to help communities prepare for a potentially destructive earthquake. The most striking time dependency of the earthquake occurrence process is time clustering, which is particularly pronounced in time windows of days and weeks. The Operational Earthquake Forecasting (OEF) system developed at the Seismic Hazard Center (Centro di Pericolosità Sismica, CPS) of the Istituto Nazionale di Geofisica e Vulcanologia (INGV) is the authoritative source of seismic hazard information for Italian Civil Protection. The philosophy of the system rests on a few basic concepts: transparency, reproducibility, and testability. In particular, the transparent, reproducible, and testable earthquake forecasting system developed at CPS is based on ensemble modeling and on a rigorous testing phase. This testing phase is carried out according to the guidance proposed by the Collaboratory for the Study of Earthquake Predictability (CSEP, an international infrastructure aimed at quantitatively evaluating earthquake prediction and forecast models through purely prospective and reproducible experiments). In the OEF system, the two most popular short-term models were used: the Epidemic-Type Aftershock Sequences (ETAS) and the Short-Term Earthquake Probabilities (STEP). Here, we report the results from OEF's 24-hour earthquake forecasting during the main phases of the 2016-2017 sequence in the Central Apennines (Italy).
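
    The short-term clustering that ETAS and STEP exploit is rooted in the Omori-Utsu aftershock decay. A minimal sketch of that rate law follows; the parameter values are illustrative, not those used by the INGV OEF system:

```python
# Omori-Utsu aftershock rate and its time integral (illustrative parameters).
def omori_rate(t_days, K=50.0, c=0.05, p=1.1):
    """Aftershock rate (events/day) at time t after a mainshock."""
    return K / (t_days + c) ** p

def expected_events(t1, t2, steps=2000):
    """Trapezoid-rule integral of the rate over [t1, t2] days."""
    dt = (t2 - t1) / steps
    return sum(0.5 * (omori_rate(t1 + i * dt) + omori_rate(t1 + (i + 1) * dt)) * dt
               for i in range(steps))

print(expected_events(0.0, 1.0), expected_events(1.0, 2.0))
```

    The rapid decay of the rate within days of a mainshock is why 24-hour forecast windows are where time-dependent hazard differs most from the long-term average.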

  19. Rapid estimation of the economic consequences of global earthquakes

    USGS Publications Warehouse

    Jaiswal, Kishor; Wald, David J.

    2011-01-01

    The U.S. Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system, operational since mid-2007, rapidly estimates the most affected locations and the population exposure at different levels of shaking intensity. The PAGER system has significantly improved the way aid agencies determine the scale of response needed in the aftermath of an earthquake. For example, the PAGER exposure estimates provided reasonably accurate assessments of the scale and spatial extent of the damage and losses following the 2008 Wenchuan earthquake (Mw 7.9) in China, the 2009 L'Aquila earthquake (Mw 6.3) in Italy, the 2010 Haiti earthquake (Mw 7.0), and the 2010 Chile earthquake (Mw 8.8). Nevertheless, some engineering and seismological expertise is often required to digest PAGER's exposure estimate and turn it into estimated fatalities and economic losses. This has been the focus of PAGER's most recent development. With the new loss-estimation component of the PAGER system, it is now possible to produce rapid estimates of expected fatalities for global earthquakes (Jaiswal and others, 2009). While an estimate of earthquake fatalities is a fundamental indicator of potential human consequences in developing countries (for example, Iran, Pakistan, Haiti, Peru, and many others), economic consequences often drive the responses in much of the developed world (for example, New Zealand, the United States, and Chile), where the improved structural behavior of seismically resistant buildings significantly reduces earthquake casualties. Rapid availability of estimates of both fatalities and economic losses can be a valuable resource. The total time needed to determine the actual scope of an earthquake disaster and to respond effectively varies from country to country. It can take days or sometimes weeks before the damage and consequences of a disaster can be understood both socially and economically. The objective of the U.S.
Geological Survey's PAGER system is

  20. A post-Tohoku earthquake review of earthquake probabilities in the Southern Kanto District, Japan

    NASA Astrophysics Data System (ADS)

    Somerville, Paul G.

    2014-12-01

    The 2011 Mw 9.0 Tohoku earthquake generated an aftershock sequence that affected a large part of northern Honshu, and has given rise to widely divergent forecasts of changes in earthquake occurrence probabilities in northern Honshu. The objective of this review is to assess these forecasts as they relate to potential changes in the occurrence probabilities of damaging earthquakes in the Kanto Region. It is generally agreed that the 2011 Mw 9.0 Tohoku earthquake increased the stress on faults in the southern Kanto district. Toda and Stein (Geophys Res Lett 686, 40: doi:10.1002, 2013) further conclude that the probability of earthquakes in the Kanto Corridor has increased by a factor of 2.5 for the time period 11 March 2013 to 10 March 2018. Estimates of earthquake probabilities in a wider region of the Southern Kanto District by Nanjo et al. (Geophys J Int, doi:10.1093, 2013) indicate that any increase in the probability of earthquakes is insignificant in this larger region. Uchida et al. (Earth Planet Sci Lett 374: 81-91, 2013) conclude that the Philippine Sea plate extends well north of the northern margin of Tokyo Bay, inconsistent with the Kanto Fragment hypothesis of Toda et al. (Nat Geosci, 1:1-6, 2008), which attributes deep earthquakes in this region, which they term the Kanto Corridor, to a broken fragment of the Pacific plate. The results of Uchida and Matsuzawa (J Geophys Res 115:B07309, 2013) support the conclusion that fault creep in southern Kanto may be slowly relaxing the stress increase caused by the Tohoku earthquake without causing more large earthquakes. Stress transfer calculations indicate a large stress transfer to the Off Boso Segment as a result of the 2011 Tohoku earthquake. However, Ozawa et al. (J Geophys Res 117:B07404, 2012) used onshore GPS measurements to infer large post-Tohoku creep on the plate interface in the Off-Boso region, and Uchida and Matsuzawa (ibid.) measured similar large creep off the Boso

  1. An integrated earthquake early warning system and its performance at schools in Taiwan

    NASA Astrophysics Data System (ADS)

    Wu, Bing-Ru; Hsiao, Nai-Chi; Lin, Pei-Yang; Hsu, Ting-Yu; Chen, Chiou-Yun; Huang, Shieh-Kung; Chiang, Hung-Wei

    2017-01-01

    An earthquake early warning (EEW) system integrating regional and onsite approaches was installed at nine demonstration stations in several districts of Taiwan to take advantage of both approaches. The system performance was evaluated in a 3-year experiment at schools, which experienced five major earthquakes during this period. The blind zone of warning was effectively reduced by the integrated EEW system. The intensities predicted by the EEW demonstration stations showed acceptable accuracy compared to field observations. The operational experience from an earthquake event proved that students could calmly carry out the correct actions before the seismic wave arrived, using the warning time provided by the EEW system. Through successful operation in practice, the integrated EEW system was verified as an effective tool for disaster prevention at schools.

  2. Do Earthquakes Shake Stock Markets?

    PubMed Central

    2015-01-01

    This paper examines how major earthquakes affected the returns and volatility of aggregate stock market indices in thirty-five financial markets over the last twenty years. Results show that global financial markets are resilient to shocks caused by earthquakes even if these are domestic. Our analysis reveals that, in a few instances, some macroeconomic variables and earthquake characteristics (gross domestic product per capita, trade openness, bilateral trade flows, earthquake magnitude, a tsunami indicator, distance to the epicenter, and number of fatalities) mediate the impact of earthquakes on stock market returns, resulting in a zero net effect. However, the influence of these variables is market-specific, indicating no systematic pattern across global capital markets. Results also demonstrate that stock market volatility is unaffected by earthquakes, except for Japan. PMID:26197482

  3. Knowledge, attitude and practices for design for safety: A study on civil & structural engineers.

    PubMed

    Goh, Yang Miang; Chua, Sijie

    2016-08-01

    Design for safety (DfS) (also known as prevention through design, safe design, and Construction (Design and Management)) promotes early consideration of safety and health hazards during the design phase of a construction project. With early intervention, hazards can be more effectively eliminated or controlled, leading to safer worksites and construction processes. DfS is practiced in many countries, including Australia, the UK, and Singapore. In Singapore, the Ministry of Manpower enacted the DfS Regulations in July 2015, which will be enforced from August 2016 onwards. Due to the critical role of civil and structural (C&S) engineers during design and construction, the DfS knowledge, attitude and practices (KAP) of C&S engineers have a significant impact on the successful implementation of DfS. Thus, this study aims to explore the DfS KAP of C&S engineers so as to guide further research in measuring and improving the DfS KAP of designers. During the study, it was found that there is a lack of KAP studies in construction management. Therefore, this study also aims to provide useful lessons for future applications of the KAP framework in construction management research. A questionnaire was developed to assess the DfS KAP of C&S engineers. The responses provided by 43 C&S engineers were analyzed. In addition, interviews with experienced construction professionals were carried out to further understand perceptions of DfS and related issues. The results suggest that C&S engineers are supportive of DfS, but the level of DfS knowledge and practices needs to be improved. More DfS guidelines and training should be made available to the engineers. To ensure that DfS can be implemented successfully, there is a need to study the contractual arrangements between clients and designers and the effectiveness of different implementation approaches for the DfS process.
The questionnaire and findings in this study provided the foundation for a baseline survey with larger sample size, which is

  4. Continuing Megathrust Earthquake Potential in northern Chile after the 2014 Iquique Earthquake Sequence

    NASA Astrophysics Data System (ADS)

    Hayes, G. P.; Herman, M. W.; Barnhart, W. D.; Furlong, K. P.; Riquelme, S.; Benz, H.; Bergman, E.; Barrientos, S. E.; Earle, P. S.; Samsonov, S. V.

    2014-12-01

    The seismic gap theory, which identifies regions of elevated hazard based on a lack of recent seismicity in comparison to other portions of a fault, has successfully explained past earthquakes and is useful for qualitatively describing where future large earthquakes might occur. A large earthquake had been expected in the subduction zone adjacent to northern Chile, which until recently had not ruptured in a megathrust earthquake since a M~8.8 event in 1877. On April 1, 2014, a M 8.2 earthquake occurred within this northern Chile seismic gap, offshore of the city of Iquique; the size and spatial extent of the rupture indicate it was not the earthquake that had been anticipated. Here, we present a rapid assessment of the seismotectonics of the March-April 2014 seismic sequence offshore northern Chile, including analyses of earthquake (fore- and aftershock) relocations, moment tensors, finite fault models, moment deficit calculations, and cumulative Coulomb stress transfer calculations over the duration of the sequence. This ensemble of information allows us to place the current sequence within the context of historic seismicity in the region, and to assess areas of remaining and/or elevated hazard. Our results indicate that while accumulated strain has been released for a portion of the northern Chile seismic gap, significant sections have not ruptured in almost 150 years. These observations suggest that large-to-great megathrust earthquakes will occur north and south of the 2014 Iquique sequence sooner than might be expected had the 2014 events ruptured the entire seismic gap.

  5. A teleseismic study of the 2002 Denali fault, Alaska, earthquake and implications for rapid strong-motion estimation

    USGS Publications Warehouse

    Ji, C.; Helmberger, D.V.; Wald, D.J.

    2004-01-01

    Slip histories for the 2002 M7.9 Denali fault, Alaska, earthquake are derived rapidly from global teleseismic waveform data. Three models, developed in successive phases, progressively improve the fit to waveform data and the recovery of rupture details. In the first model (Phase I), analogous to an automated solution, a simple fault plane is fixed based on the preliminary Harvard Centroid Moment Tensor mechanism and the epicenter provided by the Preliminary Determination of Epicenters. This model is then updated (Phase II) by implementing a more realistic fault geometry inferred from Digital Elevation Model topography, and further refined (Phase III) by using the calibrated P-wave and SH-wave arrival times derived from modeling of the nearby 2002 M6.7 Nenana Mountain earthquake. These models are used to predict the peak ground velocity and the shaking intensity field in the fault vicinity. The procedure to estimate local strong motion could be automated and used for global real-time earthquake shaking and damage assessment. © 2004, Earthquake Engineering Research Institute.

  6. The Mw 7.7 Bhuj earthquake: Global lessons for earthquake hazard in intra-plate regions

    USGS Publications Warehouse

    Schweig, E.; Gomberg, J.; Petersen, M.; Ellis, M.; Bodin, P.; Mayrose, L.; Rastogi, B.K.

    2003-01-01

    The Mw 7.7 Bhuj earthquake occurred in the Kachchh District of the State of Gujarat, India on 26 January 2001, and was one of the most damaging intraplate earthquakes ever recorded. This earthquake is in many ways similar to the three great New Madrid earthquakes that occurred in the central United States in 1811-1812. An Indo-US team is studying the similarities and differences of these sequences in order to learn lessons for earthquake hazard in intraplate regions. Herein we present some preliminary conclusions from that study. Both the Kutch and New Madrid regions have a rift-type geotectonic setting. In both regions the strain rates are of the order of 10^-9/yr, and attenuation of seismic waves, as inferred from observations of intensity and liquefaction, is low. These strain rates predict recurrence intervals for Bhuj- or New Madrid-sized earthquakes of several thousand years or more. In contrast, intervals estimated from paleoseismic studies and from other independent data are significantly shorter, probably hundreds of years. All these observations together may suggest that earthquakes relax high ambient stresses that are locally concentrated by rheologic heterogeneities, rather than loading by plate-tectonic forces. The latter model generally underlies basic assumptions made in earthquake hazard assessment, namely that the long-term average rate of energy released by earthquakes is determined by the tectonic loading rate, which thus implies an inherent average periodicity of earthquake occurrence. Interpreting the observations in terms of the former model therefore may require re-examining the basic assumptions of hazard assessment.

  7. Earthquake precursors: spatial-temporal gravity changes before the great earthquakes in the Sichuan-Yunnan area

    NASA Astrophysics Data System (ADS)

    Zhu, Yi-Qing; Liang, Wei-Feng; Zhang, Song

    2018-01-01

    Using multiple-scale mobile gravity data in the Sichuan-Yunnan area, we systematically analyzed the relationships between spatial-temporal gravity changes and the 2014 Ludian, Yunnan Province Ms6.5 earthquake and the 2014 Kangding Ms6.3, 2013 Lushan Ms7.0, and 2008 Wenchuan Ms8.0 earthquakes in Sichuan Province. Our main results are as follows. (1) Before the occurrence of large earthquakes, gravity anomalies occur in a large area around the epicenters. The directions of gravity change gradient belts usually agree roughly with the directions of the main fault zones of the study area. Such gravity changes might reflect the increase of crustal stress, as well as the significant active tectonic movements and surface deformations along fault zones, during the period of gestation of great earthquakes. (2) Continuous significant changes of the multiple-scale gravity fields, as well as greater gravity changes with larger time scales, can be regarded as medium-range precursors of large earthquakes. The subsequent large earthquakes always occur in the area where the gravity changes greatly. (3) The spatial-temporal gravity changes are very useful in determining the epicenter of coming large earthquakes. The large gravity networks are useful to determine the general areas of coming large earthquakes. However, the local gravity networks with high spatial-temporal resolution are suitable for determining the location of epicenters. Therefore, denser gravity observation networks are necessary for better forecasts of the epicenters of large earthquakes. (4) Using gravity changes from mobile observation data, we made medium-range forecasts of the Kangding, Ludian, Lushan, and Wenchuan earthquakes, with especially successful forecasts of the location of their epicenters. Based on the above discussions, we emphasize that medium-/long-term potential for large earthquakes might exist nowadays in some areas with significant gravity anomalies in the study region. Thus, the monitoring

  8. Maximum Magnitude and Probabilities of Induced Earthquakes in California Geothermal Fields: Applications for a Science-Based Decision Framework

    NASA Astrophysics Data System (ADS)

    Weiser, Deborah Anne

    Induced seismicity is occurring at increasing rates around the country. Brodsky and Lajoie (2013) and others have recognized anthropogenic quakes at a few geothermal fields in California. I use three techniques to assess if there are induced earthquakes in California geothermal fields; there are three sites with clear induced seismicity: Brawley, The Geysers, and Salton Sea. Moderate to strong evidence is found at Casa Diablo, Coso, East Mesa, and Susanville. Little to no evidence is found for Heber and Wendel. I develop a set of tools to reduce or cope with the risk imposed by these earthquakes, and also to address uncertainties through simulations. I test if an earthquake catalog may be bounded by an upper magnitude limit. I address whether the earthquake record during pumping time is consistent with the past earthquake record, or if injection can explain all or some of the earthquakes. I also present ways to assess the probability of future earthquake occurrence based on past records. I summarize current legislation for eight states where induced earthquakes are of concern. Unlike tectonic earthquakes, the hazard from induced earthquakes has the potential to be modified. I discuss direct and indirect mitigation practices. I present a framework with scientific and communication techniques for assessing uncertainty, ultimately allowing more informed decisions to be made.
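
    One common way to "assess the probability of future earthquake occurrence based on past records", as the abstract puts it, is a Poisson occurrence model. The rate and forecast window below are hypothetical numbers for illustration, not values from the dissertation:

```python
import math

# P(at least one event in t years) under a Poisson model whose rate is
# estimated from the past catalog. Numbers are hypothetical.
rate_per_yr = 0.2   # observed events per year above some magnitude (assumed)
t = 10.0            # forecast window in years (assumed)
p_any = 1.0 - math.exp(-rate_per_yr * t)
print(f"P(at least one event in {t:.0f} yr) = {p_any:.2f}")
```

    The same expression, with the rate split into background and injection-related components, is one simple way to ask whether pumping-period seismicity is consistent with the pre-pumping record.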

  9. Evaluation and implementation of an improved methodology for earthquake ground response analysis : uniform treatment source, path and site effects.

    DOT National Transportation Integrated Search

    2008-12-01

    Shortly after the 1994 Northridge Earthquake, Caltrans geotechnical engineers charged with developing site-specific : response spectra for high priority California bridges initiated a research project aimed at broadening their perspective : from simp...

  10. A Language for the Analysis of Disciplinary Boundary Crossing: Insights from Engineering Problem-Solving Practice

    ERIC Educational Resources Information Center

    Wolff, Karin

    2018-01-01

    Poor graduate throughput and industry feedback on graduate inability to cope with the complex knowledge practices in twenty-first century engineering "problem solving" have placed pressure on educators to better conceptualise the theory-practice relationship, particularly in technology-dependent professions. The research draws on the…

  11. Geotechnical reconnaissance of the 2002 Denali fault, Alaska, earthquake

    USGS Publications Warehouse

    Kayen, R.; Thompson, E.; Minasian, D.; Moss, R.E.S.; Collins, B.D.; Sitar, N.; Dreger, D.; Carver, G.

    2004-01-01

    The 2002 M7.9 Denali fault earthquake resulted in 340 km of ruptures along three separate faults, causing widespread liquefaction in the fluvial deposits of the alpine valleys of the Alaska Range and eastern lowlands of the Tanana River. Areas affected by liquefaction are largely confined to Holocene alluvial deposits, man-made embankments, and backfills. Liquefaction damage, sparse in the western region surrounding the fault rupture, was abundant and severe along the eastern rivers: the Robertson, Slana, Tok, Chisana, Nabesna, and Tanana. Synthetic seismograms from a kinematic source model suggest that the eastern region of the rupture zone had elevated strong-motion levels due to rupture directivity, supporting observations of elevated geotechnical damage. We use augered soil samples and shear-wave velocity profiles made with a portable apparatus for the spectral analysis of surface waves (SASW) to characterize soil properties and stiffness at liquefaction sites and three trans-Alaska pipeline pump station accelerometer locations. © 2004, Earthquake Engineering Research Institute.

  12. Overestimation of the earthquake hazard along the Himalaya: constraints in bracketing of medieval earthquakes from paleoseismic studies

    NASA Astrophysics Data System (ADS)

    Arora, Shreya; Malik, Javed N.

    2017-12-01

    The Himalaya is one of the most seismically active regions of the world. The occurrence of several large magnitude earthquakes, viz. the 1905 Kangra earthquake (Mw 7.8), 1934 Bihar-Nepal earthquake (Mw 8.2), 1950 Assam earthquake (Mw 8.4), 2005 Kashmir earthquake (Mw 7.6), and 2015 Gorkha earthquake (Mw 7.8), is testimony to ongoing tectonic activity. In the last few decades, tremendous efforts have been made along the Himalayan arc to understand the patterns of earthquake occurrence, size, extent, and return period. Some of the large magnitude earthquakes produced surface rupture, while others remained blind. Furthermore, due to the incompleteness of the earthquake catalogue, very few events can be correlated with medieval earthquakes. Based on the existing paleoseismic data, it is difficult to precisely determine the extent of surface rupture of these earthquakes, as well as of those that occurred during historic times. In this paper, we have compiled the paleoseismological data and recalibrated the radiocarbon ages from the trenches excavated by previous workers along the entire Himalaya, and compared the resulting earthquake scenario with the past record. Our studies suggest that there were multiple earthquake events with overlapping surface ruptures in small patches, with an average rupture length of 300 km limiting magnitudes to Mw 7.8-8.0 for the Himalayan arc, rather than two or three giant earthquakes rupturing the whole front. The large magnitude Himalayan earthquakes, such as 1905 Kangra, 1934 Bihar-Nepal, and 1950 Assam, occurred within a time frame of 45 years. If such events were dated with an uncertainty of ±50 years, they could be mistaken for the remnants of one giant earthquake rupturing the entire Himalayan arc, leading to an overestimation of the seismic hazard scenario in the Himalaya.

  13. Simulating Earthquake Early Warning Systems in the Classroom as a New Approach to Teaching Earthquakes

    NASA Astrophysics Data System (ADS)

    D'Alessio, M. A.

    2010-12-01

    A discussion of P- and S-waves seems a ubiquitous part of studying earthquakes in the classroom. Textbooks from middle school through university level typically define the differences between the waves and illustrate the sense of motion. While many students successfully memorize the differences between wave types (often utilizing the first letter as a memory aid), textbooks rarely give tangible examples of how the two waves would "feel" to a person sitting on the ground. One reason for introducing the wave types is to explain how to calculate earthquake epicenters using seismograms and travel time charts -- very abstract representations of earthquakes. Even when the skill is mastered using paper-and-pencil activities or one of the excellent online interactive versions, locating an epicenter simply does not excite many of our students because it evokes little emotional impact, even in students located in earthquake-prone areas. Despite these limitations, huge numbers of students are mandated to complete the task. At the K-12 level, California requires that all students be able to locate earthquake epicenters in Grade 6; in New York, the skill is a required part of the Regent's Examination. Recent innovations in earthquake early warning systems around the globe give us the opportunity to address the same content standard, but with substantially more emotional impact on students. I outline a lesson about earthquakes focused on earthquake early warning systems. The introductory activities include video clips of actual earthquakes and emphasize the differences between the way P- and S-waves feel when they arrive (P arrives first, but is weaker). I include an introduction to the principle behind earthquake early warning (including a summary of possible uses of a few seconds warning about strong shaking) and show examples from Japan. Students go outdoors to simulate P-waves, S-waves, and occupants of two different cities who are talking to one another on cell phones
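
    The early-warning principle the lesson builds on, that the weak P wave outruns the damaging S wave, can be quantified in a few lines. The wave speeds are typical crustal values and the distance is hypothetical:

```python
# The P wave's head start over the S wave grows with distance from the
# epicenter; that gap is the usable warning window.
vp, vs = 6.0, 3.5        # km/s, typical crustal P- and S-wave speeds (assumed)
distance = 100.0         # km from epicenter to a city (hypothetical)
t_p = distance / vp      # P-wave arrival time
t_s = distance / vs      # S-wave arrival time
warning = t_s - t_p      # seconds between "detect" and "shake"
print(f"P at {t_p:.1f} s, S at {t_s:.1f} s -> ~{warning:.1f} s of warning")
```

    In a real system the warning is even longer for distant sites, since the alert travels at the speed of radio rather than the speed of the P wave.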

  14. Demand surge following earthquakes

    USGS Publications Warehouse

    Olsen, Anna H.

    2012-01-01

    Demand surge is understood to be a socio-economic phenomenon where repair costs for the same damage are higher after large- versus small-scale natural disasters. It has reportedly increased monetary losses by 20 to 50%. In previous work, a model for the increased costs of reconstruction labor and materials was developed for hurricanes in the Southeast United States. The model showed that labor cost increases, rather than the material component, drove the total repair cost increases, and this finding could be extended to earthquakes. A study of past large-scale disasters suggested that there may be additional explanations for demand surge. Two such explanations specific to earthquakes are the exclusion of insurance coverage for earthquake damage and possible concurrent causation of damage from an earthquake followed by fire or tsunami. Additional research into these aspects might provide a better explanation for increased monetary losses after large- vs. small-scale earthquakes.

  15. Intermediate-term earthquake prediction

    USGS Publications Warehouse

    Knopoff, L.

    1990-01-01

    The problems in predicting earthquakes have been attacked by phenomenological methods from pre-historic times to the present. The associations of presumed precursors with large earthquakes often have been remarked upon. The difficulty in identifying whether such correlations are due to chance coincidence or are real precursors is that usually one notes the associations only in the relatively short time intervals before the large events. Only rarely, if ever, is notice taken of whether the presumed precursor is to be found in the rather long intervals that follow large earthquakes, or in fact is absent in these post-earthquake intervals. If there are enough examples, the presumed correlation fails as a precursor in the former case, while in the latter case the precursor would be verified. Unfortunately, the observer is usually not concerned with the 'uninteresting' intervals that have no large earthquakes.

  16. Lessons learned from the 2016 Kumamoto earthquake: Building damages and behavior of seismically isolated buildings

    NASA Astrophysics Data System (ADS)

    Morita, Keiko; Takayama, Mineo

    2017-10-01

    Powerful earthquakes struck Kumamoto and Oita Prefectures in Kyushu, Japan, beginning with the magnitude 6.5 foreshock at 21:26 JST on 14 April, followed by the magnitude 7.3 mainshock at 1:25 JST on 16 April 2016. The sequence also included more than 1,700 perceptible earthquakes as of 13 June. The entire sequence was named the 2016 Kumamoto earthquake by the Japan Meteorological Agency. Thousands of buildings and many roads were damaged, and landslides occurred. The Japanese building standard law was revised in 1981, and structural damage was concentrated in buildings constructed prior to that revision. The areas of Mashiki and southern Aso were the most badly affected; wooden houses in particular were severely damaged. In Japan, Prof. Hideyuki Tada (title at the time) undertook research on laminated rubber bearings in 1978 and put them into practical use in 1981. A single-family house at Yachiyodai, Chiba Prefecture, completed in 1983, was the first seismically isolated building in Japan to use laminated rubber bearings. Afterward, the system was gradually adopted mainly for buildings such as research laboratories, hospitals, computer centers, and other offices. In the 1994 Northridge, 1995 Kobe, and 2011 Tohoku earthquakes, seismically isolated buildings demonstrated good performance, and their numbers have since increased, mainly in areas of high earthquake risk. Many people believed that Kumamoto was a low-risk area, yet there were 24 seismically isolated buildings in Kumamoto Prefecture at the time. These buildings performed excellently during the earthquakes, protecting people, buildings, and other important facilities from damage. The purpose of this paper is to discuss lessons learned from the 2016 Kumamoto earthquake and the behavior of seismically isolated buildings during it.

  17. Earthquakes on Your Dinner Table

    NASA Astrophysics Data System (ADS)

    Alexeev, N. A.; Tape, C.; Alexeev, V. A.

    2016-12-01

    Earthquakes have interesting physics applicable to other phenomena, like the propagation of waves; they also affect human lives. This study focused on three questions: how depth, distance from the epicenter, and ground hardness affect earthquake strength. The experimental setup consisted of a gelatin slab to simulate the crust. The slab was hit with a weight and the earthquake amplitude was measured. It was found that earthquake amplitude was larger when the epicenter was deeper, which contradicts observations and probably was an artifact of the design. Earthquake strength was inversely proportional to the distance from the epicenter, which generally follows reality. Soft and medium jello were implanted into hard jello. It was found that earthquakes are stronger in softer jello, a result of resonant amplification in soft ground. Similar results are found in Minto Flats, where earthquakes are stronger and last longer than in the nearby hills. Earthquake waveforms from Minto Flats showed that the oscillations there have longer periods than in the nearby hills with harder soil. Two gelatin pieces with identical shapes and different hardness were vibrated on a platform at varying frequencies to demonstrate that their resonant frequencies are statistically different. This phenomenon also occurs in Yukon Flats.

  18. Recent Mega-Thrust Tsunamigenic Earthquakes and PTHA

    NASA Astrophysics Data System (ADS)

    Lorito, S.

    2013-05-01

    The occurrence of several mega-thrust tsunamigenic earthquakes in the last decade, including but not limited to the 2004 Sumatra-Andaman, the 2010 Maule, and the 2011 Tohoku earthquakes, has been a dramatic reminder of the limitations in our capability of assessing earthquake and tsunami hazard and risk. However, increasingly high-quality geophysical observational networks have allowed the retrieval of more accurate rupture models of mega-thrust earthquakes than ever before, paving the way for future improved hazard assessments. Probabilistic Tsunami Hazard Analysis (PTHA) methodology, in particular, is less mature than its seismic counterpart, PSHA. Recent research efforts of the worldwide tsunami science community have started to fill this gap, and to define some best practices that are being progressively employed in PTHA for different regions and coasts at threat. In the first part of my talk, I will briefly review some rupture models of recent mega-thrust earthquakes and highlight some of their surprising features, which likely result in bigger error bars associated with PTHA results. More specifically, recent events of unexpected size at a given location, and with unexpected rupture process features, posed first-order open questions which prevent the definition of a heterogeneous rupture probability along a subduction zone, despite several recent promising results on the subduction zone seismic cycle. In the second part of the talk, I will dig a bit more into a specific ongoing effort to improve PTHA methods, in particular as regards the determination of epistemic and aleatory uncertainties, and the computational feasibility of PTHA when the full assumed source variability is considered. Usually only logic trees are made explicit in PTHA studies, accounting for different possible assumptions on the source zone properties and behavior.
    The selection of the earthquakes to be actually modelled is then in general made on a qualitative basis or remains implicit.
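
    The logic-tree treatment of epistemic uncertainty mentioned here can be sketched in a few lines; the branch weights and exceedance probabilities below are invented purely for illustration:

```python
# A logic tree assigns weights to alternative modelling assumptions; the
# mean hazard is the weighted combination of the branch results.
branches = [                 # (weight, P(tsunami height > threshold in 50 yr))
    (0.5, 0.02),             # preferred source model (hypothetical)
    (0.3, 0.05),             # alternative magnitude-frequency model
    (0.2, 0.10),             # alternative rupture geometry
]
assert abs(sum(w for w, _ in branches) - 1.0) < 1e-12  # weights sum to 1
mean_p = sum(w * p for w, p in branches)
print(f"weighted mean exceedance probability: {mean_p:.3f}")
```

    The spread among branch values, as opposed to the weighted mean, is one way to report the epistemic error bars the talk refers to.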

  19. Practical and Efficient Searching in Proteomics: A Cross Engine Comparison

    PubMed Central

    Paulo, Joao A.

    2014-01-01

    Background Analysis of large datasets produced by mass spectrometry-based proteomics relies on database search algorithms to sequence peptides and identify proteins. Several such scoring methods are available, each based on different statistical foundations and thereby not producing identical results. Here, the aim is to compare peptide and protein identifications using multiple search engines and examine the additional proteins gained by increasing the number of technical replicate analyses. Methods A HeLa whole cell lysate was analyzed on an Orbitrap mass spectrometer for 10 technical replicates. The data were combined and searched using Mascot, SEQUEST, and Andromeda. Comparisons were made of peptide and protein identifications among the search engines. In addition, searches using each engine were performed with incrementing number of technical replicates. Results The number and identity of peptides and proteins differed across search engines. For all three search engines, the differences in proteins identifications were greater than the differences in peptide identifications indicating that the major source of the disparity may be at the protein inference grouping level. The data also revealed that analysis of 2 technical replicates can increase protein identifications by up to 10-15%, while a third replicate results in an additional 4-5%. Conclusions The data emphasize two practical methods of increasing the robustness of mass spectrometry data analysis. The data show that 1) using multiple search engines can expand the number of identified proteins (union) and validate protein identifications (intersection), and 2) analysis of 2 or 3 technical replicates can substantially expand protein identifications. Moreover, information can be extracted from a dataset by performing database searching with different engines and performing technical repeats, which requires no additional sample preparation and effectively utilizes research time and effort. PMID:25346847
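
    The union/intersection strategy in the conclusions maps directly onto set operations; the protein accessions below are hypothetical placeholders, not data from the study:

```python
# Hypothetical protein identifications from three search engines.
mascot    = {"P1", "P2", "P3", "P5"}
sequest   = {"P1", "P2", "P4"}
andromeda = {"P1", "P3", "P4", "P6"}

expanded  = mascot | sequest | andromeda   # union: more total identifications
validated = mascot & sequest & andromeda   # intersection: cross-validated hits
print(f"union: {len(expanded)} proteins, intersection: {len(validated)}")
```

    The same pattern applies at the peptide level, where the abstract notes the inter-engine differences are smaller than at the protein-inference level.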

  20. New Developments of Computational Fluid Dynamics and Their Applications to Practical Engineering Problems

    NASA Astrophysics Data System (ADS)

    Chen, Hudong

    2001-06-01

    There have been considerable advances in Lattice Boltzmann (LB) based methods in the last decade. By now, the fundamental concept of using the approach as an alternative tool for computational fluid dynamics (CFD) has been substantially appreciated and validated in mainstream scientific research and in industrial engineering communities. Lattice Boltzmann based methods possess several major advantages: a) less numerical dissipation due to the linear Lagrange type advection operator in the Boltzmann equation; b) local dynamic interactions suitable for highly parallel processing; c) physical handling of boundary conditions for complicated geometries and accurate control of fluxes; d) microscopically consistent modeling of thermodynamics and of interface properties in complex multiphase flows. It provides a great opportunity to apply the method to practical engineering problems encountered in a wide range of industries from automotive, aerospace to chemical, biomedical, petroleum, nuclear, and others. One of the key challenges is to extend the applicability of this alternative approach to regimes of highly turbulent flows commonly encountered in practical engineering situations involving high Reynolds numbers. Over the past ten years, significant efforts have been made on this front at Exa Corporation in developing a lattice Boltzmann based commercial CFD software, PowerFLOW. It has become a useful computational tool for the simulation of turbulent aerodynamics in practical engineering problems involving extremely complex geometries and flow situations, such as in new automotive vehicle designs worldwide. In this talk, we present an overall LB based algorithm concept along with certain key extensions in order to accurately handle turbulent flows involving extremely complex geometries. To demonstrate the accuracy of turbulent flow simulations, we provide a set of validation results for some well known academic benchmarks. These include straight channels, backward
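
    The local stream-and-collide cycle underlying all lattice Boltzmann methods (and behind the parallelism advantage noted above) can be caricatured in one dimension. This toy D1Q2 lattice with BGK collision relaxes a density spike by diffusion; it illustrates the general idea only and is in no way a sketch of PowerFLOW's solver:

```python
import numpy as np

# Two populations per site: right-movers f[0] and left-movers f[1].
n, tau, steps = 64, 1.0, 100
rho = np.zeros(n)
rho[n // 2] = 1.0                       # initial density spike
f = np.stack([rho / 2, rho / 2])        # start at zero-velocity equilibrium
for _ in range(steps):
    rho = f.sum(axis=0)                 # macroscopic density
    feq = np.stack([rho / 2, rho / 2])  # zero-velocity equilibrium
    f += (feq - f) / tau                # BGK collision (locally conserves mass)
    f[0] = np.roll(f[0], 1)             # stream right-movers one site
    f[1] = np.roll(f[1], -1)            # stream left-movers one site
rho = f.sum(axis=0)
print(f"total mass {rho.sum():.3f}, peak density {rho.max():.3f}")
```

    Both steps touch only a site and its neighbors, which is why the scheme parallelizes so well; real LB solvers use richer velocity sets (e.g. D3Q19) and equilibria that recover the Navier-Stokes equations.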

  1. 2010 Chile Earthquake Aftershock Response

    NASA Astrophysics Data System (ADS)

    Barientos, Sergio

    2010-05-01

    The Mw=8.8 earthquake off the coast of Chile on 27 February 2010 is the 5th largest megathrust earthquake ever recorded and provides an unprecedented opportunity to advance our understanding of megathrust earthquakes and associated phenomena. The 2010 Chile earthquake ruptured the Concepcion-Constitucion segment of the Nazca/South America plate boundary, south of the Central Chile region, and triggered a tsunami along the coast. Following the 2010 earthquake, a very energetic aftershock sequence is being observed in an area that extends 600 km along strike, from Valparaiso to 150 km south of Concepcion. Within the first three weeks there were over 260 aftershocks with magnitude 5.0 or greater and 18 with magnitude 6.0 or greater (NEIC, USGS). The Concepcion-Constitucion segment lies immediately north of the rupture zone associated with the great magnitude 9.5 Chile earthquake, and south of the 1906 and 1985 Valparaiso earthquakes. The last great subduction earthquake in the region dates back to the February 1835 event described by Darwin (1871). Since 1835, part of the region was affected in the north by the Talca earthquake in December 1928, interpreted as a shallow-dipping thrust event, and by the Chillan earthquake (Mw 7.9, January 1939), a slab-pull intermediate-depth earthquake. For the last 30 years, geodetic studies in this area were consistent with fully coupled elastic loading of the subduction interface at depth; this led to the identification of the area as a mature seismic gap with potential for an earthquake of magnitude on the order of 8.5, or several earthquakes of lesser magnitude. What was less expected was the partial rupturing of the 1985 segment toward the north. Today, the 2010 earthquake raises some disturbing questions: Why and how did the rupture terminate where it did at the northern end? How did the 2010 earthquake load the adjacent segment to the north, and did the 1985 earthquake only partially rupture the plate interface, leaving loaded asperities since

  2. Investigation of the current requirements engineering practices among software developers at the Universiti Utara Malaysia Information Technology (UUMIT) centre

    NASA Astrophysics Data System (ADS)

    Hussain, Azham; Mkpojiogu, Emmanuel O. C.; Abdullah, Inam

    2016-08-01

    Requirements Engineering (RE) is a systemic and integrated process of eliciting, elaborating, negotiating, validating and managing the requirements of a system in a software development project. UUM has been supported by various systems developed and maintained by the UUM Information Technology (UUMIT) Centre. The aim of this study was to assess the current requirements engineering practices at UUMIT. The main problem that prompted this research is the lack of studies that support software development activities at the UUMIT. The study is geared toward helping UUMIT produce quality, yet time- and cost-saving, software products by implementing cutting-edge and state-of-the-art requirements engineering practices. The study also contributes to UUM by identifying the activities needed for software development, so that management will be able to allocate budget to provide adequate and precise training for the software developers. Three variables were investigated: Requirement Description, Requirements Development (comprising Requirements Elicitation, Requirements Analysis and Negotiation, and Requirements Validation), and Requirement Management. The results from the study showed that the current practice of requirements engineering at UUMIT is encouraging, but still needs further development and improvement because a few RE practices were seldom practiced.

  3. Sensing the earthquake

    NASA Astrophysics Data System (ADS)

    Bichisao, Marta; Stallone, Angela

    2017-04-01

    Making science visual plays a crucial role in the process of building knowledge. In this view, art can considerably facilitate the representation of the scientific content, by offering a different perspective on how a specific problem could be approached. Here we explore the possibility of presenting the earthquake process through visual dance. From a choreographer's point of view, the focus is always on the dynamic relationships between moving objects. The observed spatial patterns (coincidences, repetitions, double and rhythmic configurations) suggest how objects organize themselves in the environment and what principles underlie that organization. The identified set of rules is then implemented as a basis for the creation of a complex rhythmic and visual dance system. Recently, scientists have turned seismic waves into sound and animations, introducing the possibility of "feeling" the earthquakes. We try to implement these results into a choreographic model with the aim of converting earthquake sound to a visual dance system, which could return a transmedia representation of the earthquake process. In particular, we focus on a possible method to translate and transfer the metric language of seismic sound and animations into body language. The objective is to involve the audience in a multisensory exploration of the earthquake phenomenon, through the stimulation of hearing, eyesight and perception of movement (neuromotor system). In essence, the main goal of this work is to develop a method for a simultaneous visual and auditory representation of a seismic event by means of a structured choreographic model. This artistic representation could provide an original entryway into the physics of earthquakes.

  4. Discrepancy between earthquake rates implied by historic earthquakes and a consensus geologic source model for California

    USGS Publications Warehouse

    Petersen, M.D.; Cramer, C.H.; Reichle, M.S.; Frankel, A.D.; Hanks, T.C.

    2000-01-01

    We examine the difference between expected earthquake rates inferred from the historical earthquake catalog and from the geologic data that were used to develop the consensus seismic source characterization for the state of California [California Department of Conservation, Division of Mines and Geology (CDMG) and U.S. Geological Survey (USGS); Petersen et al., 1996; Frankel et al., 1996]. On average, the historic earthquake catalog and the seismic source model both indicate about one M 6 or greater earthquake per year in the state of California. However, the rates of earthquakes with magnitudes (M) between 6 and 7 in this seismic source model are higher, by at least a factor of 2, than the mean historic earthquake rates for both southern and northern California. The earthquake rate discrepancy results from a seismic source model that includes earthquakes with characteristic (maximum) magnitudes that are primarily between M 6.4 and 7.1. Many of these faults are interpreted to accommodate high strain rates from geologic and geodetic data but have not ruptured in large earthquakes during historic time. Our sensitivity study indicates that the rate differences between magnitudes 6 and 7 can be reduced by adjusting the magnitude-frequency distribution of the source model to reflect more characteristic behavior, by decreasing the moment rate available for seismogenic slip along faults, by increasing the maximum magnitude of the earthquake on a fault, or by decreasing the maximum magnitude of the background seismicity. However, no single parameter can be adjusted, consistent with scientific consensus, to eliminate the earthquake rate discrepancy. Applying a combination of these parametric adjustments yields an alternative earthquake source model that is more compatible with the historic data.
The 475-year return period hazard for peak ground and 1-sec spectral acceleration resulting from this alternative source model differs from the hazard resulting from the
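
    The magnitude-bin rate comparison at the heart of this abstract can be illustrated with a Gutenberg-Richter relation, log10 N(>=M) = a - b*M. The a and b values below are round illustrative numbers, chosen only so that the model yields about one M >= 6 event per year as the abstract states for California:

```python
# Cumulative annual rate of earthquakes at or above magnitude m under an
# assumed Gutenberg-Richter relation.
a, b = 6.0, 1.0               # illustrative parameters, not from the paper

def cum_rate(m):
    return 10.0 ** (a - b * m)

rate_m6_plus = cum_rate(6.0)                  # events/yr with M >= 6
rate_m6_to_7 = cum_rate(6.0) - cum_rate(7.0)  # incremental M 6-7 rate
print(f"M>=6: {rate_m6_plus:.1f}/yr, M 6-7: {rate_m6_to_7:.1f}/yr")
```

    A source model dominated by characteristic M 6.4-7.1 ruptures concentrates rate in exactly this incremental bin, which is how the factor-of-2 discrepancy with the catalog can arise even when the cumulative M >= 6 rates agree.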

  5. Earthquake scenario and probabilistic ground-shaking hazard maps for the Albuquerque-Belen-Santa Fe, New Mexico, corridor

    USGS Publications Warehouse

    Wong, I.; Olig, S.; Dober, M.; Silva, W.; Wright, D.; Thomas, P.; Gregor, N.; Sanford, A.; Lin, K.-W.; Love, D.

    2004-01-01

    These maps are not intended to be a substitute for site-specific studies for engineering design nor to replace standard maps commonly referenced in building codes. Rather, we hope that these maps will be used as a guide by government agencies; the engineering, urban planning, emergency preparedness, and response communities; and the general public as part of an overall program to reduce earthquake risk and losses in New Mexico.

  6. The Future of Metabolic Engineering and Synthetic Biology: Towards a Systematic Practice

    PubMed Central

    Yadav, Vikramaditya G.; De Mey, Marjan; Lim, Chin Giaw; Ajikumar, Parayil Kumaran; Stephanopoulos, Gregory

    2012-01-01

    Industrial biotechnology promises to revolutionize conventional chemical manufacturing in the years ahead, largely owing to the excellent progress in our ability to re-engineer cellular metabolism. However, most successes of metabolic engineering have been confined to over-producing natively synthesized metabolites in E. coli and S. cerevisiae. A major reason for this development has been the descent of metabolic engineering, particularly secondary metabolic engineering, to a collection of demonstrations rather than a systematic practice with generalizable tools. Synthetic biology, a more recent development, faces similar criticisms. Herein, we attempt to lay down a framework around which bioreaction engineering can systematize itself just like chemical reaction engineering. Central to this undertaking is a new approach to engineering secondary metabolism known as ‘multivariate modular metabolic engineering’ (MMME), whose novelty lies in its assessment and elimination of regulatory and pathway bottlenecks by re-defining the metabolic network as a collection of distinct modules. After introducing the core principles of MMME, we shall then present a number of recent developments in secondary metabolic engineering that could potentially serve as its facilitators. It is hoped that the ever-declining costs of de novo gene synthesis; the improved use of bioinformatic tools to mine, sort and analyze biological data; and the increasing sensitivity and sophistication of investigational tools will make the maturation of microbial metabolic engineering an autocatalytic process. Encouraged by these advances, research groups across the world would take up the challenge of secondary metabolite production in simple hosts with renewed vigor, thereby adding to the range of products synthesized using metabolic engineering. PMID:22629571

  7. Earthquake Fingerprints: Representing Earthquake Waveforms for Similarity-Based Detection

    NASA Astrophysics Data System (ADS)

    Bergen, K.; Beroza, G. C.

    2016-12-01

    New earthquake detection methods, such as Fingerprint and Similarity Thresholding (FAST), use fast approximate similarity search to identify similar waveforms in long-duration data without templates (Yoon et al. 2015). These methods have two key components: fingerprint extraction and an efficient search algorithm. Fingerprint extraction converts waveforms into fingerprints, compact signatures that represent short-duration waveforms for identification and search. Earthquakes are detected using an efficient indexing and search scheme, such as locality-sensitive hashing, that identifies similar waveforms in a fingerprint database. The quality of the search results, and thus the earthquake detection results, is strongly dependent on the fingerprinting scheme. Fingerprint extraction should map similar earthquake waveforms to similar waveform fingerprints to ensure a high detection rate, even under additive noise and small distortions. Additionally, fingerprints corresponding to noise intervals should be mutually dissimilar to minimize false detections. In this work, we compare the performance of multiple fingerprint extraction approaches for the earthquake waveform similarity search problem. We apply existing audio fingerprinting (used in content-based audio identification systems) and time series indexing techniques and present modified versions that are specifically adapted for seismic data. We also explore data-driven fingerprinting approaches that can take advantage of labeled or unlabeled waveform data. For each fingerprinting approach we measure its ability to identify similar waveforms in a low signal-to-noise setting, and quantify the trade-off between true and false detection rates in the presence of persistent noise sources. We compare the performance using known event waveforms from eight independent stations in the Northern California Seismic Network.
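    The fingerprint-then-search idea above can be sketched with a toy binarized spectrogram compared by Jaccard similarity. This is only an illustrative stand-in: FAST itself builds fingerprints from spectral images via wavelet transforms and top-amplitude thresholding, and searches them with min-hash-based locality-sensitive hashing; the windowing, median thresholding, and synthetic waveforms below are simplifications invented for this sketch.

    ```python
    import numpy as np

    def fingerprint(waveform, win=32, step=16):
        """Toy binary fingerprint: per-frame amplitude spectra, binarized by
        keeping bins louder than the frame median (a crude stand-in for the
        wavelet-based top-amplitude thresholding of Yoon et al. 2015)."""
        frames = [waveform[i:i + win] for i in range(0, len(waveform) - win + 1, step)]
        spec = np.abs(np.fft.rfft(frames, axis=1))
        return (spec > np.median(spec, axis=1, keepdims=True)).ravel()

    def jaccard(a, b):
        """Similarity between two binary fingerprints."""
        return np.count_nonzero(a & b) / np.count_nonzero(a | b)

    rng = np.random.default_rng(0)
    event = np.sin(np.linspace(0, 60, 512)) * np.hanning(512)   # synthetic "event"
    noisy = event + 0.1 * rng.standard_normal(512)              # same event, distorted
    noise = rng.standard_normal(512)                            # unrelated noise interval

    fp_e, fp_n, fp_x = (fingerprint(w) for w in (event, noisy, noise))
    # similar waveforms should map to similar fingerprints:
    print(jaccard(fp_e, fp_n) > jaccard(fp_e, fp_x))            # True
    ```

    In a real detector, the binary fingerprints would be indexed with locality-sensitive hashing so that near-duplicates can be found without comparing every pair.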

  8. Earthquake-related versus non-earthquake-related injuries in spinal injury patients: differentiation with multidetector computed tomography

    PubMed Central

    2010-01-01

    Introduction In recent years, several massive earthquakes have occurred across the globe. Multidetector computed tomography (MDCT) is reliable in detecting spinal injuries. The purpose of this study was to compare the features of spinal injuries resulting from the Sichuan earthquake with those of non-earthquake-related spinal trauma using MDCT. Methods Features of spinal injuries of 223 Sichuan earthquake-exposed patients and 223 non-earthquake-related spinal injury patients were retrospectively compared using MDCT. Data for the non-earthquake-related spinal injury patients were collected from 1 May 2009 to 22 July 2009 to avoid the confounding effects of seasonal activity and clothing. We focused on anatomic sites, injury types and neurologic deficits related to spinal injuries. Major injuries were classified according to the grid 3-3-3 scheme of the Magerl (AO) classification system. Results A total of 185 patients (82.96%) in the earthquake-exposed cohort experienced crush injuries. In the earthquake and control groups, 65 and 92 patients, respectively, had neurologic deficits. The anatomic distribution of these two cohorts was significantly different (P < 0.001). Cervical spinal injuries were more common in the control group (risk ratio (RR) = 2.12, P < 0.001), whereas lumbar spinal injuries were more common in the earthquake-related spinal injuries group (277 of 501 injured vertebrae; 55.29%). The major types of injuries were significantly different between these cohorts (P = 0.002). Magerl AO type A lesions accounted for most of the lesions in both cohorts. Type B lesions were more frequently seen in earthquake-related spinal injuries (RR = 1.27), while we observed type C lesions more frequently in subjects with non-earthquake-related spinal injuries (RR = 1.98, P = 0.0029). Conclusions Spinal injuries sustained in the Sichuan earthquake were located mainly in the lumbar spine, with a peak prevalence of type A lesions and a high occurrence of
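    The risk ratios quoted above follow the standard two-cohort definition, RR = (a/n1)/(c/n0), usually reported with a confidence interval from the log-normal approximation. A minimal sketch, using hypothetical counts rather than the study's actual tables:

    ```python
    import math

    def risk_ratio(a, n1, c, n0):
        """Risk ratio of cohort 1 (a events out of n1) vs. cohort 2 (c of n0),
        with a 95% CI from the usual log-normal approximation."""
        rr = (a / n1) / (c / n0)
        se = math.sqrt(1 / a - 1 / n1 + 1 / c - 1 / n0)   # SE of log(RR)
        lo, hi = (rr * math.exp(s * 1.96 * se) for s in (-1, 1))
        return rr, lo, hi

    # Hypothetical counts (NOT the study's data): 64/223 cervical injuries in
    # one cohort vs. 30/223 in the other.
    rr, lo, hi = risk_ratio(64, 223, 30, 223)
    print(round(rr, 2))   # 2.13
    ```

    Because both cohorts here have the same size, the ratio reduces to 64/30; with unequal cohorts the per-cohort denominators matter.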

  9. Tweeting Earthquakes using TensorFlow

    NASA Astrophysics Data System (ADS)

    Casarotti, E.; Comunello, F.; Magnoni, F.

    2016-12-01

    The use of social media is emerging as a powerful tool for disseminating trusted information about earthquakes. Since 2009, the Twitter account @INGVterremoti has provided constant and timely details about M2+ seismic events detected by the Italian National Seismic Network, directly connected with the seismologists on duty at Istituto Nazionale di Geofisica e Vulcanologia (INGV). Currently, it keeps more than 150,000 followers updated. Nevertheless, since it publishes only manually revised seismic parameters, its latency (approximately 10 to 20 minutes after an event) has come under evaluation. Undeniably, mobile internet, social network sites and Twitter in particular demand a more rapid, "real-time" reaction. During the last 36 months, INGV tested the tweeting of automatic detections of M3+ earthquakes, studying the reliability of the information both in terms of seismological accuracy and from the point of view of communication and social research. A set of quality parameters (i.e. number of seismic stations, gap, relative error of the location) has been identified to reduce false alarms and the uncertainty of the automatic detection. We present an experiment to further improve the reliability of this process using TensorFlow™ (an open source software library originally developed by researchers and engineers working on the Google Brain Team within Google's Machine Intelligence research organization).
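    The quality-parameter screening described above can be illustrated as a simple rule-based filter deciding whether an automatic detection is solid enough to tweet. The threshold values and field names below are hypothetical, not INGV's operational settings; in the reported experiment a learned TensorFlow model plays a role like this.

    ```python
    def publishable(detection, min_stations=10, max_gap_deg=180.0,
                    max_rel_err=0.5, min_mag=3.0):
        """Rule-based screen for automatic detections before tweeting.
        Threshold values are hypothetical, not INGV's operational settings."""
        return (detection["n_stations"] >= min_stations
                and detection["azimuthal_gap"] <= max_gap_deg
                and detection["rel_location_error"] <= max_rel_err
                and detection["magnitude"] >= min_mag)

    # a well-constrained automatic detection passes the screen
    auto = {"n_stations": 23, "azimuthal_gap": 95.0,
            "rel_location_error": 0.12, "magnitude": 3.4}
    print(publishable(auto))   # True
    ```

    A trained classifier generalizes this idea by learning the decision boundary from past manual revisions instead of hand-picked thresholds.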

  10. Earthquake Emergency Education in Dushanbe, Tajikistan

    ERIC Educational Resources Information Center

    Mohadjer, Solmaz; Bendick, Rebecca; Halvorson, Sarah J.; Saydullaev, Umed; Hojiboev, Orifjon; Stickler, Christine; Adam, Zachary R.

    2010-01-01

    We developed a middle school earthquake science and hazards curriculum to promote earthquake awareness to students in the Central Asian country of Tajikistan. These materials include pre- and post-assessment activities, six science activities describing physical processes related to earthquakes, five activities on earthquake hazards and mitigation…

  11. Intensity earthquake scenario (scenario event - a damaging earthquake with higher probability of occurrence) for the city of Sofia

    NASA Astrophysics Data System (ADS)

    Aleksandrova, Irena; Simeonova, Stela; Solakov, Dimcho; Popova, Maria

    2014-05-01

    Among the many kinds of natural and man-made disasters, earthquakes dominate with regard to their social and economical impact on the urban environment. Global seismic risk is increasing steadily as urbanization and development occupy more areas that are prone to the effects of strong earthquakes. Additionally, the uncontrolled growth of mega cities in highly seismic areas around the world is often associated with the construction of seismically unsafe buildings and infrastructures, and undertaken with an insufficient knowledge of the regional seismicity peculiarities and seismic hazard. The assessment of seismic hazard and generation of earthquake scenarios is the first link in the prevention chain and the first step in the evaluation of the seismic risk. The earthquake scenarios are intended as a basic input for developing detailed earthquake damage scenarios for the cities and can be used in earthquake-safe town and infrastructure planning. The city of Sofia is the capital of Bulgaria. It is situated in the centre of the Sofia area, the most populated (more than 1.2 million inhabitants), industrial and cultural region of Bulgaria, which faces considerable earthquake risk. The available historical documents prove the occurrence of destructive earthquakes during the 15th-18th centuries in the Sofia zone. In the 19th century the city of Sofia experienced two strong earthquakes: the 1818 earthquake with epicentral intensity I0=8-9 MSK and the 1858 earthquake with I0=9-10 MSK. During the 20th century, the strongest event to occur in the vicinity of the city of Sofia was the 1917 earthquake with MS=5.3 (I0=7-8 MSK). Almost a century later (95 years), an earthquake of moment magnitude 5.6 (I0=7-8 MSK) hit the city of Sofia on May 22nd, 2012. In the present study, the deterministic scenario event considered is a damaging earthquake with a higher probability of occurrence that could affect the city with intensity less than or equal to VIII

  12. Earthquake recurrence models fail when earthquakes fail to reset the stress field

    USGS Publications Warehouse

    Tormann, Thessa; Wiemer, Stefan; Hardebeck, Jeanne L.

    2012-01-01

    Parkfield's regularly occurring M6 mainshocks, about every 25 years, have over two decades stoked seismologists' hopes to successfully predict an earthquake of significant size. However, with the longest known inter-event time of 38 years, the latest M6 in the series (28 Sep 2004) did not conform to any of the applied forecast models, once more calling into question the predictability of earthquakes in general. Our study investigates the spatial pattern of b-values along the Parkfield segment through the seismic cycle and documents a stably stressed structure. The forecasted rate of M6 earthquakes based on Parkfield's microseismicity b-values corresponds well to observed rates. We interpret the observed b-value stability in terms of the evolution of the stress field in that area: the M6 Parkfield earthquakes do not fully unload the stress on the fault, explaining why time-recurrence models fail. We present the 1989 M6.9 Loma Prieta earthquake as a counterexample, which did release a significant portion of the stress along its fault segment and yielded a substantial change in b-values.
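    The b-values discussed above are conventionally estimated from microseismicity with Aki's maximum-likelihood formula, b = log10(e) / (mean(M) - Mc) for magnitudes above the completeness threshold Mc. A generic sketch on a synthetic Gutenberg-Richter catalogue (this is the standard estimator, not the specific mapping procedure of the study):

    ```python
    import math
    import random

    def b_value(mags, mc, dm=0.0):
        """Maximum-likelihood b-value (Aki 1965); dm is Utsu's correction for
        magnitudes binned to width dm, zero for continuous values."""
        above = [m for m in mags if m >= mc]
        return math.log10(math.e) / (sum(above) / len(above) - (mc - dm / 2.0))

    # Synthetic Gutenberg-Richter catalogue with b = 1.0 above Mc = 2.0:
    # magnitude excesses are exponential with rate b * ln(10).
    random.seed(1)
    mags = [2.0 + random.expovariate(math.log(10)) for _ in range(5000)]
    print(round(b_value(mags, mc=2.0), 2))   # close to 1.0
    ```

    Mapping this estimator over spatial cells of a catalogue, as in the study, yields the b-value patterns whose temporal stability is the key observation.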

  13. Converting Advances in Seismology into Earthquake Science

    NASA Astrophysics Data System (ADS)

    Hauksson, Egill; Shearer, Peter; Vidale, John

    2004-01-01

    Federal and state agencies and university groups all operate seismic networks in California. The U.S. Geological Survey (USGS) operates seismic networks in California in cooperation with the California Institute of Technology (Caltech) in southern California, and the University of California (UC) at Berkeley in northern California. The California Geological Survey (CGS) and the USGS National Strong Motion Program (NSMP) operate dial-out strong motion instruments in the state, primarily to capture data from large earthquakes for earthquake engineering and, more recently, emergency response. The California Governor's Office of Emergency Services (OES) provides leadership for the most recent project, the California Integrated Seismic Network (CISN), to integrate all of the California efforts, and to take advantage of the emergency response capabilities of the seismic networks. The core members of the CISN are Caltech, UC Berkeley, CGS, USGS Menlo Park, and USGS Pasadena (http://www.cisn.org). New seismic instrumentation is in place across southern California, and significant progress has been made in improving instrumentation in northern California. Since 2001, these new field instrumentation efforts, data sharing, and software development for real-time reporting and archiving have been coordinated through the California Integrated Seismic Network (CISN). The CISN is also the California region of the Advanced National Seismic Network (ANSS). In addition, EarthScope deployments of USArray that will begin in early 2004 in California are coordinated with the CISN. The southern and northern California earthquake data centers (SCEDC and NCEDC) have new capabilities that enable seismologists to obtain large volumes of data with only modest effort.

  14. The Virtual Quake Earthquake Simulator: Earthquake Probability Statistics for the El Mayor-Cucapah Region and Evidence of Predictability in Simulated Earthquake Sequences

    NASA Astrophysics Data System (ADS)

    Schultz, K.; Yoder, M. R.; Heien, E. M.; Rundle, J. B.; Turcotte, D. L.; Parker, J. W.; Donnellan, A.

    2015-12-01

    We introduce a framework for developing earthquake forecasts using Virtual Quake (VQ), the generalized successor to the perhaps better known Virtual California (VC) earthquake simulator. We discuss the basic merits and mechanics of the simulator, and we present several statistics of interest for earthquake forecasting. We also show that, though the system as a whole (in aggregate) behaves quite randomly, (simulated) earthquake sequences limited to specific fault sections exhibit measurable predictability in the form of increasing seismicity precursory to large m > 7 earthquakes. In order to quantify this, we develop an alert-based forecasting metric similar to those presented in Keilis-Borok (2002) and Molchan (1997), and show that it exhibits significant information gain compared to random forecasts. We also discuss the long-standing question of activation- vs. quiescence-type earthquake triggering. We show that VQ exhibits both behaviors separately for independent fault sections; some fault sections exhibit activation-type triggering, while others are better characterized by quiescence-type triggering. We discuss these aspects of VQ specifically with respect to faults in the Salton Basin and near the El Mayor-Cucapah region in southern California, USA, and northern Baja California, Mexico.
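    An alert-based metric of the kind referenced above trades off the fraction of events successfully anticipated against the fraction of time spent on alarm (the Molchan-diagram view). A toy scoring sketch, with hypothetical event and alarm times rather than VQ output:

    ```python
    def alert_performance(event_times, alarm_starts, window, t_total):
        """Hit rate and fraction of time on alarm for an alert-based forecast
        (one point on a Molchan-style trade-off diagram)."""
        hits = sum(any(a <= t <= a + window for a in alarm_starts)
                   for t in event_times)
        # merge possibly overlapping alarm intervals to get total coverage
        covered, end = 0.0, float("-inf")
        for a in sorted(alarm_starts):
            s, e = max(a, end), min(a + window, t_total)
            if e > s:
                covered += e - s
                end = e
        return hits / len(event_times), covered / t_total

    events = [12.0, 37.0, 70.0]   # hypothetical large-event times
    alarms = [10.0, 35.0, 50.0]   # hypothetical alarm start times
    hit_rate, on_alarm = alert_performance(events, alarms, window=5.0, t_total=100.0)
    print(round(hit_rate, 2), on_alarm)   # 0.67 0.15
    ```

    A forecast shows information gain when its hit rate exceeds the time-on-alarm fraction, which is what a random (Poisson) alarm strategy would achieve on average.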

  15. Modelling Psychological Responses to the Great East Japan Earthquake and Nuclear Incident

    PubMed Central

    Goodwin, Robin; Takahashi, Masahito; Sun, Shaojing; Gaines, Stanley O.

    2012-01-01

    The Great East Japan (Tōhoku/Kanto) earthquake of March 2011 was followed by a major tsunami and nuclear incident. Several previous studies have suggested a number of psychological responses to such disasters. However, few previous studies have modelled individual differences in the risk perceptions of major events, or the implications of these perceptions for relevant behaviours. We conducted a survey specifically examining responses to the Great Japan earthquake and nuclear incident, with data collected 11–13 weeks following these events. 844 young respondents completed a questionnaire in three regions of Japan: Miyagi (close to the earthquake and leaking nuclear plants), Tokyo/Chiba (approximately 220 km from the nuclear plants), and Western Japan (Yamaguchi and Nagasaki, some 1000 km from the plants). Results indicated significant regional differences in risk perception, with greater concern over earthquake risks in Tokyo than in Miyagi or Western Japan. Structural equation analyses showed that shared normative concerns about earthquake and nuclear risks, conservation values, lack of trust in governmental advice about the nuclear hazard, and poor personal control over the nuclear incident were positively correlated with perceived earthquake and nuclear risks. These risk perceptions further predicted specific outcomes (e.g. modifying homes, avoiding going outside, contemplating leaving Japan). The strength and significance of these pathways varied by region. Mental health and practical implications of these findings are discussed in the light of the continuing uncertainties in Japan following the March 2011 events. PMID:22666380

  16. Earthquake Damage Assessment Using Objective Image Segmentation: A Case Study of 2010 Haiti Earthquake

    NASA Technical Reports Server (NTRS)

    Oommen, Thomas; Rebbapragada, Umaa; Cerminaro, Daniel

    2012-01-01

    In this study, we perform a case study on imagery from the Haiti earthquake that evaluates a novel object-based approach for characterizing earthquake-induced surface effects of liquefaction against a traditional pixel-based change detection technique. Our technique, which combines object-oriented change detection with discriminant/categorical functions, shows the power of distinguishing earthquake-induced surface effects from changes in buildings using the object properties concavity, convexity, orthogonality and rectangularity. Our results suggest that object-based analysis holds promise in automatically extracting earthquake-induced damages from high-resolution aerial/satellite imagery.
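    Shape descriptors like those named above are computed from object outlines after segmentation. A minimal sketch of one of them, rectangularity, using an axis-aligned bounding box as a simplification of the rotated-minimum-rectangle measure typically used in object-based image analysis (the example polygons are invented):

    ```python
    def shoelace_area(pts):
        """Polygon area via the shoelace formula (vertices in order)."""
        pairs = zip(pts, pts[1:] + pts[:1])
        return abs(sum(x1 * y2 - x2 * y1 for (x1, y1), (x2, y2) in pairs)) / 2.0

    def rectangularity(pts):
        """Ratio of polygon area to its bounding-box area: near 1 for intact
        rectangular structures, lower for irregular liquefaction/debris shapes."""
        xs = [x for x, _ in pts]
        ys = [y for _, y in pts]
        return shoelace_area(pts) / ((max(xs) - min(xs)) * (max(ys) - min(ys)))

    building = [(0, 0), (4, 0), (4, 3), (0, 3)]   # intact rectangular footprint
    rubble = [(0, 0), (4, 1), (3, 3), (1, 2)]     # irregular outline
    print(rectangularity(building), rectangularity(rubble))   # 1.0 0.5
    ```

    Feeding several such descriptors (concavity, convexity, orthogonality, rectangularity) into a discriminant function is what separates damage objects from ordinary building change.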

  17. Rupture, waves and earthquakes.

    PubMed

    Uenishi, Koji

    2017-01-01

    Normally, an earthquake is considered a phenomenon of wave energy radiation caused by rupture (fracture) of the solid Earth. However, the physics of the dynamic processes around seismic sources, which may play a crucial role in the occurrence of earthquakes and the generation of strong waves, is not yet fully understood. Instead, much previous investigation in seismology has evaluated earthquake characteristics in terms of kinematics, which does not directly treat such dynamic aspects and usually excludes the influence of high-frequency wave components over 1 Hz. There are countless valuable research outcomes obtained through this kinematics-based approach, but "extraordinary" phenomena that are difficult to explain by this conventional description have been found, for instance, on the occasion of the 1995 Hyogo-ken Nanbu, Japan, earthquake. More detailed study of rupture and wave dynamics, namely the possible mechanical characteristics of (1) rupture development around seismic sources, (2) earthquake-induced structural failures and (3) the wave interaction that connects rupture (1) and failures (2), would therefore be indispensable.

  18. Rupture, waves and earthquakes

    PubMed Central

    UENISHI, Koji

    2017-01-01

    Normally, an earthquake is considered a phenomenon of wave energy radiation caused by rupture (fracture) of the solid Earth. However, the physics of the dynamic processes around seismic sources, which may play a crucial role in the occurrence of earthquakes and the generation of strong waves, is not yet fully understood. Instead, much previous investigation in seismology has evaluated earthquake characteristics in terms of kinematics, which does not directly treat such dynamic aspects and usually excludes the influence of high-frequency wave components over 1 Hz. There are countless valuable research outcomes obtained through this kinematics-based approach, but “extraordinary” phenomena that are difficult to explain by this conventional description have been found, for instance, on the occasion of the 1995 Hyogo-ken Nanbu, Japan, earthquake. More detailed study of rupture and wave dynamics, namely the possible mechanical characteristics of (1) rupture development around seismic sources, (2) earthquake-induced structural failures and (3) the wave interaction that connects rupture (1) and failures (2), would therefore be indispensable. PMID:28077808

  19. The CATDAT damaging earthquakes database

    NASA Astrophysics Data System (ADS)

    Daniell, J. E.; Khazai, B.; Wenzel, F.; Vervaeck, A.

    2011-08-01

    The global CATDAT damaging earthquakes and secondary effects (tsunami, fire, landslides, liquefaction and fault rupture) database was developed to validate, remove discrepancies, and expand greatly upon existing global databases; and to better understand the trends in vulnerability, exposure, and possible future impacts of such historic earthquakes. In the authors' view, the lack of consistency and the errors in other frequently cited earthquake loss databases were major shortcomings that needed to be improved upon. Over 17 000 sources of information have been utilised, primarily in the last few years, to present data from over 12 200 damaging earthquakes historically, with over 7000 earthquakes since 1900 examined and validated before insertion into the database. Each validated earthquake includes seismological information, building damage, ranges of social losses to account for varying sources (deaths, injuries, homeless, and affected), and economic losses (direct, indirect, aid, and insured). Globally, a slightly increasing trend in economic damage due to earthquakes is not consistent with the greatly increasing exposure. Comparison of the 1923 Great Kanto earthquake (214 billion USD damage; 2011 HNDECI-adjusted dollars) with the 2011 Tohoku (>300 billion USD at the time of writing), 2008 Sichuan and 1995 Kobe earthquakes shows the growing concern over economic loss in urban areas, a trend that should be expected to continue. Many economic and social loss values not reported in existing databases have been collected. Historical GDP (Gross Domestic Product), exchange rate, wage information, population, HDI (Human Development Index), and insurance information have been collected globally to form comparisons. This catalogue is the largest known cross-checked global historic damaging earthquake database and should have far-reaching consequences for earthquake loss estimation, socio-economic analysis, and the global reinsurance field.

  20. Bayesian historical earthquake relocation: an example from the 1909 Taipei earthquake

    USGS Publications Warehouse

    Minson, Sarah E.; Lee, William H.K.

    2014-01-01

    Locating earthquakes from the beginning of the modern instrumental period is complicated by the fact that there are few good-quality seismograms and what traveltimes do exist may be corrupted by both large phase-pick errors and clock errors. Here, we outline a Bayesian approach to simultaneous inference of not only the hypocentre location but also the clock errors at each station and the origin time of the earthquake. This methodology improves the solution for the source location and also provides an uncertainty analysis on all of the parameters included in the inversion. As an example, we applied this Bayesian approach to the well-studied 1909 Mw 7 Taipei earthquake. While our epicentre location and origin time for the 1909 Taipei earthquake are consistent with earlier studies, our focal depth is significantly shallower suggesting a higher seismic hazard to the populous Taipei metropolitan area than previously supposed.
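    The joint inference of epicentre, origin time and per-station clock errors can be sketched with a small Metropolis sampler on synthetic, noise-free picks. This is a schematic of the Bayesian idea only, not the authors' implementation: the station geometry, uniform 6 km/s velocity, 0.5 s pick uncertainty and 1 s clock-error prior are all invented for illustration.

    ```python
    import math
    import random

    def log_post(theta, stations, picks, v=6.0, sigma=0.5):
        """Log-posterior for epicentre (x, y), origin time t0, and one clock
        error per station; flat priors on x, y, t0 and N(0, 1 s) on clocks."""
        x, y, t0, *clk = theta
        lp = sum(-0.5 * c * c for c in clk)                    # clock-error prior
        for (sx, sy), t_obs, c in zip(stations, picks, clk):
            t_pred = t0 + math.hypot(x - sx, y - sy) / v + c   # straight-ray traveltime
            lp += -0.5 * ((t_obs - t_pred) / sigma) ** 2       # pick likelihood
        return lp

    random.seed(0)
    stations = [(0.0, 50.0), (80.0, 0.0), (60.0, 70.0), (-40.0, -30.0)]
    x_true, y_true, t0_true = 20.0, 10.0, 5.0
    picks = [t0_true + math.hypot(x_true - sx, y_true - sy) / 6.0
             for sx, sy in stations]

    theta = [0.0, 0.0, 0.0] + [0.0] * len(stations)            # start far from truth
    lp = log_post(theta, stations, picks)
    xs, ys = [], []
    for i in range(30000):                                     # Metropolis sampling
        prop = [p + random.gauss(0.0, 1.0) for p in theta]
        lp_prop = log_post(prop, stations, picks)
        if math.log(random.random()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        if i >= 15000:                                         # keep post-burn-in samples
            xs.append(theta[0])
            ys.append(theta[1])

    x_hat, y_hat = sum(xs) / len(xs), sum(ys) / len(ys)
    print(round(x_hat), round(y_hat))   # near the true epicentre (20, 10)
    ```

    The posterior samples give exactly the uncertainty analysis the abstract highlights: the spread of the sampled depths or epicentres quantifies how well the sparse, clock-corrupted picks actually constrain the source.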