Important Earthquake Engineering Resources
10 CFR Appendix S to Part 50 - Earthquake Engineering Criteria for Nuclear Power Plants
Code of Federal Regulations, 2012 CFR
2012-01-01
... 10 Energy 1 2012-01-01 2012-01-01 false Earthquake Engineering Criteria for Nuclear Power Plants S... FACILITIES Pt. 50, App. S Appendix S to Part 50—Earthquake Engineering Criteria for Nuclear Power Plants... applicant or holder whose construction permit was issued before January 10, 1997, the earthquake engineering...
10 CFR Appendix S to Part 50 - Earthquake Engineering Criteria for Nuclear Power Plants
Code of Federal Regulations, 2013 CFR
2013-01-01
... 10 Energy 1 2013-01-01 2013-01-01 false Earthquake Engineering Criteria for Nuclear Power Plants S... FACILITIES Pt. 50, App. S Appendix S to Part 50—Earthquake Engineering Criteria for Nuclear Power Plants... applicant or holder whose construction permit was issued before January 10, 1997, the earthquake engineering...
10 CFR Appendix S to Part 50 - Earthquake Engineering Criteria for Nuclear Power Plants
Code of Federal Regulations, 2011 CFR
2011-01-01
... 10 Energy 1 2011-01-01 2011-01-01 false Earthquake Engineering Criteria for Nuclear Power Plants S... FACILITIES Pt. 50, App. S Appendix S to Part 50—Earthquake Engineering Criteria for Nuclear Power Plants... applicant or holder whose construction permit was issued before January 10, 1997, the earthquake engineering...
10 CFR Appendix S to Part 50 - Earthquake Engineering Criteria for Nuclear Power Plants
Code of Federal Regulations, 2014 CFR
2014-01-01
... 10 Energy 1 2014-01-01 2014-01-01 false Earthquake Engineering Criteria for Nuclear Power Plants S... FACILITIES Pt. 50, App. S Appendix S to Part 50—Earthquake Engineering Criteria for Nuclear Power Plants... applicant or holder whose construction permit was issued before January 10, 1997, the earthquake engineering...
MCEER, from Earthquake Engineering to Extreme Events | Home Page
The Electronic Encyclopedia of Earthquakes
NASA Astrophysics Data System (ADS)
Benthien, M.; Marquis, J.; Jordan, T.
2003-12-01
The Electronic Encyclopedia of Earthquakes is a collaborative project of the Southern California Earthquake Center (SCEC), the Consortium of Universities for Research in Earthquake Engineering (CUREE) and the Incorporated Research Institutions for Seismology (IRIS). This digital library organizes earthquake information online as a partner with the NSF-funded National Science, Technology, Engineering and Mathematics (STEM) Digital Library (NSDL) and the Digital Library for Earth System Education (DLESE). When complete, information and resources for over 500 Earth science and engineering topics will be included, with connections to curricular materials useful for teaching Earth Science, engineering, physics and mathematics. Although conceived primarily as an educational resource, the Encyclopedia is also a valuable portal to anyone seeking up-to-date earthquake information and authoritative technical sources. "E3" is a unique collaboration among earthquake scientists and engineers to articulate and document a common knowledge base with a shared terminology and conceptual framework. It is a platform for cross-training scientists and engineers in these complementary fields and will provide a basis for sustained communication and resource-building between major education and outreach activities. For example, the E3 collaborating organizations have leadership roles in the two largest earthquake engineering and earth science projects ever sponsored by NSF: the George E. Brown Network for Earthquake Engineering Simulation (CUREE) and the EarthScope Project (IRIS and SCEC). The E3 vocabulary and definitions are also being connected to a formal ontology under development by the SCEC/ITR project for knowledge management within the SCEC Collaboratory. The E3 development system is now fully operational, 165 entries are in the pipeline, and the development teams are capable of producing 20 new, fully reviewed encyclopedia entries each month.
Over the next two years teams will complete 450 entries, which will populate the E3 collection to a level that fully spans earthquake science and engineering. Scientists, engineers, and educators who have suggestions for content to be included in the Encyclopedia can visit www.earthquake.info now to complete the "Suggest a Web Page" form.
NASA Astrophysics Data System (ADS)
Baytiyeh, Hoda; Naja, Mohamad K.
2014-09-01
Due to the high market demand for professional engineers in the Arab oil-producing countries, the appetite of Middle Eastern students for high-paying jobs and challenging careers in engineering has sharply increased. As a result, engineering programmes are providing opportunities for more students to enrol on engineering courses through lenient admission policies that do not compromise academic standards. This strategy has generated an influx of students who must be carefully educated to enhance their professional knowledge and social capital to assist in future earthquake-disaster risk-reduction efforts. However, the majority of Middle Eastern engineering students are unaware of the value of their acquired engineering skills and knowledge in building the resilience of their communities to earthquake disasters. As the majority of the countries in the Middle East are exposed to seismic hazards and are vulnerable to destructive earthquakes, engineers have become indispensable assets and the first line of defence against earthquake threats. This article highlights the contributions of some engineering innovations in advancing technologies and techniques for effective disaster mitigation, and it calls for the incorporation of earthquake-disaster-mitigation education into academic engineering programmes in the Eastern Mediterranean region.
75 FR 65385 - Agency Information Collection Activities: Proposed Collection; Comment Request
Federal Register 2010, 2011, 2012, 2013, 2014
2010-10-22
... Earthquake Engineering Simulation (NEES). SUMMARY: In compliance with the requirement of section 3506(c)(2)(A... of the Network for Earthquake Engineering Simulation. Type of Information Collection Request: New... inform decision making regarding the future of NSF support for earthquake engineering research...
Real-time earthquake monitoring using a search engine method.
Zhang, Jie; Zhang, Haijiang; Chen, Enhong; Zheng, Yi; Kuang, Wenhuan; Zhang, Xiong
2014-12-04
When an earthquake occurs, seismologists want to use recorded seismograms to infer its location, magnitude and source-focal mechanism as quickly as possible. If such information could be determined immediately, timely evacuations and emergency actions could be undertaken to mitigate earthquake damage. Current advanced methods can report the initial location and magnitude of an earthquake within a few seconds, but estimating the source-focal mechanism may require minutes to hours. Here we present an earthquake search engine, similar to a web search engine, that we developed by applying a computer fast search method to a large seismogram database to find waveforms that best fit the input data. Our method is several thousand times faster than an exact search. For an Mw 5.9 earthquake on 8 March 2012 in Xinjiang, China, the search engine can infer the earthquake's parameters in <1 s after receiving the long-period surface wave data.
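The core idea, matching an incoming waveform against a precomputed seismogram database, can be illustrated with a brute-force sketch. The authors' engine uses a far faster indexed search; the function names and the zero-lag correlation score below are our own illustrative choices, not the paper's implementation:

```python
import numpy as np

def correlate_score(a, b):
    """Normalized zero-lag correlation between two equal-length traces."""
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    return float(np.dot(a, b) / len(a))

def search_database(database, query):
    """Return (index, score) of the database waveform that best fits the query.

    A real search engine would index the database for sub-linear lookup;
    this exhaustive scan just shows what is being optimized.
    """
    scores = [correlate_score(w, query) for w in database]
    best = int(np.argmax(scores))
    return best, scores[best]
```

Each database waveform would carry its precomputed source parameters (location, magnitude, focal mechanism), so the best match immediately yields an estimate for the new event.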
The quest for better quality-of-life - learning from large-scale shaking table tests
NASA Astrophysics Data System (ADS)
Nakashima, M.; Sato, E.; Nagae, T.; Kunio, F.; Takahito, I.
2010-12-01
Earthquake engineering has its origins in the practice of “learning from actual earthquakes and earthquake damages.” That is, we recognize serious problems by witnessing the actual damage to our structures, and then we develop and apply engineering solutions to solve these problems. This tradition in earthquake engineering, i.e., “learning from actual damage,” was an obvious engineering response to earthquakes and arose naturally as a practice in a civil and building engineering discipline that traditionally places more emphasis on experience than do other engineering disciplines. But with the rapid progress of urbanization, as society becomes denser, and as the many components that form our society interact with increasing complexity, the potential damage with which earthquakes threaten society also increases. In such an era, the approach of “learning from actual earthquake damages” becomes unacceptably dangerous and expensive. Among the practical alternatives to the old practice is to “learn from quasi-actual earthquake damages.” One tool for experiencing earthquake damages without attendant catastrophe is the large shaking table. E-Defense, the largest one we have, was developed in Japan after the 1995 Hyogoken-Nanbu (Kobe) earthquake. Since its inauguration in 2005, E-Defense has conducted over forty full-scale or large-scale shaking table tests, applied to a variety of structural systems. The tests supply detailed data on the actual behavior and collapse of the tested structures, offering the earthquake engineering community opportunities to experience and assess the actual seismic performance of the structures, and to help society prepare for earthquakes. Notably, the data were obtained without having to wait for the aftermaths of actual earthquakes. Earthquake engineering has always been about life safety, but in recent years maintaining the quality of life has also become a critical issue.
Quality-of-life concerns include nonstructural damage, business continuity, public health, quickness of damage assessment, infrastructure, data and communication networks, and other issues, and not enough useful empirical data have emerged about these issues from the experiences of actual earthquakes. To provide quantitative data that can be used to reduce earthquake risk to our quality of life, E-Defense has recently been implementing two comprehensive research projects in which a base-isolated hospital and a steel high-rise building were tested on the E-Defense shaking table and their seismic performance was examined, particularly in terms of nonstructural damage, damage to building contents and furniture, and operability, functionality, and business-continuity capability. The paper presents an overview of the two projects, together with major findings obtained from them.
2008-01-01
... earthquake prediction in the New Madrid fault area near the Olmsted project ... the New Madrid earthquake, the greatest earthquake in American history, more powerful than the 1906 San Francisco earthquake. The 1811 earthquake had caused ...
ERIC Educational Resources Information Center
English, Lyn D.; King, Donna; Smeed, Joanna
2017-01-01
As part of a 3-year longitudinal study, 136 sixth-grade students completed an engineering-based problem on earthquakes involving integrated STEM learning. Students employed engineering design processes and STEM disciplinary knowledge to plan, sketch, then construct a building designed to withstand earthquake damage, taking into account a number of…
Introduction: seismology and earthquake engineering in Mexico and Central and South America.
Espinosa, A.F.
1982-01-01
The results from seismological studies that are used by the engineering community are just one of the benefits obtained from research aimed at mitigating the earthquake hazard. In this issue of the Earthquake Information Bulletin, current programs in seismology and earthquake engineering, seismic networks, future plans and some of the cooperative programs with different international organizations are described by Latin-American seismologists. The article describes the development of seismology in Latin America and the seismological interest of the OAS. -P.N. Chroston
DOE Office of Scientific and Technical Information (OSTI.GOV)
Saragoni, G. Rodolfo
The recent commemoration of the centennial of the San Francisco and Valparaiso 1906 earthquakes has given the opportunity to reanalyze their damage from a modern earthquake engineering perspective. These two earthquakes, plus Messina-Reggio Calabria 1908, had a strong impact on the birth and development of earthquake engineering. The study of the seismic performance of buildings still standing today that survived centennial earthquakes represents a challenge to better understand the limitations of the earthquake design methods now in use. Of the three centennial earthquakes considered, only the Valparaiso 1906 earthquake has been repeated, as the Central Chile, 1985, Ms = 7.8 earthquake. In this paper a comparative study of the damage produced by the 1906 and 1985 Valparaiso earthquakes is carried out in the neighborhood of Valparaiso harbor. In this study, the only three centennial buildings of 3 stories that survived both earthquakes almost undamaged were identified. Since accelerograms for the 1985 earthquake were recorded at El Almendral soil conditions as well as on rock, the vulnerability analysis of these buildings is done considering instrumental measurements of the demand. The study concludes that the good performance of these buildings in the epicentral zone of large earthquakes cannot be well explained by modern earthquake engineering methods. It is therefore recommended to use more suitable instrumental parameters in the future, such as the destructiveness potential factor, to describe earthquake demand.
Post-Earthquake Assessment of Nevada Bridges Using ShakeMap/ShakeCast
DOT National Transportation Integrated Search
2016-01-01
Post-earthquake capacity of Nevada highway bridges is examined through a combination of engineering study and scenario earthquake evaluation. The study was undertaken by the University of Nevada Reno Department of Civil and Environmental Engineering ...
De La Flor, Grace; Ojaghi, Mobin; Martínez, Ignacio Lamata; Jirotka, Marina; Williams, Martin S; Blakeborough, Anthony
2010-09-13
When transitioning local laboratory practices into distributed environments, the interdependent relationship between experimental procedure and the technologies used to execute experiments becomes highly visible and a focal point for system requirements. We present an analysis of ways in which this reciprocal relationship is reconfiguring laboratory practices in earthquake engineering as a new computing infrastructure is embedded within three laboratories in order to facilitate the execution of shared experiments across geographically distributed sites. The system has been developed as part of the UK Network for Earthquake Engineering Simulation e-Research project, which links together three earthquake engineering laboratories at the universities of Bristol, Cambridge and Oxford. We consider the ways in which researchers have successfully adapted their local laboratory practices through the modification of experimental procedure so that they may meet the challenges of coordinating distributed earthquake experiments.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-03
...: Survey of Principal Investigators on Earthquake Engineering Research Awards Made by the National Science... survey of Principal Investigators on NSF earthquake engineering research awards, including but not... NATIONAL SCIENCE FOUNDATION Submission for OMB Review; Comment Request Survey of Principal...
Seismic design and engineering research at the U.S. Geological Survey
1988-01-01
The Engineering Seismology Element of the USGS Earthquake Hazards Reduction Program is responsible for coordinating and operating the National Strong Motion Network to collect, process, and disseminate earthquake strong-motion data, and for developing improved methodologies to estimate and predict earthquake ground motion. Instrumental observations of strong ground shaking induced by damaging earthquakes and the corresponding response of man-made structures provide the basis for estimating the severity of shaking from future earthquakes, for earthquake-resistant design, and for understanding the physics of seismologic failure in the Earth's crust.
77 FR 64314 - Advisory Committee on Earthquake Hazards Reduction Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-19
... is to discuss engineering needs for existing buildings, to review the National Earthquake Hazards... Committee business. The final agenda will be posted on the NEHRP Web site at http://nehrp.gov/ . DATES: The... assesses: Trends and developments in the science and engineering of earthquake hazards reduction; The...
The Lice, Turkey, earthquake of September 6, 1975; a preliminary engineering investigation
Yanev, P. I.
1976-01-01
The Fifth European Conference on Earthquake Engineering was held on September 22 through 25 in Istanbul, Turkey. The opening speech by the Honorable H. E. Nurettin Ok, Minister of Reconstruction and Resettlement of Turkey, introduced the several hundred delegates to the realities of earthquake hazards in Turkey:
U.S. Geological Survey National Strong-Motion Project strategic plan, 2017–22
Aagaard, Brad T.; Celebi, Mehmet; Gee, Lind; Graves, Robert; Jaiswal, Kishor; Kalkan, Erol; Knudsen, Keith L.; Luco, Nicolas; Smith, James; Steidl, Jamison; Stephens, Christopher D.
2017-12-11
The mission of the National Strong-Motion Project is to provide measurements of how the ground and built environment behave during earthquake shaking to the earthquake engineering community, the scientific community, emergency managers, public agencies, industry, media, and other users for the following purposes: improving engineering evaluations and design methods for facilities and systems; providing timely information for earthquake early warning, damage assessment, and emergency response action; and contributing to a greater understanding of the mechanics of earthquake rupture, ground-motion characteristics, and earthquake effects.
Publications - RI 2015-5 | Alaska Division of Geological & Geophysical
Living on an Active Earth: Perspectives on Earthquake Science
NASA Astrophysics Data System (ADS)
Lay, Thorne
2004-02-01
The annualized long-term loss due to earthquakes in the United States is now estimated at $4.4 billion per year. A repeat of the 1923 Kanto earthquake, near Tokyo, could cause direct losses of $2-3 trillion. With such grim numbers, which are guaranteed to make you take its work seriously, the NRC Committee on the Science of Earthquakes begins its overview of the emerging multidisciplinary field of earthquake science. An up-to-date and forward-looking survey of scientific investigation of earthquake phenomena and engineering response to associated hazards is presented at a suitable level for a general educated audience. Perspectives from the fields of seismology, geodesy, neo-tectonics, paleo-seismology, rock mechanics, earthquake engineering, and computer modeling of complex dynamic systems are integrated into a balanced definition of earthquake science that has never before been adequately articulated.
Engineering models for catastrophe risk and their application to insurance
NASA Astrophysics Data System (ADS)
Dong, Weimin
2002-06-01
Internationally, earthquake insurance, like other lines of insurance (fire, auto), has historically adopted an actuarial approach: premium rates are determined from historical loss experience. Because earthquakes are rare events with severe consequences, irrational premium rates and a poor understanding of the scale of potential losses left many insurance companies insolvent after the Northridge earthquake in 1994. Along with recent advances in earth science, computer science and engineering, computerized loss estimation methodologies based on first principles have been developed to the point that losses from destructive earthquakes can be quantified with reasonable accuracy using scientific modeling techniques. This paper introduces how engineering models can help quantify earthquake risk and how the insurance industry can use this information to manage its risk in the United States and abroad.
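The engineering (loss-estimation) approach the abstract describes can be caricatured in a few lines: convolve a site hazard curve with a vulnerability function to obtain an expected annual loss, from which a pure premium rate follows. Every number below is invented for illustration; real models work with full event sets and uncertainty distributions:

```python
import numpy as np

# Hypothetical hazard curve: annual frequency of exceeding each PGA level (g)
pga = np.array([0.1, 0.2, 0.3, 0.4, 0.5])
annual_exceedance = np.array([1e-1, 3e-2, 1e-2, 3e-3, 1e-3])

# Hypothetical vulnerability: mean damage ratio (loss / replacement value) at each PGA
mean_damage_ratio = np.array([0.01, 0.05, 0.15, 0.30, 0.50])

# Convert exceedance frequencies to occurrence frequencies per PGA bin
occurrence = -np.diff(np.append(annual_exceedance, 0.0))

replacement_value = 10e6  # USD, assumed
expected_annual_loss = float(np.sum(occurrence * mean_damage_ratio * replacement_value))
pure_premium_rate = expected_annual_loss / replacement_value  # loss cost per dollar insured
```

With these toy inputs the expected annual loss is $38,500, i.e. a pure premium rate of about 0.39% of insured value, before expenses, risk load, and reinsurance.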
NASA Astrophysics Data System (ADS)
Li, Zongchao; Chen, Xueliang; Gao, Mengtan; Jiang, Han; Li, Tiefei
2017-03-01
Earthquake engineering parameters are very important in the engineering field, especially in anti-seismic design and earthquake disaster prevention. In this study, we focus on simulating earthquake engineering parameters by the empirical Green's function method. The simulated earthquake (MJMA 6.5) occurred in Kyushu, Japan, in 1997. Horizontal ground motion is separated into fault-parallel and fault-normal components in order to assess the characteristics of these two direction components. The simulation covers a broadband frequency range of 0.1 to 20 Hz. By comparing observed and synthetic parameters, we analyzed the distribution characteristics of the earthquake engineering parameters. The simulated waveforms show high similarity with the observed waveforms. We found the following. (1) Near-field PGA attenuates radially in all directions, with strip-like radiation patterns in the fault-parallel component, while the radiation pattern of the fault-normal component is circular; PGV shows good agreement between observed and synthetic records but has different distribution characteristics in the two components. (2) Rupture direction and terrain have a large influence on the 90% significant duration. (3) Arias Intensity attenuates with increasing epicentral distance, and observed values agree closely with synthetic values. (4) The predominant period differs markedly across parts of Kyushu in the fault-normal component; it is affected greatly by site conditions. (5) Most parameters have good reference value where the hypocentral distance is less than 35 km. (6) The GOF values of all these parameters are generally higher than 45, which indicates a good result according to Olsen's classification criterion, although not all parameters fit well. Given these synthetic ground motion parameters, seismic hazard analysis can be performed and earthquake disaster analysis can be conducted in future urban planning.
2000 report on the value pricing pilot program
DOT National Transportation Integrated Search
1997-05-01
This document has been written to provide information on how to apply principles of geotechnical earthquake engineering to planning, design, and retrofit of highway facilities. Geotechnical earthquake engineering topics discussed in this document inc...
DOT National Transportation Integrated Search
1998-12-01
This manual was written to provide training on how to apply principles of geotechnical earthquake engineering to planning, design, and retrofit of highway facilities. Reproduced here are two chapters 4 and 8 in the settlement, respectively. These cha...
Research on response spectrum of dam based on scenario earthquake
NASA Astrophysics Data System (ADS)
Zhang, Xiaoliang; Zhang, Yushan
2017-10-01
Taking a large hydropower station as an example, the response spectrum based on a scenario earthquake is determined. Firstly, the potential source zone of greatest contribution to the site hazard is determined on the basis of the results of probabilistic seismic hazard analysis (PSHA). Secondly, the magnitude and epicentral distance of the scenario earthquake are calculated according to the main faults and historical earthquakes of the potential seismic source zone. Finally, the response spectrum of the scenario earthquake is calculated using the Next Generation Attenuation (NGA) relations. The response spectrum based on the scenario-earthquake method is lower than the probability-consistent response spectrum obtained by the PSHA method. The empirical analysis shows that the scenario-earthquake response spectrum accounts for both the probability level and structural factors, combining the advantages of the deterministic and probabilistic seismic hazard analysis methods. It is easy to accept and provides a basis for the seismic engineering of hydraulic structures.
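A response spectrum, whether for a scenario earthquake or a recorded accelerogram, is simply the peak response of a family of damped single-degree-of-freedom oscillators. The sketch below uses the average-acceleration Newmark method; it is our own illustrative code under standard textbook assumptions, not the study's toolchain:

```python
import numpy as np

def sdof_peak_response(ag, dt, period, damping=0.05):
    """Peak relative displacement of a damped SDOF oscillator (unit mass)
    under base acceleration ag, via the average-acceleration Newmark method:
    u'' + 2*zeta*w*u' + w^2*u = -ag(t)."""
    w = 2.0 * np.pi / period
    m, c, k = 1.0, 2.0 * damping * w, w * w
    beta, gamma = 0.25, 0.5
    u = v = 0.0
    a = -ag[0]
    keff = k + gamma * c / (beta * dt) + m / (beta * dt * dt)
    umax = 0.0
    for agi in ag[1:]:
        # Effective load from the new forcing plus terms carrying the old state
        p = (-agi
             + m * (u / (beta * dt * dt) + v / (beta * dt) + (1 / (2 * beta) - 1) * a)
             + c * (gamma * u / (beta * dt) + (gamma / beta - 1) * v
                    + dt * (gamma / (2 * beta) - 1) * a))
        unew = p / keff
        vnew = (gamma / (beta * dt)) * (unew - u) + (1 - gamma / beta) * v \
               + dt * (1 - gamma / (2 * beta)) * a
        anew = (unew - u) / (beta * dt * dt) - v / (beta * dt) - (1 / (2 * beta) - 1) * a
        u, v, a = unew, vnew, anew
        umax = max(umax, abs(u))
    return umax

def response_spectrum(ag, dt, periods, damping=0.05):
    """Pseudo-acceleration spectrum Sa(T) = (2*pi/T)^2 * max|u(t)|."""
    return [(2 * np.pi / T) ** 2 * sdof_peak_response(ag, dt, T, damping)
            for T in periods]
```

As a sanity check, for a very stiff (short-period) oscillator the pseudo-acceleration approaches the peak ground acceleration.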
GeoMO 2008--geotechnical earthquake engineering : site response.
DOT National Transportation Integrated Search
2008-10-01
The theme of GeoMO2008 has recently become of more interest to the Midwest civil engineering community due to the perceived earthquake risks and new code requirements. The constant seismic reminder for the New Madrid Seismic Zone and new USGS hazard ...
Towards Coupling of Macroseismic Intensity with Structural Damage Indicators
NASA Astrophysics Data System (ADS)
Kouteva, Mihaela; Boshnakov, Krasimir
2016-04-01
Knowledge of ground motion acceleration time histories during earthquakes is essential to understanding the earthquake-resistant behaviour of structures. Peak and integral ground motion parameters, such as peak ground motion values (acceleration, velocity and displacement), measures of the frequency content of ground motion, duration of strong shaking, and various intensity measures, play important roles in the seismic evaluation of existing facilities and the design of new systems. Macroseismic intensity is an earthquake measure related to the description of seismic hazard and seismic risk. A detailed understanding of the correlation between earthquake damage potential and macroseismic intensity is an important issue in engineering seismology and earthquake engineering. Reliable earthquake hazard estimation is the major prerequisite for successful disaster risk management. The use of advanced earthquake engineering approaches to structural response modelling is essential for reliable evaluation of the damage accumulated in existing buildings and structures due to the history of seismic actions that occurred during their lifetime. Full nonlinear analysis, taking into account a single event or a series of earthquakes, together with the large set of elaborated damage indices, is a suitable contemporary tool for this task. This paper presents some results on the correlation between observed damage states, ground motion parameters and selected analytical damage indices. The damage indices are computed on the basis of nonlinear time-history analysis of a test reinforced structure characteristic of the building stock of the Mediterranean region, designed according to the earthquake-resistant requirements of the mid-twentieth century.
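Among the analytical damage indices in the literature, the Park-Ang index is a common choice: it combines the peak displacement demand with the dissipated hysteretic energy. The abstract does not say which indices the paper uses; the function and all parameter values below are a hypothetical illustration:

```python
def park_ang_index(u_max, u_ult, e_hyst, f_yield, beta=0.15):
    """Park-Ang damage index: DI = u_max/u_ult + beta * E_h / (F_y * u_ult).

    u_max   peak displacement demand (m)
    u_ult   ultimate displacement capacity under monotonic loading (m)
    e_hyst  dissipated hysteretic energy (J)
    f_yield yield strength (N)
    beta    calibration parameter (often ~0.05-0.15 for RC members)
    """
    return u_max / u_ult + beta * e_hyst / (f_yield * u_ult)

def damage_state(di):
    """Map an index value to a coarse damage state (one common banding)."""
    if di < 0.1:
        return "none/slight"
    if di < 0.4:
        return "moderate, repairable"
    if di < 1.0:
        return "severe, beyond repair"
    return "collapse"
```

The energy term is what lets such indices account for damage accumulated over a series of earthquakes, not just the single largest excursion.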
Organizational changes at Earthquakes & Volcanoes
Gordon, David W.
1992-01-01
Primary responsibility for the preparation of Earthquakes & Volcanoes within the Geological Survey has shifted from the Office of Scientific Publications to the Office of Earthquakes, Volcanoes, and Engineering (OEVE). As a consequence of this reorganization, Henry Spall has stepped down as Science Editor for Earthquakes & Volcanoes (E&V).
Optimal-adaptive filters for modelling spectral shape, site amplification, and source scaling
Safak, Erdal
1989-01-01
This paper introduces some applications of optimal filtering techniques to earthquake engineering by using the so-called ARMAX models. Three applications are presented: (a) spectral modelling of ground accelerations, (b) site amplification (i.e., the relationship between two records obtained at different sites during an earthquake), and (c) source scaling (i.e., the relationship between two records obtained at a site during two different earthquakes). A numerical example for each application is presented by using recorded ground motions. The results show that the optimal filtering techniques provide elegant solutions to the above problems and can be a useful tool in earthquake engineering.
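To give a flavour of application (a), an all-pole (AR) model, the autoregressive special case of the ARMAX family, can be fitted to an acceleration record via the Yule-Walker equations and then evaluated as a smooth parametric spectrum. This is illustrative code under standard time-series assumptions, not the paper's implementation:

```python
import numpy as np

def yule_walker(x, order):
    """Fit AR(order) coefficients to a series via the Yule-Walker equations.

    Returns (phi, sigma2): the AR coefficients and the innovation variance.
    """
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    n = len(x)
    # Biased sample autocovariances r[0..order]
    r = np.array([np.dot(x[:n - k], x[k:]) / n for k in range(order + 1)])
    R = np.array([[r[abs(i - j)] for j in range(order)] for i in range(order)])
    phi = np.linalg.solve(R, r[1:order + 1])
    sigma2 = r[0] - np.dot(phi, r[1:order + 1])
    return phi, sigma2

def ar_spectrum(phi, sigma2, freqs, dt):
    """Parametric power spectral density of the fitted AR process:
    S(f) = sigma2 * dt / |1 - sum_k phi_k exp(-2i*pi*f*k*dt)|^2."""
    z = np.exp(-2j * np.pi * np.outer(freqs, np.arange(1, len(phi) + 1)) * dt)
    denom = np.abs(1.0 - z @ phi) ** 2
    return sigma2 * dt / denom
```

A handful of AR coefficients then stands in for the whole spectral shape, which is what makes such models attractive for simulation and system identification.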
Earthquake alarm; operating the seismograph station at the University of California, Berkeley.
Stump, B.
1980-01-01
At the University of California seismographic stations, the task of locating and determining magnitudes for both local and distant earthquakes is a continuous one. Teleseisms must be located rapidly so that events that occur in the Pacific can be identified and the Pacific Tsunami Warning System alerted. For great earthquakes anywhere, there is a responsibility to notify public agencies such as the California Office of Emergency Services, the Federal Disaster Assistance Administration, the Earthquake Engineering Research Institute, the California Seismic Safety Commission, and the American Red Cross. In the case of damaging local earthquakes, it is also necessary to alert the California Department of Water Resources, the California Division of Mines and Geology, the U.S. Army Corps of Engineers, the Federal Bureau of Reclamation, and the Bay Area Rapid Transit. These days, any earthquakes that are felt in northern California cause immediate inquiries from the news media and an interested public. The series of earthquakes that jolted the Livermore area from January 24 to 26, 1980, is a good case in point.
EU H2020 SERA: Seismology and Earthquake Engineering Research Infrastructure Alliance for Europe
NASA Astrophysics Data System (ADS)
Giardini, Domenico; Saleh, Kauzar; SERA Consortium, the
2017-04-01
SERA - Seismology and Earthquake Engineering Research Infrastructure Alliance for Europe - is a new infrastructure project awarded in the last Horizon 2020 call for Integrating Activities for Advanced Communities (INFRAIA-01-2016-2017). Building on precursor projects such as NERA, SHARE, NERIES and SERIES, SERA is expected to contribute significantly to access to data, services and research infrastructures, and to develop innovative solutions in seismology and earthquake engineering, with the overall objective of reducing exposure to the risks associated with natural and anthropogenic earthquakes. For instance, SERA will revise the European seismic hazard reference model for input into the current revision of Eurocode 8 on Seismic Design of Buildings; we also plan to develop the first comprehensive framework for seismic risk modeling at the European scale, and to develop new standards for future experimental observations and instruments for earthquake engineering and seismology. To that aim, SERA engages 31 institutions across Europe with leading expertise in the operation of research facilities, monitoring infrastructures, data repositories and experimental facilities in the fields of seismology, anthropogenic hazards and earthquake engineering. SERA comprises 26 activities, including 5 Networking Activities (NA) to improve the availability and access of data through enhanced community coordination and pooling of resources, 6 Joint Research Activities (JRA) aimed at creating new European standards for the optimal use of the data collected by the European infrastructures, Virtual Access (VA) to the 5 main European services for seismology and engineering seismology, and Trans-national Access (TA) to 10 high-class experimental facilities for earthquake engineering and seismology in Europe. In fact, around 50% of the SERA resources will be dedicated to virtual and transnational access.
SERA and EPOS (the European Plate Observing System, a European Research Infrastructure Consortium for solid Earth services in Europe) will be developed in parallel, giving SERA the capacity to develop building blocks for EPOS in the areas of seismology, anthropogenic hazards, and earthquake engineering, such as new virtual access, new anthropogenic-hazard products, and expanded access to waveform data. In addition, services developed and validated in SERA will be produced in a way that is compatible with integration in EPOS. This communication aims to inform the scientific community about the objectives and work plan of SERA, which starts in spring 2017 for a duration of 3 years.
Sizing up earthquake damage: Differing points of view
Hough, S.; Bolen, A.
2007-01-01
When a catastrophic event strikes an urban area, many different professionals hit the ground running. Emergency responders respond, reporters report, and scientists and engineers collect and analyze data. Journalists and scientists may share interest in these events, but they have very different missions. To a journalist, earthquake damage is news. To a scientist or engineer, earthquake damage represents a valuable source of data that can help us understand how strongly the ground shook as well as how particular structures responded to the shaking.
NGA West 2 | Pacific Earthquake Engineering Research Center
A multi-year research program to improve Next Generation Attenuation (NGA) models for active tectonic regions in earthquake engineering, including modeling of directivity and directionality, verification of epistemic uncertainty in the NGA-West models, and evaluation of soil amplification factors in NGA models versus NEHRP site factors.
Holzer, Thomas L.
1998-01-01
This chapter contains two papers that summarize the performance of engineered earth structures, dams and stabilized excavations in soil, and two papers that characterize for engineering purposes the attenuation of ground motion with distance during the Loma Prieta earthquake. Documenting the field performance of engineered structures and confirming empirically based predictions of ground motion are critical for safe and cost effective seismic design of future structures as well as the retrofitting of existing ones.
Engineering uses of physics-based ground motion simulations
Baker, Jack W.; Luco, Nicolas; Abrahamson, Norman A.; Graves, Robert W.; Maechling, Phillip J.; Olsen, Kim B.
2014-01-01
This paper summarizes validation methodologies focused on enabling ground motion simulations to be used with confidence in engineering applications such as seismic hazard analysis and dynamic analysis of structural and geotechnical systems. Numerical simulation of ground motion from large earthquakes, utilizing physics-based models of earthquake rupture and wave propagation, is an area of active research in the earth science community. Refinement and validation of these models require collaboration between earthquake scientists and engineering users, and testing/rating methodologies for simulated ground motions to be used with confidence in engineering applications. This paper provides an introduction to this field and an overview of current research activities being coordinated by the Southern California Earthquake Center (SCEC). These activities are related both to advancing the science and computational infrastructure needed to produce ground motion simulations and to engineering validation procedures. Current research areas and anticipated future achievements are also discussed.
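A common family of validation checks compares simulated and recorded response spectra period by period. The sketch below is illustrative only, not SCEC's actual validation procedure; the array names and the choice of a mean-log-residual metric are assumptions for demonstration.

```python
import numpy as np

def spectral_residuals(recorded_sa, simulated_sa):
    """Log residuals between recorded and simulated spectral accelerations.

    A positive mean (bias) indicates the simulation underpredicts on average;
    the standard deviation measures period-to-period scatter. Illustrative
    metric only -- not a specific published validation methodology.
    """
    recorded_sa = np.asarray(recorded_sa, dtype=float)
    simulated_sa = np.asarray(simulated_sa, dtype=float)
    resid = np.log(recorded_sa / simulated_sa)  # one residual per period
    return resid.mean(), resid.std()
```

A perfect simulation gives zero bias and zero scatter; systematic under- or overprediction shows up directly in the sign of the mean.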
1980-01-01
standard procedure for analysis of all types of civil engineering structures. Early in its development, it became apparent that this method had...unique potentialities in the evaluation of stress in dams, and many of its earliest civil engineering applications concerned special problems associated...with such structures [3,4]. The earliest dynamic finite element analyses of civil engineering structures involved the earthquake response analysis of
NASA Technical Reports Server (NTRS)
Scholl, R. E. (Editor)
1979-01-01
Earthquake engineering research capabilities of the National Aeronautics and Space Administration (NASA) facilities at George C. Marshall Space Flight Center (MSFC), Alabama, were evaluated. The results indicate that the NASA/MSFC facilities and supporting capabilities offer unique opportunities for conducting earthquake engineering research. Specific features that are particularly attractive for large scale static and dynamic testing of natural and man-made structures include the following: large physical dimensions of buildings and test bays; high loading capacity; wide range and large number of test equipment and instrumentation devices; multichannel data acquisition and processing systems; technical expertise for conducting large-scale static and dynamic testing; sophisticated techniques for systems dynamics analysis, simulation, and control; and capability for managing large-size and technologically complex programs. Potential uses of the facilities for near and long term test programs to supplement current earthquake research activities are suggested.
Transportation Systems Modeling and Applications in Earthquake Engineering
2010-07-01
...Memphis, Tennessee. The NMSZ was responsible for the devastating 1811-1812 New Madrid earthquakes, the largest earthquakes ever recorded in the... Figure 6: PGA map of a M7.7 earthquake on all three New Madrid fault segments (g). Table 1: Fragility parameters for MSC steel bridge (Padgett 2007).
Introduction: seismology and earthquake engineering in Central and South America.
Espinosa, A.F.
1983-01-01
Reports the state-of-the-art in seismology and earthquake engineering that is being advanced in Central and South America. Provides basic information on seismological station locations in Latin America and some of the programmes in strong-motion seismology, as well as some of the organizations involved in these activities.-from Author
Initiatives to Reduce Earthquake Risk of Developing Countries
NASA Astrophysics Data System (ADS)
Tucker, B. E.
2008-12-01
The seventeen-year-and-counting history of the Palo Alto-based nonprofit organization GeoHazards International (GHI) is the story of many initiatives within a larger initiative to increase the societal impact of geophysics and civil engineering. GHI's mission is to reduce death and suffering due to earthquakes and other natural hazards in the world's most vulnerable communities through preparedness, mitigation and advocacy. GHI works by raising awareness in these communities about their risk and about affordable methods to manage it, identifying and strengthening institutions in these communities to manage their risk, and advocating improvement in natural disaster management. Some of GHI's successful initiatives include: (1) creating an earthquake scenario for Quito, Ecuador that describes in lay terms the consequences for that city of a probable earthquake; (2) improving the curricula of Pakistani university courses about seismic retrofitting; (3) training employees of the Public Works Department of Delhi, India on assessing the seismic vulnerability of critical facilities such as a school, a hospital, a police headquarters, and city hall; (4) assessing the vulnerability of the Library of Tibetan Works and Archives in Dharamsala, India; (5) developing a seismic hazard reduction plan for a nonprofit organization in Kathmandu, Nepal that works to manage Nepal's seismic risk; and (6) assisting in the formulation of a resolution by the Council of the Organization for Economic Cooperation and Development (OECD) to promote school earthquake safety among OECD member countries. GHI's most important resource, in addition to its staff and Board of Trustees, is its members and volunteer advisors, who include some of the world's leading earth scientists, earthquake engineers, urban planners and architects, from the academic, public, private and nonprofit sectors. GHI is planning several exciting initiatives in the near future. 
One would oversee the design and construction of an earthquake- and tsunami-resistant structure in Sumatra to house a tsunami museum, a community training center, and offices of a local NGO that is preparing Padang for the next tsunami. This facility would be designed and built by a team of US and Indonesian academics, architects, engineers and students. Another initiative would launch a collaborative research program on school earthquake safety with the scientists and engineers from the US and the ten Islamic countries that comprise the Economic Cooperation Organization. Finally, GHI hopes to develop internet and satellite communication techniques that will allow earthquake risk managers in the US to interact with masons, government officials, engineers and architects in remote communities of vulnerable developing countries, closing the science and engineering divide.
Engineering aspects of seismological studies in Peru
Ocola, L.
1982-01-01
In retrospect, the Peruvian national long-range earthquake-study program began after the catastrophic earthquake of May 31, 1970. This earthquake triggered a large snow avalanche from Huascaran mountain; together they killed over 60,000 people and buried small cities and tens of villages under mud in the Andean valley of Callejon de Huaylas, Huaraz. Since then, great efforts have been made to learn about the natural seismic environment and its engineering and social aspects. The Organization of American States (OAS) has been one of the most important agencies in the development of the program.
Earthquake: Game-based learning for 21st century STEM education
NASA Astrophysics Data System (ADS)
Perkins, Abigail Christine
To play is to learn. A lack of empirical research within game-based learning literature, however, has hindered educational stakeholders from making informed decisions about game-based learning for 21st century STEM education. In this study, I modified a research and development (R&D) process to create a collaborative-competitive educational board game illuminating elements of earthquake engineering. I oriented instruction- and game-design principles around 21st century science education to adapt the R&D process to develop the educational game, Earthquake. As part of the R&D, I evaluated Earthquake for empirical evidence to support the claim that game-play results in student gains in critical thinking, scientific argumentation, metacognitive abilities, and earthquake engineering content knowledge. I developed Earthquake with the aid of eight focus groups with varying levels of expertise in science education research, teaching, administration, and game-design. After developing a functional prototype, I pilot-tested Earthquake with teacher-participants (n=14) who engaged in semi-structured interviews after their game-play. I analyzed teacher interviews with constant comparison methodology. I used teachers' comments and feedback from content knowledge experts to integrate game modifications, implementing results to improve Earthquake. I added player roles, simplified phrasing on cards, and produced an introductory video. I then administered the modified Earthquake game to two groups of high school student-participants (n = 6), who played twice. To seek evidence supporting my knowledge claim, I analyzed videotapes of students' game-play using a game-based learning checklist. My assessment of learning gains revealed increases in all categories of students' performance: critical thinking, metacognition, scientific argumentation, and earthquake engineering content knowledge acquisition. 
Players in both student-groups improved mostly in critical thinking, having doubled the number of exhibited instances of critical thinking between games. Players in the first group exhibited about a third more instances of metacognition between games, while players in the second group doubled such instances. Between games, players in both groups more than doubled the number of exhibited instances of using earthquake engineering content knowledge. The student-players expanded use of scientific argumentation for all game-based learning checklist categories. With empirical evidence, I conclude play and learning can connect for successful 21st century STEM education.
National Earthquake Hazards Reduction Program; time to expand
Steinbrugge, K.V.
1990-01-01
All of us in earthquake engineering, seismology, and many related disciplines have been directly or indirectly affected by the National Earthquake Hazards Reduction Program (NEHRP). This program was the result of the Earthquake Hazards Reduction Act of 1977 (Public Law 95-124). With well over a decade of experience, should this expression of public policy now take a different or expanded role?
The Road to Total Earthquake Safety
NASA Astrophysics Data System (ADS)
Frohlich, Cliff
Cinna Lomnitz is possibly the most distinguished earthquake seismologist in all of Central and South America. Among many other credentials, Lomnitz has personally experienced the shaking and devastation that accompanied no fewer than five major earthquakes—Chile, 1939; Kern County, California, 1952; Chile, 1960; Caracas, Venezuela, 1967; and Mexico City, 1985. Thus he clearly has much to teach someone like myself, who has never even actually felt a real earthquake. What is this slim book? The Road to Total Earthquake Safety summarizes Lomnitz's May 1999 presentation at the Seventh Mallet-Milne Lecture, sponsored by the Society for Earthquake and Civil Engineering Dynamics. His arguments are motivated by the damage that occurred in three earthquakes—Mexico City, 1985; Loma Prieta, California, 1989; and Kobe, Japan, 1995. All three quakes occurred in regions where earthquakes are common. Yet in all three some of the worst damage occurred in structures located a significant distance from the epicenter and engineered specifically to resist earthquakes. Some of the damage also indicated that the structures failed because they had experienced considerable rotational or twisting motion. Clearly, Lomnitz argues, there must be fundamental flaws in the usually accepted models explaining how earthquakes generate strong motions, and how we should design resistant structures.
Procedures for Computing Site Seismicity
1994-02-01
Fourth World Conference on Earthquake Engineering, Santiago, Chile, 1969. Schnabel, P.B., J. Lysmer, and H.B. Seed (1972). SHAKE, a computer program for...This fault system is composed of the Elsinore and Whittier fault zones, Agua Caliente fault, and Earthquake Valley fault. Five recent earthquakes of
10 CFR 100.23 - Geologic and seismic siting criteria.
Code of Federal Regulations, 2013 CFR
2013-01-01
... Earthquake Ground Motion, and to permit adequate engineering solutions to actual or potential geologic and..., earthquake recurrence rates, fault geometry and slip rates, site foundation material, and seismically induced... Earthquake Ground Motion for the site, the potential for surface tectonic and nontectonic deformations, the...
Akkar, Sinan; Aldemir, A.; Askan, A.; Bakir, S.; Canbay, E.; Demirel, I.O.; Erberik, M.A.; Gulerce, Z.; Gulkan, Polat; Kalkan, Erol; Prakash, S.; Sandikkaya, M.A.; Sevilgen, V.; Ugurhan, B.; Yenier, E.
2011-01-01
An earthquake of Mw 6.1 occurred in the Elazığ region of eastern Turkey on 8 March 2010 at 02:32:34 UTC. The United States Geological Survey (USGS) reported the epicenter of the earthquake as 38.873°N-39.981°E with a focal depth of 12 km. Forty-two people lost their lives and 137 were injured during the event. The earthquake was reported to be on the left-lateral strike-slip East Anatolian fault (EAF), which is one of the two major active fault systems in Turkey. Teams from the Earthquake Engineering Research Center of the Middle East Technical University (EERC-METU) visited the earthquake area in the aftermath of the mainshock. Their reconnaissance observations were combined with interpretations of recorded ground motions for completeness. This article summarizes observations on building and ground damage in the area and provides a discussion of the recorded motions. No significant observations in terms of geotechnical engineering were made.
PEER - National Information Service for Earthquake Engineering - NISEE
The NISEE/PEER Library is an affiliated library of the University of California, located at the Field Station, five miles from the main Berkeley campus. Hours: Monday - Friday, 9:00 am - 1:00 pm. Open to the public. NISEE/PEER Library home page.
Turkish Compulsory Earthquake Insurance (TCIP)
NASA Astrophysics Data System (ADS)
Erdik, M.; Durukal, E.; Sesetyan, K.
2009-04-01
Through a World Bank project, the government-sponsored Turkish Catastrophe Insurance Pool (TCIP) was created in 2000 with the essential aim of transferring the government's financial burden of replacing earthquake-damaged housing to international reinsurance and capital markets. Providing coverage to about 2.9 million homeowners, TCIP is the largest insurance program in the country, with about 0.5 billion USD in its own reserves and about 2.3 billion USD in total claims-paying capacity. Total payments for earthquake damage since 2000 (mostly small; 226 earthquakes) amount to about 13 million USD. The country-wide penetration rate is about 22%, highest in the Marmara region (30%) and lowest in southeast Turkey (9%). TCIP is the sole-source provider of earthquake loss coverage up to 90,000 USD per house. The annual premium, categorized on the basis of earthquake zone and type of structure, is about US$90 for a 100-square-meter reinforced concrete building in the most hazardous zone, with a 2% deductible. The earthquake engineering shortcomings of the TCIP are exemplified by the fact that the average rate of 0.13% (for reinforced concrete buildings), with only a 2% deductible, is rather low compared to countries with similar earthquake exposure. From an earthquake engineering point of view, the risk underwriting of the TCIP (typification of housing units to be insured, earthquake intensity zonation, and the sum insured) needs to be overhauled. Especially for large cities, models can be developed in which a unit's expected earthquake performance (and consequently its insurance premium) is assessed on the basis of its location (microzoned earthquake hazard) and basic structural attributes (earthquake vulnerability relationships). With such an approach, the TCIP can in the future contribute to the control of construction through differentiation of premia on the basis of earthquake vulnerability.
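The flat-tariff arithmetic implied by the figures above can be sketched as follows. The 0.13% rate, the 90,000 USD coverage cap, and the 2% deductible come from the abstract; the exact tariff formula and how TCIP applies the deductible are assumptions made for illustration only.

```python
def tcip_premium(sum_insured_usd, rate=0.0013, cap_usd=90_000):
    """Annual premium under a flat tariff: covered sum times rate.

    Rate and cap are the figures quoted in the abstract; the formula
    itself is an illustrative assumption, not TCIP's actual tariff.
    """
    covered = min(sum_insured_usd, cap_usd)
    return covered * rate

def tcip_payout(loss_usd, sum_insured_usd, deductible=0.02, cap_usd=90_000):
    """Claim payout after the 2% deductible.

    Applying the deductible to the covered sum is an assumption; the
    abstract does not specify the claims formula.
    """
    covered = min(sum_insured_usd, cap_usd)
    return max(0.0, min(loss_usd, covered) - deductible * covered)
```

For a 70,000 USD sum insured this gives an annual premium of about 91 USD, consistent with the ~US$90 quoted for a reinforced concrete building in the highest-hazard zone.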
Designing an Earthquake-Proof Art Museum: An Arts- and Engineering-Integrated Science Lesson
ERIC Educational Resources Information Center
Carignan, Anastasia; Hussain, Mahjabeen
2016-01-01
In this practical arts-integrated science and engineering lesson, an inquiry-based approach was adopted to teach a class of fourth graders in a Midwest elementary school about the scientific concepts of plate tectonics and earthquakes. Lessons were prepared following the 5 E instructional model. Next Generation Science Standards (4-ESS3-2) and the…
Research in seismology and earthquake engineering in Venezuela
Urbina, L.; Grases, J.
1983-01-01
After the July 29, 1967, damaging earthquake (with a moderate magnitude of 6.3) caused widespread damage to the northern coastal area of Venezuela and to the Caracas Valley, the Venezuelan Government decided to establish a Presidential Earthquake Commission. This commission undertook the task of coordinating the efforts to study the after-effects of the earthquake. The July 1967 earthquake claimed numerous lives and caused extensive damage to the capital of Venezuela. In 1968, the U.S. Geological Survey conducted a seismological field study in the northern coastal area and in the Caracas Valley of Venezuela. The objective was to study the areas that sustained severe, moderate, and no damage to structures. A report entitled Ground Amplification Studies in Earthquake Damage Areas: The Caracas Earthquake of 1967 documented, for the first time, short-period seismic wave ground-motion amplifications in the Caracas Valley. Figure 1 shows the area of severe damage in the Los Palos Grandes suburb and its correlation with depth of alluvium; the arabic numbers denote the ground amplification factor at each site in the area. The Venezuelan Government initiated many programs to study in detail the damage sustained and to investigate ongoing construction practices. These actions motivated professionals in the academic, private, and Government sectors to develop further capabilities and self-sufficiency in the fields of engineering and seismology. Funds were allocated to assist in training professionals and technicians and in developing new seismological stations and new programs at the national level in earthquake engineering and seismology. A brief description of the ongoing programs in Venezuela is listed below. These programs are being performed by FUNVISIS and by other national organizations listed at the end of this article.
ERIC Educational Resources Information Center
Chang, Pei-Fen; Wang, Dau-Chung
2011-01-01
In May 2008, the worst earthquake in more than three decades struck southwest China, killing more than 80,000 people. The complexity of this earthquake makes it an ideal case study to clarify the intertwined issues of ethics in engineering and to help cultivate critical thinking skills. This paper first explores the need to encourage engineering…
PRISM software—Processing and review interface for strong-motion data
Jones, Jeanne M.; Kalkan, Erol; Stephens, Christopher D.; Ng, Peter
2017-11-28
Rapidly available and accurate ground-motion acceleration time series (seismic recordings) and derived data products are essential to quickly providing scientific and engineering analysis and advice after an earthquake. To meet this need, the U.S. Geological Survey National Strong Motion Project has developed a software package called PRISM (Processing and Review Interface for Strong-Motion data). PRISM automatically processes strong-motion acceleration records, producing compatible acceleration, velocity, and displacement time series; acceleration, velocity, and displacement response spectra; Fourier amplitude spectra; and standard earthquake-intensity measures. PRISM is intended to be used by strong-motion seismic networks, as well as by earthquake engineers and seismologists.
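At its core, deriving velocity and displacement series from a recorded acceleration series is numerical integration. The sketch below is a minimal, hedged illustration of that one step (demean-only baseline correction, rectangle-rule integration); PRISM's actual processing additionally includes filtering, tapering, and quality control, and its real interfaces are not shown here.

```python
import numpy as np

def integrate_record(accel, dt):
    """Integrate an acceleration time series to velocity and displacement.

    Crude demean baseline correction plus cumulative-sum integration;
    a simplified stand-in for what full strong-motion processors do.
    accel: acceleration samples; dt: sampling interval in seconds.
    """
    accel = np.asarray(accel, dtype=float)
    accel = accel - accel.mean()      # remove constant offset (baseline)
    vel = np.cumsum(accel) * dt       # acceleration -> velocity
    disp = np.cumsum(vel) * dt        # velocity -> displacement
    return vel, disp
```

Even this toy version shows why baseline correction matters: without the demean step, any constant sensor offset integrates into a linear drift in velocity and a quadratic drift in displacement.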
The California Integrated Seismic Network
NASA Astrophysics Data System (ADS)
Hellweg, M.; Given, D.; Hauksson, E.; Neuhauser, D.; Oppenheimer, D.; Shakal, A.
2007-05-01
The mission of the California Integrated Seismic Network (CISN) is to operate a reliable, modern system to monitor earthquakes throughout the state; to generate and distribute information in real-time for emergency response, for the benefit of public safety, and for loss mitigation; and to collect and archive data for seismological and earthquake engineering research. To meet these needs, the CISN operates data processing and archiving centers, as well as more than 3000 seismic stations. Furthermore, the CISN is actively developing and enhancing its infrastructure, including its automated processing and archival systems. The CISN integrates seismic and strong motion networks operated by the University of California Berkeley (UCB), the California Institute of Technology (Caltech), and the United States Geological Survey (USGS) offices in Menlo Park and Pasadena, as well as the USGS National Strong Motion Program (NSMP), and the California Geological Survey (CGS). The CISN operates two earthquake management centers (the NCEMC and SCEMC) where statewide, real-time earthquake monitoring takes place, and an engineering data center (EDC) for processing strong motion data and making it available in near real-time to the engineering community. These centers employ redundant hardware to minimize disruptions to the earthquake detection and processing systems. At the same time, dual feeds of data from a subset of broadband and strong motion stations are telemetered in real-time directly to both the NCEMC and the SCEMC to ensure the availability of statewide data in the event of a catastrophic failure at one of these two centers. The CISN uses a backbone T1 ring (with automatic backup over the internet) to interconnect the centers and the California Office of Emergency Services. The T1 ring enables real-time exchange of selected waveforms, derived ground motion data, phase arrivals, earthquake parameters, and ShakeMaps. 
With the goal of operating similar and redundant statewide earthquake processing systems at both real-time EMCs, the CISN is currently adopting and enhancing the database-centric earthquake processing and analysis software originally developed for the Caltech/USGS Pasadena TriNet project. Earthquake data and waveforms are made available to researchers and to the public in near real-time through the CISN's Northern and Southern California Earthquake Data Centers (NCEDC and SCEDC) and through the USGS Earthquake Notification System (ENS). The CISN partners have developed procedures to automatically exchange strong motion data, both waveforms and peak parameters, for use in ShakeMap and in the rapid engineering reports which are available in near real-time through the strong motion EDC.
Southern California Earthquake Center (SCEC) Communication, Education and Outreach Program
NASA Astrophysics Data System (ADS)
Benthien, M. L.
2003-12-01
The SCEC Communication, Education, and Outreach Program (CEO) offers student research experiences, web-based education tools, classroom curricula, museum displays, public information brochures, online newsletters, and technical workshops and publications. This year, much progress has been made on the development of the Electronic Encyclopedia of Earthquakes (E3), a collaborative project with CUREE and IRIS. The E3 development system is now fully operational, and 165 entries are in the pipeline. When complete, information and resources for over 500 Earth science and engineering topics will be included, with connections to curricular materials useful for teaching Earth Science, engineering, physics and mathematics. To coordinate activities for the 10-year anniversary of the Northridge Earthquake in 2004 (and beyond), the "Earthquake Country Alliance" is being organized by SCEC CEO to present common messages, to share or promote existing resources, and to develop new activities and products jointly (such as a new version of Putting Down Roots in Earthquake Country). The group includes earthquake science and engineering researchers and practicing professionals, preparedness experts, response and recovery officials, news media representatives, and education specialists. A web portal, http://www.earthquakecountry.info, is being developed, with links to web pages and descriptions of other resources and services that the Alliance members provide. Another ongoing strength of SCEC is the Summer Intern program, which now has a year-round counterpart with students working on IT projects at USC. Since Fall 2002, over 32 students have participated in the program, including 7 students working with scientists throughout SCEC, 17 students involved in the USC "Earthquake Information Technology" intern program, and 7 students involved in CEO projects. 
These and other activities of the SCEC CEO program will be presented, along with lessons learned during program design and implementation.
The Alaska earthquake, March 27, 1964: lessons and conclusions
Eckel, Edwin B.
1970-01-01
One of the greatest earthquakes of all time struck south-central Alaska on March 27, 1964. Strong motion lasted longer than for most recorded earthquakes, and more land surface was dislocated, vertically and horizontally, than by any known previous temblor. Never before were so many effects on earth processes and on the works of man available for study by scientists and engineers over so great an area. The seismic vibrations, which directly or indirectly caused most of the damage, were but surface manifestations of a great geologic event: the dislocation of a huge segment of the crust along a deeply buried fault whose nature and even exact location are still subjects for speculation. Not only was the land surface tilted by the great tectonic event beneath it, with resultant seismic sea waves that traversed the entire Pacific, but an enormous mass of land and sea floor moved several tens of feet horizontally toward the Gulf of Alaska. Downslope mass movements of rock, earth, and snow were initiated. Subaqueous slides along lake shores and seacoasts, near-horizontal movements of mobilized soil (“landspreading”), and giant translatory slides in sensitive clay did the most damage and provided the most new knowledge as to the origin, mechanics, and possible means of control or avoidance of such movements. The slopes of most of the deltas that slid in 1964, and that produced destructive local waves, are still as steep or steeper than they were before the earthquake and hence would be unstable or metastable in the event of another great earthquake. Rockslide avalanches provided new evidence that such masses may travel on cushions of compressed air, but a widely held theory that glaciers surge after an earthquake has not been substantiated. Innumerable ground fissures, many of them marked by copious emissions of water, caused much damage in towns and along transportation routes. Vibration also consolidated loose granular materials. 
In some coastal areas, local subsidence was superimposed on regional tectonic subsidence to heighten the flooding damage. Ground and surface waters were measurably affected by the earthquake, not only in Alaska but throughout the world. Expectably, local geologic conditions largely controlled the extent of structural damage, whether caused directly by seismic vibrations or by secondary effects such as those just described. Intensity was greatest in areas underlain by thick saturated unconsolidated deposits, least on indurated bedrock or permanently frozen ground, and intermediate on coarse well-drained gravel, on morainal deposits, or on moderately indurated sedimentary rocks. Local and even regional geology also controlled the distribution and extent of the earthquake's effects on hydrologic systems. In the conterminous United States, for example, seiches in wells and bodies of surface water were controlled by geologic structures of regional dimension. Devastating as the earthquake was, it had many long-term beneficial effects. Many of these were socioeconomic or engineering in nature; others were of scientific value. Much new and corroborative basic geologic and hydrologic information was accumulated in the course of the earthquake studies, and many new or improved investigative techniques were developed. Chief among these, perhaps, were the recognition that lakes can be used as giant tiltmeters, the refinement of methods for measuring land-level changes by observing displacements of barnacles and other sessile organisms, and the relating of hydrology to seismology by worldwide study of hydroseisms in surface-water bodies and in wells. The geologic and hydrologic lessons learned from studies of the Alaska earthquake also lead directly to better definition of the research needed to further our understanding of earthquakes and of how to avoid or lessen the effects of future ones. 
Research is needed on the origins and mechanisms of earthquakes, on crustal structure, and on the generation of tsunamis and local waves. Better earthquake-hazard maps, based on improved knowledge of regional geology, fault behavior, and earthquake mechanisms, are needed for the entire country. Their preparation will require the close collaboration of engineers, seismologists, and geologists. Geologic maps of all inhabited places in earthquake-prone parts of the country are also needed by city planners and others, because the direct relationship between local geology and potential earthquake damage is now well understood. Improved and enlarged nets of earthquake-sensing instruments, sited in relation to known geology, are needed, as are many more geodetic and hydrographic measurements. Every large earthquake, wherever located, should be regarded as a full-scale laboratory experiment whose study can give scientific and engineering information unobtainable from any other source. Plans must be made before the event to ensure staffing, funding, and coordination of effort for the scientific and engineering study of future earthquakes. Advice of earth scientists and engineers should be used in the decision-making processes involved in reconstruction after any future disastrous earthquake, as was done after the Alaska earthquake. The volume closes with a selected bibliography and a comprehensive index to the entire series of U.S. Geological Survey Professional Papers 541-546. This is the last in a series of six reports that the U.S. Geological Survey published on the results of a comprehensive geologic study that began, as a reconnaissance survey, within 24 hours after the March 27, 1964, Magnitude 9.2 Great Alaska Earthquake and extended, as detailed investigations, through several field seasons. The 1964 Great Alaska earthquake was the largest earthquake in the U.S. since 1700. Professional Paper 546, in 1 part, describes Lessons and Conclusions.
NASA Astrophysics Data System (ADS)
2002-09-01
Contents include the following: Deep Electromagnetic Images of Seismogenic Zone of the Chi-Chi (Taiwan) Earthquake; New Techniques for Stress-Forecasting Earthquakes; Aspects of Characteristics of Near-Fault Ground Motions of the 1999 Chi-Chi (Taiwan) Earthquake; Liquefaction Damage and Related Remediation in Wufeng after the Chi-Chi Earthquake; Fines Content Effects on Liquefaction Potential Evaluation for Sites Liquefied during Chi-Chi Earthquake 1999; Damage Investigation and Liquefaction Potential Analysis of Gravelly Soil; Dynamic Characteristics of Soils in Yuan-Lin Liquefaction Area; A Preliminary Study of Earthquake Building Damage and Life Loss Due to the Chi-Chi Earthquake; Statistical Analyses of Relation between Mortality and Building Type in the 1999 Chi-Chi Earthquake; Development of an After Earthquake Disaster Shelter Evaluation Model; Posttraumatic Stress Reactions in Children and Adolescents One Year after the 1999 Taiwan Chi-Chi Earthquake; Changes or Not is the Question: the Meaning of Posttraumatic Stress Reactions One Year after the Taiwan Chi-Chi Earthquake.
An interview with Karl Steinbrugge
Spall, H.
1985-01-01
He has served on numerous national and international committees on earthquake hazards, and he is now a consulting structural engineer, specializing in earthquake hazard evaluation. At the present moment he is chairman of an independent panel of the Federal Emergency Management Agency that is reviewing the National Earthquake Hazards Reduction Program. Henry Spall recently asked Steinbrugge some questions about his long career.
Earthquakes in Mississippi and vicinity 1811-2010
Dart, Richard L.; Bograd, Michael B.E.
2011-01-01
This map summarizes two centuries of earthquake activity in Mississippi. Work on the Mississippi map was done in collaboration with the Mississippi Department of Environmental Quality, Office of Geology. The earthquake data plotted on the map are from several sources: the Mississippi Department of Environmental Quality, the Center for Earthquake Research and Information, the National Center for Earthquake Engineering Research, and the Arkansas Geological Survey. In addition to earthquake locations, other materials include seismic hazard and isoseismal maps and related text. Earthquakes are a legitimate concern in Mississippi and parts of adjacent States. Mississippi has undergone a number of felt earthquakes since 1811. At least two of these events caused property damage: a magnitude 4.7 earthquake in 1931, and a magnitude 4.3 earthquake in 1967. The map shows all historical and instrumentally located earthquakes in Mississippi and vicinity between 1811 and 2010. The largest historic earthquake in the vicinity of the State was an intensity XI event, on December 16, 1811; the first earthquake in the New Madrid sequence. This violent event and the earthquakes that followed caused considerable damage to the then sparsely settled region.
NASA Astrophysics Data System (ADS)
Filiatrault, Andre; Sullivan, Timothy
2014-08-01
With the development and implementation of performance-based earthquake engineering, harmonization of performance levels between structural and nonstructural components becomes vital. Even if the structural components of a building achieve a continuous or immediate occupancy performance level after a seismic event, failure of architectural, mechanical or electrical components can lower the performance level of the entire building system. This reduction in performance caused by the vulnerability of nonstructural components has been observed during recent earthquakes worldwide. Moreover, nonstructural damage has limited the functionality of critical facilities, such as hospitals, following major seismic events. The investment in nonstructural components and building contents is far greater than that in structural components and framing. Therefore, it is not surprising that in many past earthquakes, losses from damage to nonstructural components have exceeded losses from structural damage. Furthermore, the failure of nonstructural components can become a safety hazard or can hamper the safe movement of occupants evacuating buildings, or of rescue workers entering buildings. In comparison to structural components and systems, there is relatively limited information on the seismic design of nonstructural components. Basic research work in this area has been sparse, and the available codes and guidelines are, for the most part, based on past experience, engineering judgment and intuition, rather than on objective experimental and analytical results. Often, design engineers are forced to start almost from square one after each earthquake event: to observe what went wrong and to try to prevent repetitions. This is a consequence of the empirical nature of current seismic regulations and guidelines for nonstructural components.
This review paper summarizes current knowledge on the seismic design and analysis of nonstructural building components, identifying major knowledge gaps that will need to be filled by future research. Furthermore, considering recent trends in earthquake engineering, the paper explores how performance-based seismic design might be conceived for nonstructural components, drawing on recent developments made in the field of seismic design and hinting at the specific considerations required for nonstructural components.
1999-01-01
This report assesses the status, needs, and associated costs of seismic monitoring in the United States. It sets down the requirement for an effective, national seismic monitoring strategy and an advanced system linking national, regional, and urban monitoring networks. Modernized seismic monitoring can provide alerts of imminent strong earthquake shaking; rapid assessment of distribution and severity of earthquake shaking (for use in emergency response); warnings of a possible tsunami from an offshore earthquake; warnings of volcanic eruptions; information for correctly characterizing earthquake hazards and for improving building codes; and data on response of buildings and structures during earthquakes, for safe, cost-effective design, engineering, and construction practices in earthquake-prone regions.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hofmann, R.B.
1995-09-01
Analogs are used to understand complex or poorly understood phenomena for which little data may be available at the actual repository site. Earthquakes are complex phenomena, and they can have a large number of effects on the natural system, as well as on engineered structures. Instrumental data close to the source of large earthquakes are rarely obtained. The rare events for which measurements are available may be used, with modifications, as analogs for potential large earthquakes at sites where no earthquake data are available. In the following, several examples of nuclear reactor and liquefied natural gas facility siting are discussed. A potential use of analog earthquakes is proposed for a high-level nuclear waste (HLW) repository.
Assessment of Structural Resistance of building 4862 to Earthquake and Tornado Forces [SEC 1 and 2]
DOE Office of Scientific and Technical Information (OSTI.GOV)
METCALF, I.L.
1999-12-06
This report presents the results of work done for Hanford Engineering Laboratory under contract Y213-544-12662. LATA performed an assessment of the resistance of building 4862 to earthquake and tornado forces.
Wong, I.; Olig, S.; Dober, M.; Silva, W.; Wright, D.; Thomas, P.; Gregor, N.; Sanford, A.; Lin, K.-W.; Love, D.
2004-01-01
These maps are not intended to be a substitute for site-specific studies for engineering design nor to replace standard maps commonly referenced in building codes. Rather, we hope that these maps will be used as a guide by government agencies; the engineering, urban planning, emergency preparedness, and response communities; and the general public as part of an overall program to reduce earthquake risk and losses in New Mexico.
Sequential Analysis: Hypothesis Testing and Changepoint Detection
2014-07-11
it is necessary to estimate in situ the geographical coordinates and other parameters of earthquakes. The standard sensor equipment of a three...components. When an earthquake arises, the sensors begin to record several types of seismic waves (body and surface waves), among which the more important...machines and to increased safety norms. Many structures to be monitored, e.g., civil engineering structures subject to wind and earthquakes, aircraft
Earthquakes in Arkansas and vicinity 1699-2010
Dart, Richard L.; Ausbrooks, Scott M.
2011-01-01
This map summarizes approximately 300 years of earthquake activity in Arkansas. It is one in a series of similar State earthquake history maps. Work on the Arkansas map was done in collaboration with the Arkansas Geological Survey. The earthquake data plotted on the map are from several sources: the Arkansas Geological Survey, the Center for Earthquake Research and Information, the National Center for Earthquake Engineering Research, and the Mississippi Department of Environmental Quality. In addition to earthquake locations, other materials presented include seismic hazard and isoseismal maps and related text. Earthquakes are a legitimate concern in Arkansas and parts of adjacent states. Arkansas has undergone a number of significant felt earthquakes since 1811. At least two of these events caused property damage: a magnitude 4.7 earthquake in 1931, and a magnitude 4.3 earthquake in 1967. The map shows all historical and instrumentally located earthquakes in Arkansas and vicinity between 1811 and 2010. The largest historic earthquake in the vicinity of the State was an intensity XI event, on December 16, 1811; the first earthquake in the New Madrid sequence. This violent event and the earthquakes that followed caused considerable damage to the then sparsely settled region.
Using the USGS Seismic Risk Web Application to estimate aftershock damage
McGowan, Sean M.; Luco, Nicolas
2014-01-01
The U.S. Geological Survey (USGS) Engineering Risk Assessment Project has developed the Seismic Risk Web Application to combine earthquake hazard and structural fragility information in order to calculate the risk of earthquake damage to structures. Enabling users to incorporate their own hazard and fragility information into the calculations will make it possible to quantify (in near real-time) the risk of additional damage to structures caused by aftershocks following significant earthquakes. Results can quickly be shared with stakeholders to illustrate the impact of elevated ground motion hazard and earthquake-compromised structural integrity on the risk of damage during a short-term, post-earthquake time horizon.
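The hazard-plus-fragility calculation described in this abstract can be sketched in a few lines. The power-law hazard curve, the lognormal fragility parameters, and the integration bounds below are illustrative assumptions only, not the models actually used by the USGS application.

```python
# Minimal sketch: combine a seismic hazard curve with a structural fragility
# curve to estimate an annual rate of earthquake damage. All numbers and
# functional forms here are invented for illustration.
import math

def hazard_rate(im):
    """Hypothetical annual exceedance rate of ground-motion intensity `im` (g)."""
    return 0.05 * im ** -2.0          # simple power-law hazard curve

def fragility(im, median=0.6, beta=0.5):
    """Lognormal fragility: P(damage | IM = im)."""
    if im <= 0:
        return 0.0
    z = (math.log(im) - math.log(median)) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def annual_damage_rate(im_lo=0.05, im_hi=3.0, n=2000):
    """Integrate P(damage | im) against the rate of events in each
    intensity bin, obtained by differencing the hazard curve."""
    total = 0.0
    dim = (im_hi - im_lo) / n
    for i in range(n):
        im = im_lo + (i + 0.5) * dim
        # rate of events with intensity in [im - dim/2, im + dim/2)
        d_rate = hazard_rate(im - 0.5 * dim) - hazard_rate(im + 0.5 * dim)
        total += fragility(im) * d_rate
    return total

rate = annual_damage_rate()
print(f"illustrative annual damage rate: {rate:.4e}")
```

The same structure (hazard curve in, fragility curve in, damage rate out) is what lets users swap in aftershock-elevated hazard or earthquake-compromised fragility, as the abstract describes.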
NASA Astrophysics Data System (ADS)
Chang, Pei-Fen; Wang, Dau-Chung
2011-08-01
In May 2008, the worst earthquake in more than three decades struck southwest China, killing more than 80,000 people. The complexity of this earthquake makes it an ideal case study to clarify the intertwined issues of ethics in engineering and to help cultivate critical thinking skills. This paper first explores the need to encourage engineering ethics within a cross-cultural context. Next, it presents a systematic model for designing an engineering ethics curriculum based on moral development theory and ethic dilemma analysis. Quantitative and qualitative data from students' oral and written work were collected and analysed to determine directions for improvement. The paper also presents results of an assessment of this interdisciplinary engineering ethics course. This investigation of a disaster is limited strictly to engineering ethics education; it is not intended to assign blame, but rather to spark debate about ethical issues.
Geotechnical effects of the 2015 magnitude 7.8 Gorkha, Nepal, earthquake and aftershocks
Moss, Robb E. S.; Thompson, Eric M.; Kieffer, D Scott; Tiwari, Binod; Hashash, Youssef M A; Acharya, Indra; Adhikari, Basanta; Asimaki, Domniki; Clahan, Kevin B.; Collins, Brian D.; Dahal, Sachindra; Jibson, Randall W.; Khadka, Diwakar; Macdonald, Amy; Madugo, Chris L M; Mason, H Benjamin; Pehlivan, Menzer; Rayamajhi, Deepak; Uprety, Sital
2015-01-01
This article summarizes the geotechnical effects of the 25 April 2015 M 7.8 Gorkha, Nepal, earthquake and aftershocks, as documented by a reconnaissance team that undertook a broad engineering and scientific assessment of the damage and collected perishable data for future analysis. Brief descriptions are provided of ground shaking, surface fault rupture, landsliding, soil failure, and infrastructure performance. The goal of this reconnaissance effort, led by Geotechnical Extreme Events Reconnaissance, is to learn from earthquakes and mitigate hazards in future earthquakes.
1987-09-01
Geological Survey, MS977, Menlo Park, CA 94025, USA. TURKISH NATIONAL COMMITTEE FOR EARTHQUAKE ENGINEERING, THIRTEENTH REGIONAL SEMINAR ON EARTHQUAKE...this case the conditional probability P(E/F1) will also depend in general on t. A simple example of a case of this type was developed by the present...These studies took into consideration all the available data concerning the dynamic characteristics of different types of buildings. A first attempt was
33 CFR 222.4 - Reporting earthquake effects.
Code of Federal Regulations, 2010 CFR
2010-07-01
..., DEPARTMENT OF DEFENSE ENGINEERING AND DESIGN § 222.4 Reporting earthquake effects. (a) Purpose. This... structural integrity and operational adequacy of major Civil Works structures following the occurrence of...) Applicability. This regulation is applicable to all field operating agencies having Civil Works responsibilities...
DOT National Transportation Integrated Search
2008-12-01
Shortly after the 1994 Northridge Earthquake, Caltrans geotechnical engineers charged with developing site-specific response spectra for high priority California bridges initiated a research project aimed at broadening their perspective from simp...
Estimating the residual axial load capacity of flexure-dominated reinforced concrete bridge columns.
DOT National Transportation Integrated Search
2014-08-01
Extreme events such as earthquakes have the potential to damage hundreds, if not thousands, of bridges on a transportation network. Following an earthquake, the damaged bridges are inspected by engineers sequentially to decide whether or not to c...
NASA Astrophysics Data System (ADS)
Mualchin, Lalliana
2011-03-01
Modern earthquake ground motion hazard mapping in California began following the 1971 San Fernando earthquake in the Los Angeles metropolitan area of southern California. Earthquake hazard assessment followed a traditional approach, later called Deterministic Seismic Hazard Analysis (DSHA) in order to distinguish it from the newer Probabilistic Seismic Hazard Analysis (PSHA). In DSHA, seismic hazard is assessed for the Maximum Credible Earthquake (MCE) magnitude of each of the known seismogenic faults within and near the state. The likely occurrence of the MCE has been assumed qualitatively from late Quaternary and younger faults presumed to be seismogenic, without specifying when or within what time intervals the MCE may occur. The MCE is the largest or upper-bound potential earthquake in moment magnitude, and it supersedes and automatically considers all other possible earthquakes on that fault. That moment magnitude is used for estimating ground motions by applying it to empirical attenuation relationships, and for calculating ground motions as in neo-DSHA (Zuccolo et al., 2008). The first deterministic California earthquake hazard map was published in 1974 by the California Division of Mines and Geology (CDMG), which has been called the California Geological Survey (CGS) since 2002, using the best available fault information and ground motion attenuation relationships at that time. The California Department of Transportation (Caltrans) later assumed responsibility for printing the refined and updated peak acceleration contour maps, which were heavily utilized by geologists, seismologists, and engineers for many years. Some engineers involved in the siting process of large important projects, for example, dams and nuclear power plants, continued to challenge the map(s).
The second edition map was completed in 1985, incorporating more faults, improving the MCE estimation method, and using new ground motion attenuation relationships from the latest published results at that time. CDMG eventually published the second edition map in 1992, following the Governor's Board of Inquiry on the 1989 Loma Prieta earthquake and at the demand of Caltrans. The third edition map was published by Caltrans in 1996, utilizing GIS technology to manage data that include a simplified three-dimensional geometry of faults and to facilitate efficient corrections and revisions of data and the map. The spatial relationship of fault hazards with highways, bridges or any other attribute can now be efficiently managed and analyzed in GIS at Caltrans. There has been great confidence in using DSHA in bridge engineering and other applications in California, and it can be confidently applied in any other earthquake-prone region. Earthquake hazards defined by DSHA are: (1) transparent and stable, with robust MCE moment magnitudes; (2) flexible in their application to design considerations; (3) able to easily incorporate advances in ground motion simulations; and (4) economical. DSHA and neo-DSHA have the same approach and applicability. The accuracy of DSHA has proven to be quite reasonable for practical applications within engineering design, where it is always applied with professional judgment. In the final analysis, DSHA is a reality-check for public safety and PSHA results. Although PSHA has been acclaimed as a better approach for seismic hazard assessment, it is DSHA, not PSHA, that has actually been used in seismic hazard assessment for building and bridge engineering, particularly in California.
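In its simplest form, the deterministic procedure this abstract describes reduces to: for each known seismogenic fault, estimate site ground motion from the fault's MCE magnitude and distance via an empirical attenuation relationship, then keep the controlling (maximum) value. The fault list and attenuation coefficients below are invented for illustration and are not any published relationship.

```python
# Hedged DSHA sketch: controlling ground motion over a set of faults.
# The attenuation form and both faults are hypothetical.
import math

def attenuation_pga(magnitude, dist_km):
    """Hypothetical empirical attenuation relationship (PGA in g)."""
    return math.exp(0.9 * magnitude - 1.6 * math.log(dist_km + 10.0) - 4.0)

faults = [
    {"name": "Fault A", "mce": 7.8, "dist_km": 35.0},   # large but distant
    {"name": "Fault B", "mce": 6.5, "dist_km": 8.0},    # smaller but nearby
]

# The MCE on each fault supersedes all smaller events, so the site hazard
# is simply the maximum over faults of the MCE-driven ground motion.
controlling = max(faults, key=lambda f: attenuation_pga(f["mce"], f["dist_km"]))
site_pga = attenuation_pga(controlling["mce"], controlling["dist_km"])
print(controlling["name"], round(site_pga, 3))
```

Note that no occurrence rates enter the calculation, which is exactly the transparency (and the limitation relative to PSHA) discussed above.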
Evaluation of seismic performance of reinforced concrete (RC) buildings under near-field earthquakes
NASA Astrophysics Data System (ADS)
Moniri, Hassan
2017-03-01
Near-field ground motions affect the seismic response of structures much more severely than far-field ground motions, because near-source forward-directivity motions contain long-period pulses; by comparison, the cumulative effects of far-fault records are minor. The damage and collapse of engineering structures observed in the earthquakes of recent decades show the potential for damage in existing structures under near-field ground motions. One important subject studied by earthquake engineers as part of a performance-based approach is the determination of demand and collapse capacity under near-field earthquakes. Different methods for evaluating seismic structural performance have been suggested along with, and as part of, the development of performance-based earthquake engineering. This study investigated the effects of the distinctive characteristics of near-fault ground motions on the seismic response of reinforced concrete (RC) structures, using the Incremental Nonlinear Dynamic Analysis (IDA) method. Because various ground motions result in different intensity-versus-response plots, the analysis is repeated under various ground motions in order to achieve significant statistical averages. The OpenSees software was used to conduct nonlinear structural evaluations. Numerical modelling showed that near-source effects cause most of the seismic energy from the rupture to arrive in a single coherent long-period pulse of motion, with permanent ground displacements. Finally, the vulnerability of RC buildings can be evaluated against the effects of pulse-like near-fault ground motions.
Earthquake Hazard and the Environmental Seismic Intensity (ESI) Scale
NASA Astrophysics Data System (ADS)
Serva, Leonello; Vittori, Eutizio; Comerci, Valerio; Esposito, Eliana; Guerrieri, Luca; Michetti, Alessandro Maria; Mohammadioun, Bagher; Mohammadioun, Georgianna C.; Porfido, Sabina; Tatevossian, Ruben E.
2016-05-01
The main objective of this paper was to introduce the Environmental Seismic Intensity scale (ESI), a new scale developed and tested by an interdisciplinary group of scientists (geologists, geophysicists and seismologists) in the frame of the International Union for Quaternary Research (INQUA) activities, to the widest community of earth scientists and engineers dealing with seismic hazard assessment. This scale defines earthquake intensity by taking into consideration the occurrence, size and areal distribution of earthquake environmental effects (EEE), including surface faulting, tectonic uplift and subsidence, landslides, rock falls, liquefaction, ground collapse and tsunami waves. Indeed, EEEs can significantly improve the evaluation of seismic intensity, which still remains a critical parameter for a realistic seismic hazard assessment, allowing comparison of historical and modern earthquakes. Moreover, as shown by recent moderate to large earthquakes, geological effects often cause severe damage; therefore, their consideration in the earthquake risk scenario is crucial for all stakeholders, especially urban planners, geotechnical and structural engineers, hazard analysts, civil protection agencies and insurance companies. The paper describes the background and construction principles of the scale and presents some case studies in different continents and tectonic settings to illustrate its relevant benefits. ESI is normally used together with traditional intensity scales, which, unfortunately, tend to saturate in the highest degrees. In this case and in unpopulated areas, ESI offers a unique way of assessing a reliable earthquake intensity. Finally, and importantly, the ESI scale also provides a very convenient guideline for the survey of EEEs in earthquake-stricken areas, ensuring they are catalogued in a complete and homogeneous manner.
Real-Time Earthquake Monitoring with Spatio-Temporal Fields
NASA Astrophysics Data System (ADS)
Whittier, J. C.; Nittel, S.; Subasinghe, I.
2017-10-01
With live streaming sensors and sensor networks, increasingly large numbers of individual sensors are deployed in physical space. Sensor data streams are a fundamentally novel mechanism to deliver observations to information systems. They enable us to represent spatio-temporal continuous phenomena such as radiation accidents, toxic plumes, or earthquakes almost as instantaneously as they happen in the real world. Sensor data streams discretely sample an earthquake, while the earthquake is continuous over space and time. Programmers attempting to integrate many streams to analyze earthquake activity and scope need to write tedious application code to integrate potentially very large sets of asynchronously sampled, concurrent streams. In previous work, we proposed the field stream data model (Liang et al., 2016) for data stream engines. Abstracting the stream of an individual sensor as a temporal field, the field represents the Earth's movement at the sensor position as continuous. This simplifies analysis across many sensors significantly. In this paper, we undertake a feasibility study of using the field stream model and the open source Data Stream Engine (DSE) Apache Spark (Apache Spark, 2017) to implement real-time earthquake event detection with a subset of the 250 GPS sensor data streams of the Southern California Integrated GPS Network (SCIGN). The field-based real-time stream queries compute maximum displacement values over the latest query window of each stream, and relate spatially neighboring streams to identify earthquake events and their extent. Further, we correlated the detected events with a USGS earthquake event feed. The query results are visualized in real-time.
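The core stream query described in this abstract (per-sensor sliding-window maximum displacement, with an event flagged when spatially neighboring sensors exceed a threshold in the same window) can be sketched in plain Python. The sensor names, window size, and threshold below are illustrative assumptions, not SCIGN or Apache Spark specifics.

```python
# Plain-Python stand-in for the field-based stream query: each sensor keeps
# a bounded window of displacement samples; an event pair is reported when a
# sensor and one of its spatial neighbors both exceed a threshold.
from collections import defaultdict, deque

WINDOW = 5          # samples per query window (assumed)
THRESHOLD = 0.02    # metres of displacement considered anomalous (assumed)

class FieldStreams:
    def __init__(self, neighbors):
        self.neighbors = neighbors                        # sensor -> nearby sensors
        self.buf = defaultdict(lambda: deque(maxlen=WINDOW))

    def window_max(self, sensor):
        """Maximum displacement in the sensor's latest window (0 if empty)."""
        return max(self.buf[sensor], default=0.0)

    def observe(self, sensor, displacement):
        """Ingest one sample; return (sensor, neighbor) pairs that co-exceed."""
        self.buf[sensor].append(displacement)
        events = []
        if self.window_max(sensor) >= THRESHOLD:
            for nb in self.neighbors.get(sensor, ()):
                if self.window_max(nb) >= THRESHOLD:
                    events.append((sensor, nb))
        return events

net = FieldStreams({"A": {"B"}, "B": {"A"}, "C": set()})
net.observe("B", 0.03)          # neighbor B spikes first: no pair yet
hits = net.observe("A", 0.04)   # A spikes too: correlated event with B
print(hits)                     # → [('A', 'B')]
```

A DSE such as Spark would evaluate the same window logic declaratively and in parallel; the field abstraction is what lets one query span all sensors.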
coordinates research in support of the PEER mission in performance-based earthquake engineering. The broad system dynamic response; assessment of the performance of the structural and nonstructural systems; consequences in terms of casualties, capital costs, and post-earthquake functionality; and decision-making to
Harp, E.L.; Noble, M.A.
1993-01-01
Investigations of earthquakes world wide show that rock falls are the most abundant type of landslide that is triggered by earthquakes. An engineering classification originally used in tunnel design, known as the rock mass quality designation (Q), was modified for use in rating the susceptibility of rock slopes to seismically-induced failure. Analysis of rock-fall concentrations and Q-values for the 1980 earthquake sequence near Mammoth Lakes, California, defines a well-constrained upper bound that shows the number of rock falls per site decreases rapidly with increasing Q. Because of the similarities of lithology and slope between the Eastern Sierra Nevada Range near Mammoth Lakes and the Wasatch Front near Salt Lake City, Utah, the probabilities derived from analysis of the Mammoth Lakes region were used to predict rock-fall probabilities for rock slopes near Salt Lake City in response to a magnitude 6.0 earthquake. These predicted probabilities were then used to generalize zones of rock-fall susceptibility. -from Authors
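The mapping this abstract describes, from rock mass quality Q to rock-fall susceptibility zones, might be organized as below. The exponential upper bound and the zone thresholds are invented for illustration; the study derives its bound empirically from the Mammoth Lakes rock-fall concentrations.

```python
# Hedged sketch: a hypothetical upper bound on rock falls per site that
# decreases rapidly with increasing rock mass quality Q, binned into
# susceptibility zones. Coefficients and thresholds are assumptions.
import math

def rockfall_upper_bound(q, a=10.0, b=0.8):
    """Hypothetical upper bound on expected rock falls per site."""
    return a * math.exp(-b * q)

def susceptibility_zone(q):
    falls = rockfall_upper_bound(q)
    if falls > 3.0:
        return "high"
    if falls > 0.5:
        return "moderate"
    return "low"

for q in (0.5, 2.0, 6.0):
    print(q, susceptibility_zone(q))   # low Q (poor rock) -> high susceptibility
```

The key property the sketch preserves is the one the paper reports: the number of rock falls per site is bounded above by a function that falls off rapidly as Q increases.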
Celebi, M.
2004-01-01
The recorded responses of an Anchorage, Alaska, building during four significant earthquakes that occurred in 2002 are studied. Two earthquakes, including the 3 November 2002 M7.9 Denali fault earthquake, with epicenters approximately 275 km from the building, generated long trains of long-period (>1 s) surface waves. The other two smaller earthquakes occurred at subcrustal depths practically beneath Anchorage and produced higher frequency motions. These two pairs of earthquakes have different impacts on the response of the building. Higher modes are more pronounced in the building response during the smaller nearby events. The building responses indicate that the close-coupling of translational and torsional modes causes a significant beating effect. It is also possible that there is some resonance occurring due to the site frequency being close to the structural frequency. Identification of dynamic characteristics and behavior of buildings can provide important lessons for future earthquake-resistant designs and retrofit of existing buildings. © 2004, Earthquake Engineering Research Institute.
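The beating effect attributed above to close-coupled translational and torsional modes is simple to demonstrate: two modes at nearby frequencies superpose into a slowly modulated response whose beat period is 1/|f2 − f1|. The frequencies below are hypothetical, chosen only to make the modulation visible; they are not the Anchorage building's measured modes.

```python
# Sketch of modal beating: the sum of two close-frequency sinusoids has an
# envelope |2 cos(pi (f2 - f1) t)|, maximal at t = 0 and zero at half the
# beat period. Frequencies are illustrative assumptions.
import math

f1, f2 = 0.50, 0.55                  # Hz: two closely spaced modal frequencies
beat_period = 1.0 / abs(f2 - f1)     # 20 s between successive amplitude maxima

def response(t):
    return math.sin(2 * math.pi * f1 * t) + math.sin(2 * math.pi * f2 * t)

def envelope(t):
    return abs(2.0 * math.cos(math.pi * (f2 - f1) * t))

print(f"beat period: {beat_period:.1f} s")
print(f"envelope at t=0: {envelope(0.0):.2f}, at t=10 s: {envelope(10.0):.2f}")
```

In a recorded building response, this shows up as the amplitude swelling and fading every beat period even under steady excitation, which is the signature the study identifies.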
Reconnaissance Report, Section 205 Chattooga River Trion, Georgia, Chattooga County
1991-07-01
magnitude, mb, of 7.5, at a distance of about 118 km, in the New Madrid source zone. The earthquake motions estimated to occur at Barkley from an...4: Liquefaction Susceptibility Evaluation and Post-Earthquake Strength Determination Volume 5: Stability Evaluation of Geotechnical Structures The...contributions from ORN. Messrs. Ronald E. Wahl of Soil and Rock Mechanics Division, Richard S. Olsen, and Dr. M. E. Hynes of the Earthquake Engineering and
Technical guidelines for the implementation of the Advanced National Seismic System
Committee, ANSS Technical Integration
2002-01-01
The Advanced National Seismic System (ANSS) is a major national initiative led by the US Geological Survey that serves the needs of the earthquake monitoring, engineering, and research communities as well as national, state, and local governments, emergency response organizations, and the general public. Legislation authorizing the ANSS was passed in 2000, and low levels of funding for planning and initial purchases of new seismic instrumentation have been appropriated beginning in FY2000. When fully operational, the ANSS will be an advanced monitoring system (modern digital seismographs and accelerographs, communications networks, data collection and processing centers, and well-trained personnel) distributed across the United States that operates with high performance standards, gathers critical technical data, and effectively provides timely and reliable earthquake products, information, and services to meet the Nation’s needs. The ANSS will automatically broadcast timely and authoritative products describing the occurrence of earthquakes, earthquake source properties, the distribution of ground shaking, and, where feasible, broadcast early warnings and alerts for the onset of strong ground shaking. Most importantly, the ANSS will provide earthquake data, derived products, and information to the public, emergency responders, officials, engineers, educators, researchers, and other ANSS partners rapidly and in forms that are useful for their needs.
NASA Astrophysics Data System (ADS)
Karimzadeh, Shaghayegh; Askan, Aysegul; Yakut, Ahmet
2017-09-01
Simulated ground motions can be used in structural and earthquake engineering practice as an alternative to, or to augment, real ground motion data sets. Common engineering applications of simulated motions are linear and nonlinear time history analyses of building structures, where full acceleration records are necessary. Before using simulated ground motions in such applications, it is important to assess them in terms of their frequency and amplitude content as well as their match with the corresponding real records. In this study, a framework is outlined for assessment of simulated ground motions in terms of their use in structural engineering. Misfit criteria are determined for both ground motion parameters and structural response by comparing the simulated values against the corresponding real values. For this purpose, as a case study, the 12 November 1999 Duzce earthquake is simulated using stochastic finite-fault methodology. Simulated records are employed for time history analyses of frame models of typical residential buildings. Next, the relationships between ground motion misfits and structural response misfits are studied. Results show that the seismological misfits around the fundamental period of selected buildings determine the accuracy of the simulated responses in terms of their agreement with the observed responses.
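A misfit comparison of the kind this abstract describes can be sketched with two scalar measures: log ratios of peak amplitude and of total signal energy between simulated and recorded motions. The study's actual criteria are richer (notably spectral content near each building's fundamental period); the records below are synthetic and the 20% underprediction is an invented example.

```python
# Minimal sketch of scalar ground-motion misfits: zero means a perfect
# match; the sign shows over- or underprediction. Records are synthetic.
import math

def peak(rec):
    """Peak absolute amplitude of a record."""
    return max(abs(a) for a in rec)

def energy(rec, dt):
    """Arias-intensity-like energy measure: sum of squared amplitudes."""
    return sum(a * a for a in rec) * dt

def log_misfit(observed, simulated):
    return math.log(simulated / observed)

dt = 0.01
observed  = [math.sin(0.2 * i) * math.exp(-0.01 * i) for i in range(500)]
simulated = [0.8 * a for a in observed]     # hypothetical 20% underprediction

pga_misfit = log_misfit(peak(observed), peak(simulated))
e_misfit = log_misfit(energy(observed, dt), energy(simulated, dt))
print(f"PGA misfit: {pga_misfit:.3f}, energy misfit: {e_misfit:.3f}")
```

Because the simulated record is a uniform scaling here, the amplitude misfit is ln 0.8 and the energy misfit is exactly twice that, which is a useful sanity check when wiring up real misfit pipelines.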
ERIC Educational Resources Information Center
Baytiyeh, Hoda; Naja, Mohamad K.
2014-01-01
Due to the high market demands for professional engineers in the Arab oil-producing countries, the appetite of Middle Eastern students for high-paying jobs and challenging careers in engineering has sharply increased. As a result, engineering programmes are providing opportunities for more students to enroll on engineering courses through lenient…
The 1906 earthquake and a century of progress in understanding earthquakes and their hazards
Zoback, M.L.
2006-01-01
The 18 April 1906 San Francisco earthquake killed nearly 3,000 people and left 225,000 residents homeless. Three days after the earthquake, an eight-person Earthquake Investigation Commission, assisted by some 25 geologists, seismologists, geodesists, biologists, and engineers, as well as some 300 others, started work under the supervision of Andrew Lawson to collect and document physical phenomena related to the quake. On 31 May 1906, the commission published a preliminary 17-page report titled "The Report of the State Earthquake Investigation Commission". The report included the bulk of the geological and morphological descriptions of the faulting and detailed reports on shaking intensity, as well as an impressive atlas of 40 oversized maps and folios. Nearly 100 years after its publication, the Commission Report remains a model for post-earthquake investigations. Because the diverse data sets were so complete and carefully documented, researchers continue to apply modern analysis techniques to learn from the 1906 earthquake. While the earthquake marked a seminal event in the history of California, it also served as the impetus for the birth of modern earthquake science in the United States.
Seismicity of the Earth 1900-2007
Tarr, Arthur C.; Villaseñor, Antonio; Furlong, Kevin P.; Rhea, Susan; Benz, Harley M.
2010-01-01
This map illustrates more than one century of global seismicity in the context of global plate tectonics and the Earth's physiography. Primarily designed for use by earth scientists and engineers interested in earthquake hazards of the 20th and early 21st centuries, this map provides a comprehensive overview of strong earthquakes since 1900. The map clearly identifies the location of the 'great' earthquakes (M8.0 and larger) and the rupture area, if known, of the M8.3 or larger earthquakes. The earthquake symbols are scaled proportional to the moment magnitude and therefore to the area of faulting, thus providing a better understanding of the relative sizes and distribution of earthquakes in the magnitude range 5.5 to 9.5. Plotting the known rupture area of the largest earthquakes also provides a better appreciation of the extent of some of the most famous and damaging earthquakes in modern history. All earthquakes shown on the map were carefully relocated using a standard earth reference model and standardized location procedures, thereby eliminating gross errors and biases in locations of historically important earthquakes that are often found in numerous seismicity catalogs.
Estimating Casualties for Large Earthquakes Worldwide Using an Empirical Approach
Jaiswal, Kishor; Wald, David J.; Hearne, Mike
2009-01-01
We developed an empirical country- and region-specific earthquake vulnerability model to be used as a candidate for post-earthquake fatality estimation by the U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER) system. The earthquake fatality rate is based on past fatal earthquakes (earthquakes causing one or more deaths) in individual countries where at least four fatal earthquakes occurred during the catalog period (since 1973). Because only a few dozen countries have experienced four or more fatal earthquakes since 1973, we propose a new global regionalization scheme that groups countries expected to have similar susceptibility to future earthquake losses, given the existing building stock, its vulnerability, and other socioeconomic characteristics. The fatality estimates obtained using the empirical country- or region-specific model will be used along with other selected engineering risk-based loss models to generate automated earthquake alerts. These alerts could help rapid-earthquake-response agencies and governments respond more effectively and reduce earthquake fatalities. Fatality estimates are also useful for stimulating earthquake preparedness planning and disaster mitigation. The proposed model has several advantages compared with other candidate methods, and the country- or region-specific fatality rates can be readily updated when new data become available.
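An empirical fatality-rate model of this kind is typically a monotonic function of shaking intensity, summed over exposed population. A sketch of that structure follows; the lognormal form is one commonly used for such curves, and the parameter values here are placeholders, not PAGER's calibrated country coefficients:

```python
import math

def fatality_rate(mmi, theta=12.0, beta=0.25):
    """Lognormal-CDF fatality-rate curve Phi(ln(mmi/theta)/beta).
    theta (intensity at 50% rate) and beta (spread) are placeholder
    values for illustration, not calibrated coefficients."""
    z = math.log(mmi / theta) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def expected_fatalities(exposure_by_mmi, theta=12.0, beta=0.25):
    """Sum rate(MMI) * exposed population over shaking levels."""
    return sum(pop * fatality_rate(mmi, theta, beta)
               for mmi, pop in exposure_by_mmi.items())
```

Updating such a model when new fatal earthquakes occur amounts to refitting theta and beta per country or region, which matches the abstract's point about readily updated rates.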
NASA Astrophysics Data System (ADS)
Kaneda, Y.; Takahashi, N.; Hori, T.; Kawaguchi, K.; Isouchi, C.; Fujisawa, K.
2017-12-01
Destructive natural disasters such as earthquakes and tsunamis occur frequently around the world. The 2004 Sumatra earthquake in Indonesia, the 2008 Wenchuan earthquake in China, the 2010 Chile earthquake, and the 2011 Tohoku earthquake in Japan all caused severe damage. Reducing and mitigating the damage from such disasters requires early detection and rapid, well-organized evacuation, together with hardware and software preparations. In Japan, DONET, a real-time ocean-floor monitoring system, has been developed and deployed around the Nankai trough seismogenic zone in southwestern Japan, and is expected to enable early detection of earthquakes and tsunamis there. Integrating real-time data with advanced simulation research will help reduce damage; a resilient society, however, also requires methods of recovery after disasters, namely restoration and revival. We therefore propose a natural disaster mitigation science covering early detection, evacuation, and restoration against destructive natural disasters. Natural disaster mitigation science spans many research fields, including natural science, engineering, medical treatment, social science, and the literature and arts. Natural science, engineering, and medical treatment are its fundamental fields, but social sciences such as sociology, geography, and psychology are essential for restoration after disasters. Finally, realizing and advancing disaster mitigation science requires the cultivation of human resources.
We have already carried out disaster mitigation science under the 'new disaster mitigation research project on mega-thrust earthquakes around the Nankai/Ryukyu subduction zone' and the 'SATREPS project on earthquake and tsunami disaster mitigation in the Marmara region and disaster education in Turkey'. We must continue to advance natural disaster mitigation science against the destructive natural disasters expected in the near future.
Engineering applications of strong ground motion simulation
NASA Astrophysics Data System (ADS)
Somerville, Paul
1993-02-01
The formulation, validation and application of a procedure for simulating strong ground motions for use in engineering practice are described. The procedure uses empirical source functions (derived from near-source strong motion recordings of small earthquakes) to provide a realistic representation of effects such as source radiation that are difficult to model at high frequencies due to their partly stochastic behavior. Wave propagation effects are modeled using simplified Green's functions that are designed to transfer empirical source functions from their recording sites to those required for use in simulations at a specific site. The procedure has been validated against strong motion recordings of both crustal and subduction earthquakes. For the validation process we choose earthquakes whose source models (including a spatially heterogeneous distribution of the slip of the fault) are independently known and which have abundant strong motion recordings. A quantitative measurement of the fit between the simulated and recorded motion in this validation process is used to estimate the modeling and random uncertainty associated with the simulation procedure. This modeling and random uncertainty is one part of the overall uncertainty in estimates of ground motions of future earthquakes at a specific site derived using the simulation procedure. The other contribution to uncertainty is that due to uncertainty in the source parameters of future earthquakes that affect the site, which is estimated from a suite of simulations generated by varying the source parameters over their ranges of uncertainty. In this paper, we describe the validation of the simulation procedure for crustal earthquakes against strong motion recordings of the 1989 Loma Prieta, California, earthquake, and for subduction earthquakes against the 1985 Michoacán, Mexico, and Valparaiso, Chile, earthquakes. 
We then show examples of the application of the simulation procedure to the estimation of design response spectra for crustal earthquakes at a power plant site in California and for subduction earthquakes in the Seattle-Portland region. We also demonstrate the use of simulation methods for modeling the attenuation of strong ground motion, and show evidence of the effect of critical reflections from the lower crust in causing the observed flattening of the attenuation of strong ground motion from the 1988 Saguenay, Quebec, and 1989 Loma Prieta earthquakes.
ERIC Educational Resources Information Center
Haddad, David Elias
2014-01-01
Earth's topographic surface forms an interface across which the geodynamic and geomorphic engines interact. This interaction is best observed along crustal margins where topography is created by active faulting and sculpted by geomorphic processes. Crustal deformation manifests as earthquakes at centennial to millennial timescales. Given that…
NASA Astrophysics Data System (ADS)
Li, Shuang; Yu, Xiaohui; Zhang, Yanjuan; Zhai, Changhai
2018-01-01
Casualty prediction for a building during earthquakes supports economic loss estimation within the performance-based earthquake engineering methodology. Although post-earthquake observations reveal that evacuation affects the number of occupant casualties, few current casualty prediction procedures consider occupant movement within the building. To bridge this knowledge gap, a numerical simulation method using a refined cellular automata model is presented, which can describe various occupant dynamic behaviors and building dimensions. The occupant evacuation simulation is verified against a recorded evacuation from a school classroom during the real-life 2013 Ya'an earthquake in China. Occupant casualties during earthquakes are evaluated by coupling, with time and space synchronization, a finite element simulation of the building collapse process, the occupant evacuation simulation, and casualty occurrence criteria. A case study of casualty prediction in a building during an earthquake demonstrates the effect of occupant movement on the predictions.
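The cellular-automaton idea behind such evacuation models can be illustrated with a toy one-dimensional corridor; the paper's refined model handles 2-D building geometry and richer occupant behaviors, so everything below (the update rule, the single exit at index 0) is purely illustrative:

```python
def evacuate(corridor, steps):
    """Toy 1-D cellular automaton: 1 = occupant, 0 = empty cell.
    Each step, the occupant at the exit (index 0) leaves, then every
    occupant moves one cell toward the exit if that cell is free.
    Returns the number of occupants who have exited after `steps`."""
    cells = list(corridor)
    exited = 0
    for _ in range(steps):
        if cells and cells[0] == 1:      # occupant at the exit leaves
            cells[0] = 0
            exited += 1
        for i in range(1, len(cells)):   # sweep toward the exit
            if cells[i] == 1 and cells[i - 1] == 0:
                cells[i - 1], cells[i] = 1, 0
    return exited
```

Coupling such a model with a collapse simulation, as the abstract describes, would amount to marking cells unsafe over time and checking occupant positions against those cells at each synchronized step.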
NASA Astrophysics Data System (ADS)
Xu, Peibin; Wen, Ruizhi; Wang, Hongwei; Ji, Kun; Ren, Yefei
2015-02-01
The Ludian County of Yunnan Province in southwestern China was struck by an MS6.5 earthquake on August 3, 2014, another destructive event following the MS8.0 Wenchuan earthquake in 2008, the MS7.1 Yushu earthquake in 2010, and the MS7.0 Lushan earthquake in 2013. The National Strong-Motion Observation Network System of China collected 74 strong motion recordings; the maximum peak ground acceleration, 949 cm/s2 in the E-W component, was recorded at station 053LLT in Longtoushan Town. The observed PGAs and spectral ordinates were compared with a ground-motion prediction equation for China and with NGA-West2, developed by the Pacific Earthquake Engineering Research Center. This earthquake is considered the first test case of the applicability of NGA-West2 in China. Results indicate that the observed PGAs and the 5% damped pseudo-response spectral accelerations are significantly lower than the predicted ones. A field survey around several typical strong motion stations verified that the earthquake damage was consistent with the official isoseismal map issued by the China Earthquake Administration.
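Comparisons of observed ground motions against a prediction equation are conventionally made with residuals in natural-log space; a minimal sketch (the function names are illustrative, and the inputs would be paired observed and GMPE-predicted values such as PGA in cm/s2):

```python
import math

def log_residuals(observed, predicted):
    """Natural-log residuals ln(obs) - ln(pred) for paired
    ground-motion values at each station."""
    return [math.log(o) - math.log(p) for o, p in zip(observed, predicted)]

def mean_residual(observed, predicted):
    """A negative mean residual indicates the model overpredicts
    the observations, as reported for the Ludian recordings."""
    r = log_residuals(observed, predicted)
    return sum(r) / len(r)
```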
NASA Astrophysics Data System (ADS)
Wu, Stephen
Earthquake early warning (EEW) systems have developed rapidly over the past decade. The Japan Meteorological Agency (JMA) operated an EEW system during the 2011 M9 Tohoku earthquake in Japan, which increased awareness of EEW systems around the world. While longer-term earthquake prediction still faces many practical challenges, shorter-term EEW opens a new door for earthquake loss mitigation. After an earthquake fault begins rupturing, an EEW system uses the first few seconds of recorded seismic waveform data to quickly predict the hypocenter location, magnitude, origin time, and the expected shaking intensity around the region. This early warning information is broadcast to different sites before the strong shaking arrives. The warning lead time of such a system is short, typically a few seconds to a minute or so, and the information is uncertain. These factors limit human intervention to activate mitigation actions and must be addressed before EEW can be applied in engineering. This study applies a Bayesian probabilistic approach, along with machine learning techniques and decision theories from economics, to improve different aspects of EEW operation, including extending it to engineering applications. Existing EEW systems are often based on a deterministic approach and commonly assume that only a single event occurs within a short period of time, an assumption that led to many false alarms after the Tohoku earthquake in Japan. This study develops a probability-based EEW algorithm, built on an existing deterministic model, that extends the EEW system to concurrent events, which are often observed during the aftershock sequence after a large earthquake. To overcome the challenges of uncertain information and the short lead time of EEW, this study also develops an earthquake probability-based automated decision-making (ePAD) framework to make robust decisions for EEW mitigation applications.
A cost-benefit model that can capture the uncertainties in EEW information and the decision process is used. This approach is called the Performance-Based Earthquake Early Warning, which is based on the PEER Performance-Based Earthquake Engineering method. Use of surrogate models is suggested to improve computational efficiency. Also, new models are proposed to add the influence of lead time into the cost-benefit analysis. For example, a value of information model is used to quantify the potential value of delaying the activation of a mitigation action for a possible reduction of the uncertainty of EEW information in the next update. Two practical examples, evacuation alert and elevator control, are studied to illustrate the ePAD framework. Potential advanced EEW applications, such as the case of multiple-action decisions and the synergy of EEW and structural health monitoring systems, are also discussed.
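The cost-benefit reasoning behind such automated EEW decisions can be reduced to an expected-cost comparison. The sketch below is a stripped-down illustration, not the ePAD framework's actual formulation, which also models lead time, uncertainty updates, and the value of delaying a decision; all inputs are hypothetical scalars:

```python
def should_activate(p_strong_shaking, cost_action,
                    loss_if_unmitigated, loss_if_mitigated):
    """Activate a mitigation action (e.g. halting an elevator) when
    the expected loss with the action, including its cost, is lower
    than the expected loss of doing nothing."""
    expected_loss_no_action = p_strong_shaking * loss_if_unmitigated
    expected_loss_action = cost_action + p_strong_shaking * loss_if_mitigated
    return expected_loss_action < expected_loss_no_action
```

With an uncertain EEW prediction, p_strong_shaking would itself come from the probabilistic algorithm described above, which is what makes a Bayesian treatment natural here.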
Strong motion seismology in Mexico
NASA Astrophysics Data System (ADS)
Singh, S. K.; Ordaz, M.
1993-02-01
Since 1985, digital accelerographs have been installed along a 500 km segment above the Mexican subduction zone, at some inland sites which form an attenuation line between the Guerrero seismic gap and Mexico City, and in the Valley of Mexico. These networks have recorded a few large earthquakes and many moderate and small earthquakes. Analysis of the data has permitted a significant advance in the understanding of source characteristics, wave propagation and attenuation, and site effects. This, in turn, has permitted reliable estimations of ground motions from future earthquakes. This paper presents a brief summary of some important results which are having a direct bearing on current earthquake engineering practice in Mexico.
Earthquake Risk Mitigation in the Tokyo Metropolitan area
NASA Astrophysics Data System (ADS)
Hirata, N.; Sakai, S.; Kasahara, K.; Nakagawa, S.; Nanjo, K.; Panayotopoulos, Y.; Tsuruoka, H.
2010-12-01
Seismic disaster risk mitigation in urban areas requires collaboration among scientific, engineering, and social-science fields. Examples of collaborative efforts include research on detailed plate structure with identification of all significant faults; development of dense seismic networks; strong ground motion prediction, which uses information on near-surface seismic site effects and fault models; earthquake-resistant structures; and cross-discipline infrastructure for effective risk mitigation immediately after catastrophic events. The risk mitigation strategy for the next great earthquake caused by the Philippine Sea plate (PSP) subducting beneath the Tokyo metropolitan area is of major concern, because this plate has caused past mega-thrust earthquakes such as the 1703 Genroku earthquake (magnitude M8.0) and the 1923 Kanto earthquake (M7.9), which caused 105,000 fatalities. An M7 or greater (M7+) earthquake in this area today has high potential to produce devastating loss of life and property, with even greater global economic repercussions. The Central Disaster Management Council of Japan estimates that such an M7+ earthquake would cause 11,000 fatalities and 112 trillion yen (about 1 trillion US$) in economic loss, and the Earthquake Research Committee of Japan evaluates its probability of occurrence at 70% within 30 years. To mitigate disaster for greater Tokyo, the Special Project for Earthquake Disaster Mitigation in the Tokyo Metropolitan Area (2007-2011) was launched in collaboration with scientists, engineers, and social scientists at institutions nationwide. The results obtained in the respective fields will be integrated before the project ends to improve the strategy assessment for seismic risk mitigation in the Tokyo metropolitan area. In this talk, we give an outline of our project as an example of collaborative research on earthquake risk mitigation.
Discussion is extended to our efforts in progress and the scientific results obtained so far at the Earthquake Research Institute (ERI). ERI hosts the scientific part of the project, focusing on characterization of the plate structure and source faults in and around the Tokyo metropolitan area. One topic is the ongoing deployment of seismic stations that constitute the Metropolitan Seismic Observation network (MeSO-net); we have deployed 226 stations at 2-5 km spacing. Based on seismic data obtained from the MeSO-net, we aim to reveal the detailed geometry of the subducting PSP.
NASA Astrophysics Data System (ADS)
Shanker, D.; Paudyal; Singh, H.
2010-12-01
Effective earthquake mitigation requires not only a basic understanding of the earthquake phenomenon and of the resistance offered by designed structures, but also an understanding of socio-economic factors, the engineering properties of indigenous materials, local skills, and technology-transfer models. It is important that the engineering aspects of mitigation be made part of public policy documents. Earthquakes are, and have long been regarded as, one of the worst enemies of mankind. Because of the way their energy is released, damage is inevitable; it will not, however, culminate in a disaster unless the earthquake strikes a populated area. Mitigation means reducing the severity of something; earthquake disaster mitigation therefore means taking measures that reduce the severity of earthquake damage to life, property, and the environment. "Earthquake disaster mitigation" usually refers primarily to interventions that strengthen the built environment, while "earthquake protection" is now considered to include the human, social, and administrative aspects of reducing earthquake effects. Reduction of earthquake hazards through prediction is considered one of the effective measures, and much effort is spent on prediction strategies; yet prediction does not guarantee safety, and even a correct prediction leaves potential damage on a scale that warrants the other aspects of mitigation. While earthquake prediction may be of some help, mitigation remains the main focus of civil society's attention. The present study suggests that anomalous seismic activity and earthquake swarms existed prior to medium-size earthquakes in the Nepal Himalaya. The mainshocks were preceded by quiescence periods, an indication of future seismic activity.
In all cases, the identified episodes of anomalous seismic activity were characterized by an extremely high annual earthquake frequency compared with the preceding normal and following gap episodes, and the characteristics of the events in such an episode are causally related to the magnitude and occurrence time of the forthcoming earthquake. It is observed that the shorter the preparatory period, the smaller the mainshock, and vice versa. Western Nepal and the adjoining Tibet region have potential for future medium-size earthquakes. Accordingly, it is estimated that an earthquake of M 6.5 ± 0.5 may occur at any time from now until December 2011 in Western Nepal, within an area bounded by 29.3°-30.5° N and 81.2°-81.9° E, in the focal depth range of 10-30 km.
Documentation for the Southeast Asia seismic hazard maps
Petersen, Mark; Harmsen, Stephen; Mueller, Charles; Haller, Kathleen; Dewey, James; Luco, Nicolas; Crone, Anthony; Lidke, David; Rukstales, Kenneth
2007-01-01
The U.S. Geological Survey (USGS) Southeast Asia Seismic Hazard Project originated in response to the 26 December 2004 Sumatra earthquake (M9.2) and the resulting tsunami that caused significant casualties and economic losses in Indonesia, Thailand, Malaysia, India, Sri Lanka, and the Maldives. During the course of this project, several great earthquakes ruptured subduction zones along the southern coast of Indonesia (fig. 1) causing additional structural damage and casualties in nearby communities. Future structural damage and societal losses from large earthquakes can be mitigated by providing an advance warning of tsunamis and introducing seismic hazard provisions in building codes that allow buildings and structures to withstand strong ground shaking associated with anticipated earthquakes. The Southeast Asia Seismic Hazard Project was funded through a United States Agency for International Development (USAID)—Indian Ocean Tsunami Warning System to develop seismic hazard maps that would assist engineers in designing buildings that will resist earthquake strong ground shaking. An important objective of this project was to discuss regional hazard issues with building code officials, scientists, and engineers in Thailand, Malaysia, and Indonesia. The code communities have been receptive to these discussions and are considering updating the Thailand and Indonesia building codes to incorporate new information (for example, see notes from Professor Panitan Lukkunaprasit, Chulalongkorn University in Appendix A).
A Bimodal Hybrid Model for Time-Dependent Probabilistic Seismic Hazard Analysis
NASA Astrophysics Data System (ADS)
Yaghmaei-Sabegh, Saman; Shoaeifar, Nasser; Shoaeifar, Parva
2018-03-01
Evidence from geological studies and historical catalogs indicates that in some seismic regions and on some faults, multiple large earthquakes occur in clusters. The occurrence of large earthquakes then gives way to quiescence, during which only small-to-moderate earthquakes take place. Clustering of large earthquakes is the most distinguishable departure from the constant-hazard assumption of random earthquake occurrence in conventional seismic hazard analysis. In the present study, a time-dependent recurrence model is proposed to represent series of large earthquakes that occur in clusters. The model is flexible enough to reflect the quasi-periodic behavior of large earthquakes with long-term clustering, and can be used in time-dependent probabilistic seismic hazard analysis for engineering purposes. In this model, the time-dependent hazard is estimated by a hazard function comprising three parts: a decreasing hazard from the last large-earthquake cluster, an increasing hazard toward the next large-earthquake cluster, and a constant hazard of random occurrence of small-to-moderate earthquakes. In the final part of the paper, the time-dependent seismic hazard of the New Madrid Seismic Zone at different time intervals is calculated for illustration.
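A three-part hazard function of the kind described, decreasing plus increasing plus constant, can be sketched as follows; the exponential forms and all parameter values are assumptions chosen only to illustrate the bathtub-shaped behavior, not the paper's fitted model:

```python
import math

def hazard(t, t_last, t_next, h0, a, b, lam):
    """Illustrative three-part hazard rate at time t:
    - a term decaying since the last large-earthquake cluster (t_last),
    - a term growing toward the next expected cluster (around t_next),
    - a constant background rate h0 for small-to-moderate events."""
    decreasing = a * math.exp(-lam * (t - t_last))
    increasing = b * math.exp(lam * (t - t_next))
    return decreasing + increasing + h0
```

The result is highest just after the last cluster and again as the next cluster approaches, with a quiet interval dominated by the constant background rate in between, which is the qualitative behavior the abstract describes.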
ERIC Educational Resources Information Center
Bautista, Nazan Uludag; Peters, Kari Nichole
2010-01-01
Can students build a house that is cost effective and strong enough to survive strong winds, heavy rains, and earthquakes? First graders in Ms. Peter's classroom worked like engineers to answer this question. They participated in a design challenge that required them to plan like engineers and build strong and cost-effective houses that would fit…
UNLV’s environmentally friendly Science and Engineering Building is monitored for earthquake shaking
Kalkan, Erol; Savage, Woody; Reza, Shahneam; Knight, Eric; Tian, Ying
2013-01-01
The University of Nevada Las Vegas’ (UNLV) Science and Engineering Building is at the cutting edge of environmentally friendly design. As the result of a recent effort by the U.S. Geological Survey’s National Strong Motion Project in cooperation with UNLV, the building is now also in the forefront of buildings installed with structural monitoring systems to measure response during earthquakes. This is particularly important because this is the first such building in Las Vegas. The seismic instrumentation will provide essential data to better understand the structural performance of buildings, especially in this seismically active region.
NASA Astrophysics Data System (ADS)
Toke, N.; Johnson, A.; Nelson, K.
2010-12-01
Earthquakes are one of the geologic processes most widely covered by the media. As a result, students, even at the middle school level, arrive in the classroom with preconceptions about the importance and hazards posed by earthquakes. Earthquakes therefore represent not only an attractive topic for engaging students when introducing tectonics, but also a means of helping students understand the relationships between geologic processes, society, and engineering solutions. Facilitating understanding of the fundamental connections between science and society is important for the preparation of future scientists and engineers as well as informed citizens. Here, we present a week-long lesson designed to be implemented in five one-hour sessions with classes of ~30 students. It consists of two inquiry-based mapping investigations, motivational presentations, and short readings that describe fundamental models of plate tectonics, faults, and earthquakes. The readings also provide examples of engineering solutions, such as the Alaska oil pipeline, which withstood multi-meter surface offset in the 2002 Denali Earthquake. The first inquiry-based investigation is a lesson on tectonic plates. Working in small groups, each group receives a different world map plotting both topography and one of the following data sets: GPS plate motion vectors, the locations and types of volcanoes, or the locations and types of earthquakes. Using these maps and an accompanying explanation of the data, each group's task is to map plate boundary locations. Each group then presents a ~10 minute summary of the type of data they used and their interpretation of the tectonic plates, with a poster and their mapping results. Finally, the instructor facilitates a class discussion about how the data types could be combined to understand more about plate boundaries. Using student interpretations of real data allows student misconceptions to become apparent.
Throughout the exercise we record student preconceptions and post them to a bulletin board. During the tectonics unit we use these preconceptions as teaching tools. We also archive the misconceptions via a website which will be available for use by the broader geoscience education community. The second student investigation focuses on understanding the impact earthquakes have on nearby cities. We use the example of the 2009 southern San Andreas Fault (SAF) shakeout scenario. Students again break into groups. Each group is given an aspect of urban infrastructure to study relative to the underlying geology and location of nearby faults. Their goal is to uncover potential urban infrastructure issues related to a major earthquake on the SAF. For example students will map transportation ways crossing the fault, the location of hospitals relative to forecasted shaking hazards, the location of poverty-stricken areas relative to shaking hazards, and utilities relative to fault crossings. Again, students are tasked with explaining their investigation and analyses to the class with ample time for discussion about potential ways to solve problems identified through their investigations.
NASA Astrophysics Data System (ADS)
Cydzik, K.; Hamilton, D.; Stenner, H. D.; Cattarossi, A.; Shrestha, P. L.
2009-12-01
The May 12, 2008 M7.9 Wenchuan Earthquake in Sichuan Province, China killed almost 90,000 people and affected a population of over 45.5 million throughout western China. Shaking destroyed five million buildings, many of them homes and schools, and damaged 21 million other structures, inflicting devastating impacts on communities. Landslides, a secondary effect of the shaking, caused much of the devastation: debris flows buried schools and homes, rock falls crushed cars, and rockslides, landslides, and rock avalanches blocked streams and rivers, creating massive, unstable landslide dams that formed "quake lakes" upstream of the blockages. Impassable roads made emergency access slow and extremely difficult. Collapses of buildings and structures large and small took many lives. Damage to infrastructure impaired communication, cut off water supplies and electricity, and put authorities on high alert as the integrity of large engineered dams was reviewed. During our field reconnaissance three months after the disaster, the extent of the tragedy was undeniably apparent. Observing the damage throughout Sichuan reminded us that earthquakes in the United States and throughout the world routinely cause widespread damage and destruction to lives, property, and infrastructure. The focus of this poster is to present observations and findings from our field reconnaissance regarding the scale of earthquake destruction with respect to slope failures, landslide dams, damage to infrastructure (e.g., schools, engineered dams, buildings, roads, rail lines, and water resources facilities), human habitation within the region, and the mitigation and response effort, presented in the context of policy measures that could be developed to reduce the risk of similar catastrophes.
The rapid response of the Chinese government and the mobilization of the Chinese People’s Liberation Army to help the communities affected by the earthquake have allowed survivors to begin rebuilding their lives. However, the long-term impact of the earthquake continues to make headlines: post-earthquake landslides and debris flows initiated by storm events have continued to devastate the region. Events such as the Wenchuan Earthquake provide unique opportunities for engineers, scientists, and policy makers to collaborate in exploring the details of natural hazards and developing sound policies to protect lives and property in the future.
Post-earthquake building safety inspection: Lessons from the Canterbury, New Zealand, earthquakes
Marshall, J.; Jaiswal, Kishor; Gould, N.; Turner, F.; Lizundia, B.; Barnes, J.
2013-01-01
The authors discuss some of the unique aspects and lessons of the New Zealand post-earthquake building safety inspection program that was implemented following the Canterbury earthquake sequence of 2010–2011. The post-event safety assessment program was one of the largest and longest programs undertaken in recent times anywhere in the world. The effort engaged hundreds of engineering professionals throughout the country, and also sought expertise from outside, to perform post-earthquake structural safety inspections of more than 100,000 buildings in the city of Christchurch and the surrounding suburbs. While the building safety inspection procedure implemented was analogous to the ATC 20 program in the United States, many modifications were proposed and implemented in order to assess the large number of buildings that were subjected to strong and variable shaking during a period of two years. This note discusses some of the key aspects of the post-earthquake building safety inspection program and summarizes important lessons that can improve future earthquake response.
Chapter B. The Loma Prieta, California, Earthquake of October 17, 1989 - Highway Systems
Yashinsky, Mark
1998-01-01
This paper summarizes the impact of the Loma Prieta earthquake on highway systems. City streets, urban freeways, county roads, state routes, and the national highway system were all affected. There was damage to bridges, roads, tunnels, and other highway structures. The most serious damage occurred in the cities of San Francisco and Oakland, 60 miles from the fault rupture. The cost to repair and replace highways damaged by this earthquake was $2 billion. About half of this cost was to replace the Cypress Viaduct, a long, elevated double-deck expressway that had a devastating collapse which resulted in 42 deaths and 108 injuries. The earthquake also resulted in some positive changes for highway systems. Research on bridges and earthquakes began to be funded at a much higher level. Retrofit programs were started to upgrade the seismic performance of the nation's highways. The Loma Prieta earthquake changed earthquake policy and engineering practice for highway departments not only in California, but all over the world.
Celsi, R.; Wolfinbarger, M.; Wald, D.
2005-01-01
The purpose of this research is to explore earthquake risk perceptions in California. Specifically, we examine the risk beliefs, feelings, and experiences of lay, professional, and expert individuals to explore how risk is perceived and how risk perceptions are formed relative to earthquakes. Our results indicate that individuals tend to perceptually underestimate the degree to which earthquake (EQ) events may affect them. This occurs in large part because individuals' personal felt experience of EQ events is generally overestimated relative to experienced magnitudes. An important finding is that individuals engage in a process of "cognitive anchoring" of their felt EQ experience towards the reported earthquake magnitude. The anchoring effect is moderated by the degree to which individuals comprehend EQ magnitude measurement and EQ attenuation. Overall, the results of this research provide us with a deeper understanding of EQ risk perceptions, especially as they relate to individuals' understanding of EQ measurement and attenuation concepts. © 2005, Earthquake Engineering Research Institute.
Reduction of earthquake risk in the united states: Bridging the gap between research and practice
Hays, W.W.
1998-01-01
Continuing efforts under the auspices of the National Earthquake Hazards Reduction Program are under way to improve earthquake risk assessment and risk management in earthquake-prone regions of Alaska, California, Nevada, Washington, Oregon, Arizona, Utah, Wyoming, and Idaho, the New Madrid and Wabash Valley seismic zones in the central United States, the southeastern and northeastern United States, Puerto Rico, Virgin Islands, Guam, and Hawaii. Geologists, geophysicists, seismologists, architects, engineers, urban planners, emergency managers, health care specialists, and policymakers are having to work at the margins of their disciplines to bridge the gap between research and practice and to provide a social, technical, administrative, political, legal, and economic basis for changing public policies and professional practices in communities where the earthquake risk is unacceptable. © 1998 IEEE.
A global building inventory for earthquake loss estimation and risk management
Jaiswal, K.; Wald, D.; Porter, K.
2010-01-01
We develop a global database of building inventories using a taxonomy of global building types for use in near-real-time post-earthquake loss estimation and pre-earthquake risk analysis for the U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER) program. The database is available for public use, subject to peer review, scrutiny, and open enhancement. On a country-by-country level, it contains estimates of the distribution of building types categorized by material, lateral force resisting system, and occupancy type (residential or nonresidential, urban or rural). The database draws on and harmonizes numerous sources: (1) UN statistics, (2) UN Habitat's demographic and health survey (DHS) database, (3) national housing censuses, (4) the World Housing Encyclopedia, and (5) other literature. © 2010, Earthquake Engineering Research Institute.
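As a rough illustration of the kind of country-level distribution the abstract describes, the sketch below models a per-country inventory categorized by building type and occupancy. The category codes, fractions, and country name are hypothetical and do not come from the actual PAGER database.

```python
from dataclasses import dataclass, field

@dataclass
class CountryInventory:
    """Illustrative per-country building-inventory record (not PAGER's schema)."""
    country: str
    # Maps a building-type code to its estimated fraction of the stock,
    # kept separately for each occupancy class.
    urban_residential: dict = field(default_factory=dict)
    rural_residential: dict = field(default_factory=dict)

    def dominant_type(self, occupancy: str) -> str:
        """Return the most common building type for an occupancy class."""
        dist = getattr(self, occupancy)
        return max(dist, key=dist.get)

# Hypothetical example entry; fractions within each class sum to 1.
inv = CountryInventory(
    country="Examplestan",
    urban_residential={"RC_frame": 0.45, "URM": 0.35, "Wood": 0.20},
    rural_residential={"Adobe": 0.60, "URM": 0.30, "Wood": 0.10},
)

assert abs(sum(inv.urban_residential.values()) - 1.0) < 1e-9
print(inv.dominant_type("urban_residential"))  # most common urban type
```

A loss-estimation step would then weight per-type fragility by these fractions for each country.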
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hafen, D.; Kintzer, F.C.
1977-11-01
The correlation between ground motion and building damage was investigated for the San Fernando earthquake of 1971. A series of iso-intensity maps was compiled to summarize the ground motion in terms of the Blume Engineering Intensity Scale (EIS). This involved the analysis of ground motion records from 62 stations in the Los Angeles area. Damage information for low-rise buildings was obtained in the form of records of loans granted by the Small Business Administration to repair earthquake damage. High-rise damage evaluations were based on direct inquiry and building inspection. Damage factors (ratio of damage repair cost to building value) were calculated and summarized on contour maps. A statistical study was then undertaken to determine relationships between ground motion and damage factor. Several parameters for ground motion were considered and evaluated by means of correlation coefficients.
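The damage-factor statistic described above can be sketched as follows; the repair costs, building values, and ground-motion values are invented for illustration and are not the original study's data or method.

```python
import math

# Damage factor = repair cost / building value, then correlated against a
# ground-motion parameter. All numbers are made up for illustration.
repair_cost = [12_000.0, 5_000.0, 30_000.0, 1_000.0]        # USD
building_value = [400_000.0, 250_000.0, 500_000.0, 200_000.0]
pga = [0.30, 0.15, 0.45, 0.05]  # hypothetical peak ground acceleration (g)

damage_factor = [c / v for c, v in zip(repair_cost, building_value)]

def pearson(x, y):
    """Plain Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r = pearson(pga, damage_factor)
print(round(r, 3))  # correlation between ground motion and damage factor
```

With these toy numbers the correlation is strongly positive, mirroring the qualitative finding that damage factors track ground-motion intensity.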
ShakeNet: a portable wireless sensor network for instrumenting large civil structures
Kohler, Monica D.; Hao, Shuai; Mishra, Nilesh; Govindan, Ramesh; Nigbor, Robert
2015-08-03
We report our findings from a U.S. Geological Survey (USGS) National Earthquake Hazards Reduction Program-funded project to develop and test a wireless, portable, strong-motion network of up to 40 triaxial accelerometers for structural health monitoring. The overall goal of the project was to record ambient vibrations for several days from USGS-instrumented structures. Structural health monitoring has important applications in fields such as civil engineering and the study of earthquakes, and the emergence of wireless sensor networks provides a promising means of supporting such applications. However, while most wireless sensor networks are still in the experimentation stage, very few take into consideration realistic earthquake engineering application requirements. Collecting comprehensive data for structural health monitoring requires high-resolution vibration sensors and sufficient sampling rates, which challenges current wireless sensor network technology in three respects: processing capability, storage capacity, and communication bandwidth. The wireless sensor network has to meet expectations set by the wired sensor devices prevalent in the structural health monitoring community. For this project, we built and tested an application-realistic, commercially based, portable, wireless sensor network called ShakeNet for the instrumentation of large civil structures, especially buildings, bridges, or dams after earthquakes. Two to three people can deploy ShakeNet sensors within hours after an earthquake to measure the structural response of a building or bridge during aftershocks. ShakeNet involved the development of a new sensing platform (ShakeBox) running a software suite for networking, data collection, and monitoring. Deployments reported here on a tall building and a large dam were real-world tests of ShakeNet operation and helped to refine both hardware and software.
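A back-of-the-envelope calculation illustrates why sampling rate and resolution strain wireless bandwidth and storage, as the abstract notes. The sampling rate and 24-bit sample width below are assumed values typical of strong-motion recording, not published ShakeBox specifications.

```python
# Aggregate raw data rate for a full network of triaxial accelerometers.
sensors = 40          # triaxial accelerometers in the network
channels = 3          # axes per sensor
rate_hz = 200         # samples per second per channel (assumed)
bytes_per_sample = 3  # 24-bit resolution (assumed)

bytes_per_second = sensors * channels * rate_hz * bytes_per_sample
mb_per_day = bytes_per_second * 86_400 / 1e6  # raw volume per day, MB

print(bytes_per_second)   # aggregate rate in bytes/s
print(round(mb_per_day))  # ~6 GB/day before any compression
```

Several gigabytes per day of raw samples must be buffered, stored, or forwarded over shared radio links, which is exactly the processing/storage/bandwidth pressure described above.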
NASA Astrophysics Data System (ADS)
Özyaşar, M.; Özlüdemir, M. T.
2011-06-01
Global Navigation Satellite Systems (GNSS) are space-based positioning techniques widely used in geodetic applications. Geodetic networking accomplished by engineering surveys constitutes one of these tasks. Geodetic networks are used as the base of all kinds of geodetic implementations, from cadastral plans to the relevant surveying processes during the realization of engineering applications. Geodetic networks consist of control points positioned in a defined reference frame. In fact, such positional information can be useful for other studies as well. One such field is geodynamics, which uses the changes in positions of control stations within a network over a certain time period to understand the characteristics of tectonic movements. In Turkey, which is located in tectonically active zones and struck by major earthquakes quite frequently, the positional information obtained in engineering surveys can be very useful for earthquake-related studies. For this purpose, a GPS (Global Positioning System) network of 650 stations distributed over Istanbul (Istanbul GPS Triangulation Network, abbreviated IGNA) covering the northern part of the North Anatolian Fault Zone (NAFZ) was established in 1997 and measured in 1999. From 1998 to 2004, the IGNA network was extended to 1888 stations covering an area of about 6000 km2, the whole administrative area of Istanbul. All 1888 stations within the IGNA network were remeasured in 2005. The two campaigns had 452 common points, and between the two campaigns two major earthquakes took place, on 17 August and 12 November 1999, with magnitudes of 7.4 and 7.2, respectively. Several studies conducted to estimate the horizontal and vertical displacements caused by these earthquakes on the NAFZ are discussed in this paper. In geodynamic projects carried out before the earthquakes in 1999, an annual average velocity of 2-2.5 cm was estimated for the stations along the NAFZ.
Studies carried out using GPS observations in the same area after these earthquakes indicated that point displacements vary depending on their distance to the epicentres of the earthquakes. But the directions of point displacements are similar. The results obtained through the analysis of the IGNA network also show that there is a common trend in the directions of point displacements in the study area. In this paper, the past studies about the tectonics of Marmara region are summarised and the results of the displacement analysis on the IGNA network are discussed.
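The campaign-to-campaign displacement analysis described above reduces, per station, to differencing coordinates from the two measurement epochs. The sketch below uses invented local east/north coordinates (not actual IGNA results) to show the computation of displacement magnitude and direction.

```python
import math

# Hypothetical station coordinates (local east/north, metres) from two
# GPS campaigns; the values are invented for illustration only.
e1999, n1999 = 412_345.120, 4_512_678.340   # first campaign
e2005, n2005 = 412_345.215, 4_512_678.290   # second campaign

de, dn = e2005 - e1999, n2005 - n1999
displacement = math.hypot(de, dn)                    # horizontal displacement (m)
azimuth = math.degrees(math.atan2(de, dn)) % 360.0   # direction from north, clockwise

print(round(displacement * 100, 1), "cm")
print(round(azimuth, 1), "deg")
```

Repeating this for every common point, and comparing the resulting vectors station by station, reveals the common trend in displacement directions that the analysis reports.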
Compendium of Abstracts on Statistical Applications in Geotechnical Engineering.
1983-09-01
research in the application of probabilistic and statistical methods to soil mechanics, rock mechanics, and engineering geology problems have grown markedly...probability, statistics, soil mechanics, rock mechanics, and engineering geology. 2. The purpose of this report is to make available to the U. S...Deformation Dynamic Response Analysis Seepage, Soil Permeability and Piping Earthquake Engineering, Seismology, Settlement and Heave Seismic Risk Analysis
The Virtual Data Center Tagged-Format Tool - Introduction and Executive Summary
Evans, John R.; Squibb, Melinda; Stephens, Christopher D.; Savage, W.U.; Haddadi, Hamid; Kircher, Charles A.; Hachem, Mahmoud M.
2008-01-01
This Report introduces and summarizes the new Virtual Data Center (VDC) Tagged Format (VTF) Tool, which was developed by a diverse group of seismologists, earthquake engineers, and information technology professionals for internal use by the COSMOS VDC and other interested parties for the exchange, archiving, and analysis of earthquake strong-ground-motion data.
NASA Astrophysics Data System (ADS)
Van Daele, Maarten; Araya-Cornejo, Cristian; Pille, Thomas; Meyer, Inka; Kempf, Philipp; Moernaut, Jasper; Cisternas, Marco
2017-04-01
One of the main challenges in seismically active regions is differentiating paleo-earthquakes resulting from different fault systems, such as the megathrust versus intraplate faults in subduction settings. Such differentiation is, however, key for hazard assessments based on paleoseismic records. Laguna Lo Encañado (33.7°S; 70.3°W; 2492 m a.s.l.) is located in the Central Chilean Andes, 50 km east of Santiago de Chile, a metropolis with about 7,000,000 inhabitants. During the last century the study area experienced 3 large megathrust earthquakes (1906, 1985 and 2010) and 2 intraplate earthquakes (1945 and 1958) (Lomnitz, 1960). While the megathrust earthquakes cause Modified Mercalli Intensities (MMIs) of VI to VII at the lake (Van Daele et al., 2015), the intraplate earthquakes cause peak MMIs up to IX (Sepúlveda et al., 2008). Here we present a turbidite record of Laguna Lo Encañado going back to 1900 AD. While geophysical data (3.5 kHz subbottom seismic profiles and side-scan sonar data) provide the bathymetry and an overview of the sedimentary environment, we study 15 short cores in order to understand the depositional processes resulting in the encountered lacustrine turbidites. All mentioned earthquakes triggered turbidites in the lake, which are all linked to slumps in proximal areas and thus result from mass wasting of the subaquatic slopes. However, turbidites linked to the intraplate earthquakes are additionally covered by turbidites of a finer-grained, more clastic nature. We link the latter to post-seismic erosion of onshore landslides, which need higher MMIs to be triggered than subaquatic mass movements (Howarth et al., 2014). While intraplate earthquakes can cause MMIs up to IX and higher, megathrust earthquakes do not cause sufficiently high MMIs at the lake to trigger voluminous onshore landslides.
Hence, the presence of these post-seismic turbidites makes it possible to distinguish turbidites triggered by intraplate earthquakes from those triggered by megathrust earthquakes. These findings are an important step forward in the interpretation of lacustrine turbidites in subduction settings, and will eventually improve hazard assessments based on such paleoseismic records in the study area and in other subduction zones. References: Howarth et al., 2014. Lake sediments record high intensity shaking that provides insight into the location and rupture length of large earthquakes on the Alpine Fault, New Zealand. Earth and Planetary Science Letters 403, 340-351. Lomnitz, 1960. A study of the Maipo Valley earthquakes of September 4, 1958, Second World Conference on Earthquake Engineering, Tokyo and Kyoto, Japan, pp. 501-520. Sepúlveda et al., 2008. New Findings on the 1958 Las Melosas Earthquake Sequence, Central Chile: Implications for Seismic Hazard Related to Shallow Crustal Earthquakes in Subduction Zones. Journal of Earthquake Engineering 12, 432-455. Van Daele et al., 2015. A comparison of the sedimentary records of the 1960 and 2010 great Chilean earthquakes in 17 lakes: Implications for quantitative lacustrine palaeoseismology. Sedimentology 62, 1466-1496.
The Engineering Strong Ground Motion Network of the National Autonomous University of Mexico
NASA Astrophysics Data System (ADS)
Velasco Miranda, J. M.; Ramirez-Guzman, L.; Aguilar Calderon, L. A.; Almora Mata, D.; Ayala Hernandez, M.; Castro Parra, G.; Molina Avila, I.; Mora, A.; Torres Noguez, M.; Vazquez Larquet, R.
2014-12-01
The coverage, design, operation, and monitoring capabilities of the strong ground motion program at the Institute of Engineering (IE) of the National Autonomous University of Mexico (UNAM) are presented. Started in 1952, the seismic instrumentation, initially intended to bolster earthquake engineering projects in Mexico City, has evolved into the largest strong ground motion monitoring system in the region. Today, it provides information not only to engineering projects but also to the near-real-time risk mitigation systems of the country, and enhances the general understanding of the effects and causes of earthquakes in Mexico. The IE network includes more than 100 free-field stations and several buildings, covering the largest urban centers and zones of significant seismicity in Central Mexico. Of those stations, approximately one-fourth send the observed acceleration to a processing center in Mexico City continuously; the rest require either periodic visits for manual recovery of the data or remote interrogation, for later processing and cataloging. In this research, we document the procedures and telecommunications systems used systematically to recover information. Additionally, we analyze the spatial distribution of the free-field accelerographs, the quality of the instrumentation, and the recorded ground motions. The evaluation criteria are based on: 1) the uncertainty in the generation of ground motion parameter maps due to the spatial distribution of the stations, 2) the potential of the array to provide localization and magnitude estimates for earthquakes with magnitudes greater than Mw 5, and 3) the adequacy of the network for the development of Ground Motion Prediction Equations for interplate and intraslab earthquakes. We conclude that the monitoring system requires a redistribution of stations, additional stations, and a substantial improvement in the instrumentation and telecommunications.
Finally, we present an integral plan to improve the current network's monitoring capabilities.
An Investigation on the Crustal Deformations in Istanbul after Eastern Marmara Earthquakes in 1999
NASA Astrophysics Data System (ADS)
Ozludemir, M.; Ozyasar, M.
2008-12-01
Since the introduction of the GPS technique in the mid-1970s, there have been great advances in positioning activities. Today such Global Navigational Satellite Systems (GNSS) based positioning techniques are widely used in daily geodetic applications. High-order geodetic network measurements are one such application. Such networks are established to provide reliable infrastructure for all kinds of geodetic work, from the production of cadastral plans to the surveying processes during the construction of engineering structures. In fact, the positional information obtained in such engineering surveys can be useful for other studies as well. One such field is geodynamic studies, where positional information can be valuable for understanding the characteristics of tectonic movements. In Turkey, which is located in tectonically active zones and experiences major earthquakes quite frequently, the positional information obtained in engineering surveys can be very useful for earthquake-related studies. In this paper an example of such engineering surveys is discussed: the Istanbul GPS (Global Positioning System) Network, first established in 1997 and remeasured in 2005. Between these two measurement processes two major earthquakes took place, on August 17 and November 12, 1999, with magnitudes of 7.4 and 7.2, respectively. In the first measurement campaign in 1997, a network of about 700 points was measured, while in the second campaign in 2005 more than 1800 points were positioned. The two campaigns share a number of common points. The network covers the whole Istanbul area of about 6000 km2. All network points are located on the Eurasian plate to the north of the North Anatolian Fault Zone. In this study, the horizontal and vertical movements are presented and compared with the results obtained in geodynamic studies.
The SCEC Broadband Platform: Open-Source Software for Strong Ground Motion Simulation and Validation
NASA Astrophysics Data System (ADS)
Goulet, C.; Silva, F.; Maechling, P. J.; Callaghan, S.; Jordan, T. H.
2015-12-01
The Southern California Earthquake Center (SCEC) Broadband Platform (BBP) is a carefully integrated collection of open-source scientific software programs that can simulate broadband (0-100 Hz) ground motions for earthquakes at regional scales. The BBP scientific software modules implement kinematic rupture generation, low- and high-frequency seismogram synthesis using wave propagation through 1D layered velocity structures, ground motion amplitude calculations, and goodness-of-fit measurements. These modules are integrated into a software system that provides user-defined, repeatable calculation of ground motion seismograms using multiple alternative ground motion simulation methods, along with software utilities that can generate plots, charts, and maps. The BBP has been developed over the last five years in a collaborative scientific, engineering, and software development project involving geoscientists, earthquake engineers, graduate students, and SCEC scientific software developers. The BBP can run earthquake rupture and wave propagation modeling software to simulate ground motions for well-observed historical earthquakes and to quantify how well the simulated broadband seismograms match the observed seismograms. The BBP can also run simulations for hypothetical earthquakes. In this case, users input an earthquake location and magnitude description, a list of station locations, and a 1D velocity model for the region of interest, and the BBP software then calculates ground motions for the specified stations. The SCEC BBP software released in 2015 can be compiled and run on recent Linux systems with GNU compilers. It includes 5 simulation methods, 7 simulation regions covering California, Japan, and Eastern North America, the ability to compare simulation results against GMPEs, updated ground motion simulation methods, and a simplified command-line user interface.
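One common form of the goodness-of-fit comparison mentioned above is the mean natural-log residual between observed and simulated amplitudes across periods. The sketch below uses made-up spectral values and is a generic form of this metric, not necessarily the exact one implemented in the BBP.

```python
import math

# Toy observed and simulated spectral amplitudes at a few periods;
# all values are invented for illustration.
periods = [0.1, 0.2, 0.5, 1.0, 2.0]          # s
observed = [0.80, 0.95, 0.60, 0.30, 0.12]    # e.g. spectral acceleration (g)
simulated = [0.70, 1.00, 0.55, 0.33, 0.10]

# Log residual per period; positive means the simulation underpredicts.
residuals = [math.log(o / s) for o, s in zip(observed, simulated)]

# Model bias: the mean log residual; zero indicates no average misfit.
bias = sum(residuals) / len(residuals)

print(round(bias, 4))
```

In validation exercises this bias is typically plotted as a function of period, with its standard deviation, to show where a simulation method over- or under-predicts.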
Adjoint-tomography for a Local Surface Structure: Methodology and a Blind Test
NASA Astrophysics Data System (ADS)
Kubina, Filip; Michlik, Filip; Moczo, Peter; Kristek, Jozef; Stripajova, Svetlana
2017-04-01
We have developed a multiscale full-waveform adjoint-tomography method for local surface sedimentary structures with complicated interference wavefields. Local surface sedimentary basins and valleys are often responsible for anomalous earthquake ground motions and corresponding damage in earthquakes. In many cases only a relatively small number of records from a few local earthquakes is available for a site of interest. Consequently, prediction of earthquake ground motion at the site has to include numerical modeling for a realistic model of the local structure. Though limited, the information about the local structure encoded in the records is important and irreplaceable. It is therefore reasonable to have a method capable of using the limited information in records to improve a model of the local structure. A local surface structure and its interference wavefield require a specific multiscale approach. In order to verify our inversion method, we performed a blind test. We obtained synthetic seismograms at 8 receivers for 2 local sources, a complete description of the sources, the positions of the receivers, and the material parameters of the bedrock. We considered the simplest possible starting model: a homogeneous halfspace made of the bedrock. Using our inversion method we obtained an inverted model. Given the starting model, synthetic seismograms simulated for the inverted model are surprisingly close to the synthetic seismograms simulated for the true structure in the target frequency range up to 4.5 Hz. We quantify the level of agreement between the true and inverted seismograms using the L2 and time-frequency misfits and, more importantly for earthquake-engineering applications, also using goodness-of-fit criteria based on the earthquake-engineering characteristics of earthquake ground motion. We also verified the inverted model for other source-receiver configurations not used in the inversion.
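The L2 misfit mentioned above can be illustrated as a normalized waveform-difference energy. The toy traces below stand in for the true and inverted seismograms; this is the generic L2 form, not the authors' full time-frequency or engineering-characteristic criteria.

```python
import math

# Toy "true" seismogram: a 2 Hz sinusoid sampled at 100 Hz.
n, dt = 200, 0.01
t = [i * dt for i in range(n)]
true_trace = [math.sin(2 * math.pi * 2.0 * ti) for ti in t]

# Toy "inverted" seismogram: a slightly scaled copy of the true one.
inverted = [0.95 * a for a in true_trace]

# Normalized L2 misfit: energy of the difference over energy of the
# reference trace, square-rooted so a 5% amplitude error gives 0.05.
num = sum((a - b) ** 2 for a, b in zip(true_trace, inverted))
den = sum(a ** 2 for a in true_trace)
misfit = math.sqrt(num / den)

print(round(misfit, 3))
```

A misfit near zero indicates the inverted model reproduces the reference waveform almost exactly, which is the sense in which the blind-test seismograms are "surprisingly close."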
Response and recovery lessons from the 2010-2011 earthquake sequence in Canterbury, New Zealand
Pierepiekarz, Mark; Johnston, David; Berryman, Kelvin; Hare, John; Gomberg, Joan S.; Williams, Robert A.; Weaver, Craig S.
2014-01-01
The impacts and opportunities that result when low-probability moderate earthquakes strike an urban area similar to many throughout the US were vividly conveyed in a one-day workshop in which social and Earth scientists, public officials, engineers, and an emergency manager shared their experiences of the earthquake sequence that struck the city of Christchurch and surrounding Canterbury region of New Zealand in 2010-2011. Without question, the earthquake sequence has had unprecedented impacts in all spheres of New Zealand society, locally to nationally: 10% of the country's population was directly impacted, and losses total 8-10% of GDP. The following paragraphs present a few lessons from Christchurch.
2013-02-01
Kathmandu, Nepal. Contract number: W911NF-12-1-0282. In the past, big earthquakes in Nepal (see Figure 1.1) have caused a huge number of casualties and damage to structures, including the Great Nepal-Bihar... UBC Earthquake Engineering Research Facility, 2235 East Mall, Vancouver, BC, Canada V6T 1Z4; phone: 604 822-6203; fax: 604 822-6901.
Gori, Paula L.
1993-01-01
INTERACTIVE WORKSHOPS: ESSENTIAL ELEMENTS OF THE EARTHQUAKE HAZARDS RESEARCH AND REDUCTION PROGRAM IN THE WASATCH FRONT, UTAH: Interactive workshops provided the forum and stimulus necessary to foster collaboration among the participants in the multidisciplinary, 5-yr program of earthquake hazards reduction in the Wasatch Front, Utah. The workshop process validated well-documented social science theories on the importance of interpersonal interaction, including interaction between researchers and users of research to increase the probability that research will be relevant to the user's needs and, therefore, more readily used. REDUCING EARTHQUAKE HAZARDS IN UTAH: THE CRUCIAL CONNECTION BETWEEN RESEARCHERS AND PRACTITIONERS: Complex scientific and engineering studies must be translated for and transferred to nontechnical personnel for use in reducing earthquake hazards in Utah. The three elements needed for effective translation, likelihood of occurrence, location, and severity of potential hazards, and the three elements needed for effective transfer, delivery, assistance, and encouragement, are described and illustrated for Utah. The importance of evaluating and revising earthquake hazard reduction programs and their components is emphasized. More than 30 evaluations of various natural hazard reduction programs and techniques are introduced. This report was prepared for research managers, funding sources, and evaluators of the Utah earthquake hazard reduction program who are concerned about effectiveness. An overview of the Utah program is provided for those researchers, engineers, planners, and decisionmakers, both public and private, who are committed to reducing human casualties, property damage, and interruptions of socioeconomic systems. 
PUBLIC PERCEPTIONS OF THE IMPLEMENTATION OF EARTHQUAKE MITIGATION POLICIES ALONG THE WASATCH FRONT IN UTAH: The earthquake hazard potential along the Wasatch Front in Utah has been well defined by a number of scientific and engineering studies. Translated earthquake hazard maps have also been developed to identify areas that are particularly vulnerable to various causes of damage such as ground shaking, surface rupturing, and liquefaction. The implementation of earthquake hazard reduction plans is now under way in various communities in Utah. The results of a survey presented in this paper indicate that technical public officials (planners and building officials) have an understanding of the earthquake hazards and how to mitigate the risks. Although the survey shows that the general public has a slightly lower concern about the potential for economic losses, they recognize the potential problems and can support a number of earthquake mitigation measures. The study suggests that many community groups along the Wasatch Front, including volunteer groups, business groups, and elected and appointed officials, are ready for action-oriented educational programs. These programs could lead to a significant reduction in the risks associated with earthquake hazards. A DATA BASE DESIGNED FOR URBAN SEISMIC HAZARDS STUDIES: A computerized data base has been designed for use in urban seismic hazards studies conducted by the U.S. Geological Survey. The design includes file structures for 16 linked data sets, which contain geological, geophysical, and seismological data used in preparing relative ground response maps of large urban areas. The data base is organized along relational data base principles. A prototype urban hazards data base has been created for evaluation in two urban areas currently under investigation: the Wasatch Front region of Utah and the Puget Sound area of Washington.
The initial implementation of the urban hazards data base was accomplished on a microcomputer using dBASE III Plus software and transferred to minicomputers and a work station. A MAPPING OF GROUND-SHAKING INTENSITIES FOR SALT LAKE COUNTY, UTAH: This paper documents the development of maps showing a
The next New Madrid earthquake
DOE Office of Scientific and Technical Information (OSTI.GOV)
Atkinson, W.
1988-01-01
Scientists who specialize in the study of Mississippi Valley earthquakes say that the region is overdue for a powerful tremor that will cause major damage and undoubtedly some casualties. The inevitability of a future quake and the lack of preparation by both individuals and communities provided the impetus for this book. It brings together applicable information from many disciplines: history, geology and seismology, engineering, zoology, politics and community planning, economics, environmental science, sociology, and psychology and mental health to provide a perspective of the myriad impacts of a major earthquake on the Mississippi Valley. The author addresses such basic questions as: What, actually, are earthquakes? How do they occur? Can they be predicted, perhaps even prevented? He also addresses those steps that individuals can take to improve their chances for survival both during and after an earthquake.
DOT National Transportation Integrated Search
1994-02-01
The report contains an assessment of existing port infrastructure related to United States-Mexico trade, planned infrastructure improvements, an identification of current trade and transportation flows, and an assessment of emerging trade corridors. ...
The ShakeOut scenario: A hypothetical Mw7.8 earthquake on the Southern San Andreas Fault
Porter, K.; Jones, L.; Cox, D.; Goltz, J.; Hudnut, K.; Mileti, D.; Perry, S.; Ponti, D.; Reichle, M.; Rose, A.Z.; Scawthorn, C.R.; Seligson, H.A.; Shoaf, K.I.; Treiman, J.; Wein, A.
2011-01-01
In 2008, an earthquake-planning scenario document was released by the U.S. Geological Survey (USGS) and California Geological Survey that hypothesizes the occurrence and effects of a Mw7.8 earthquake on the southern San Andreas Fault. It was created by more than 300 scientists and engineers. Fault offsets reach 13 m and up to 8 m at lifeline crossings. Physics-based modeling was used to generate maps of shaking intensity, with peak ground velocities of 3 m/sec near the fault and exceeding 0.5 m/sec over 10,000 km2. A custom HAZUS-MH analysis and 18 special studies were performed to characterize the effects of the earthquake on the built environment. The scenario posits 1,800 deaths and 53,000 injuries requiring emergency room care. Approximately 1,600 fires are ignited, resulting in the destruction of 200 million square feet of the building stock, the equivalent of 133,000 single-family homes. Fire contributes $87 billion in property and business interruption loss, out of the total $191 billion in economic loss, with most of the rest coming from shake-related building and content damage ($46 billion) and business interruption loss from water outages ($24 billion). Emergency response activities are depicted in detail, in an innovative grid showing activities versus time, a new format introduced in this study. © 2011, Earthquake Engineering Research Institute.
NASA Astrophysics Data System (ADS)
Perry, S.; Jordan, T.
2006-12-01
Our undergraduate research program, SCEC/UseIT, an NSF Research Experience for Undergraduates site, provides software for earthquake researchers and educators, movies for outreach, and ways to strengthen the technical career pipeline. SCEC/UseIT motivates diverse undergraduates towards science and engineering careers through team-based research in the exciting field of earthquake information technology. UseIT provides the cross-training in computer science/information technology (CS/IT) and geoscience needed to make fundamental progress in earthquake system science. Our high and increasing participation of women and minority students is crucial given the nation's precipitous enrollment declines in CS/IT undergraduate degree programs, especially among women. UseIT also casts a "wider, farther" recruitment net that targets scholars interested in creative work but not traditionally attracted to summer science internships. Since 2002, SCEC/UseIT has challenged 79 students in three dozen majors from as many schools with difficult, real-world problems that require collaborative, interdisciplinary solutions. Interns design and engineer open-source software, creating increasingly sophisticated visualization tools (see "SCEC-VDO," session IN11), which are employed by SCEC researchers, in new curricula at the University of Southern California, and by outreach specialists who make animated movies for the public and the media. SCEC-VDO would be a valuable tool for research-oriented professional development programs.
Performance evaluation of existing building structure with pushover analysis
NASA Astrophysics Data System (ADS)
Handana, MAP; Karolina, R.; Steven
2018-02-01
In managing building infrastructure, damage commonly occurs during a building's life for several reasons, earthquakes being among the most common. A building is planned to perform over a certain service life, but during that service life it remains vulnerable to damage from various causes. Any damage should be detected as early as possible, because damage can spread, triggering and exacerbating further deterioration. The newest concept in earthquake engineering is Performance Based Earthquake Engineering (PBEE). PBEE is divided into two parts, namely Performance Based Seismic Design (PBSD) and Performance Based Seismic Evaluation (PBSE). One of the evaluation methods in PBSE is nonlinear pushover analysis. Pushover analysis is a nonlinear static analysis in which the influence of the design earthquake on the building structure is treated as static lateral loads applied at the center of mass of each floor. These loads are increased gradually until the loading causes the first yielding (plastic hinge) within the building structure; further load increases then produce large post-elastic deformations until the structure reaches its ultimate condition, with subsequent yielding (plastic hinges) at other locations in the structure.
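The incremental-load idea behind pushover analysis can be illustrated with a minimal sketch, assuming a bilinear single-degree-of-freedom idealization; the function name and numbers below are illustrative, not from the paper:

```python
# Illustrative pushover sketch: a bilinear single-degree-of-freedom model.
# Lateral load is increased step by step; stiffness drops after first yield
# ("plastic hinge"), producing large post-elastic displacements.
def pushover_curve(k_elastic, k_postyield, f_yield, f_max, steps=20):
    """Return (load, displacement) pairs for a bilinear SDOF model."""
    d_yield = f_yield / k_elastic
    curve = []
    for i in range(1, steps + 1):
        f = f_max * i / steps
        if f <= f_yield:                      # elastic branch
            d = f / k_elastic
        else:                                 # post-yield branch, reduced stiffness
            d = d_yield + (f - f_yield) / k_postyield
        curve.append((f, d))
    return curve

curve = pushover_curve(k_elastic=50.0, k_postyield=5.0, f_yield=100.0, f_max=150.0)
```

Plotting base shear against roof displacement from such a loop gives the familiar pushover (capacity) curve, with the kink marking the first plastic hinge.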
Historical earthquake research in Austria
NASA Astrophysics Data System (ADS)
Hammerl, Christa
2017-12-01
Austria has moderate seismicity, and on average the population feels 40 earthquakes per year, or approximately three earthquakes per month. A severe earthquake with light building damage is expected roughly every 2 to 3 years in Austria. Severe damage to buildings (I0 > 8° EMS) occurs significantly less frequently; the average recurrence period is about 75 years. For this reason, historical earthquake research has been of special importance in Austria. The interest in historical earthquakes in the Austro-Hungarian Empire is outlined, beginning with an initiative of the Austrian Academy of Sciences and continuing with the development of historical earthquake research as an independent research field after the 1978 "Zwentendorf plebiscite" on whether the nuclear power plant would start up. The applied methods are introduced briefly along with the most important studies, and finally, as an example of a recently completed case study, one of the strongest past earthquakes in Austria, the earthquake of 17 July 1670, is presented. The research into historical earthquakes in Austria concentrates on seismic events of the pre-instrumental period. The investigations are not only of historical interest but also contribute to the completeness and correctness of the Austrian earthquake catalogue, which is the basis for seismic hazard analysis and as such benefits the public, communities, civil engineers, architects, civil protection, and many others.
Rezaeian, Sanaz; Zhong, Peng; Hartzell, Stephen; Zareian, Farzin
2015-01-01
Simulated earthquake ground motions can be used in many recent engineering applications that require time series as input excitations. However, applicability and validation of simulations are subjects of debate in the seismological and engineering communities. We propose a validation methodology at the waveform level and directly based on characteristics that are expected to influence most structural and geotechnical response parameters. In particular, three time-dependent validation metrics are used to evaluate the evolving intensity, frequency, and bandwidth of a waveform. These validation metrics capture nonstationarities in intensity and frequency content of waveforms, making them ideal for addressing the nonlinear response of structural systems. A two-component error vector is proposed to quantify the average and shape differences between these validation metrics for a simulated and recorded ground-motion pair. Because these metrics are directly related to the waveform characteristics, they provide easily interpretable feedback to seismologists for modifying their ground-motion simulation models. To further simplify the use and interpretation of these metrics for engineers, it is shown how six scalar key parameters, including duration, intensity, and predominant frequency, can be extracted from the validation metrics. The proposed validation methodology is a step forward in paving the way for the utilization of simulated ground motions in engineering practice and is demonstrated using examples of recorded and simulated ground motions from the 1994 Northridge, California, earthquake.
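The paper's three time-dependent metrics are not reproduced here, but a cumulative Arias-type intensity is one common way to track a waveform's evolving intensity and to extract a scalar duration parameter; the following is a sketch under that assumption, not the authors' exact definitions:

```python
import math

def arias_evolution(accel, dt):
    """Cumulative Arias-type intensity I(t) = (pi / 2g) * integral of a(t)^2 dt,
    evaluated sample by sample so the evolving intensity can be inspected."""
    g = 9.81
    total, out = 0.0, []
    for a in accel:
        total += a * a * dt
        out.append(math.pi / (2.0 * g) * total)
    return out

def significant_duration(accel, dt, lo=0.05, hi=0.95):
    """Scalar duration: time between the lo and hi fractions (5-95% by default)
    of the final cumulative intensity."""
    ia = arias_evolution(accel, dt)
    final = ia[-1]
    t_lo = next(i for i, v in enumerate(ia) if v >= lo * final) * dt
    t_hi = next(i for i, v in enumerate(ia) if v >= hi * final) * dt
    return t_hi - t_lo
```

Comparing such evolving-intensity curves for a recorded/simulated pair is one simple instance of the waveform-level comparison the abstract describes.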
ERIC Educational Resources Information Center
Cavlazoglu, Baki; Stuessy, Carol L.
2017-01-01
Stakeholders in STEM education have called for integrating engineering content knowledge into STEM-content classrooms. To answer the call, stakeholders in science education announced a new framework, Next Generation Science Standards, which focuses on the integration of science and engineering in K-12 science education. However, research indicates…
Update on the Center for Engineering Strong Motion Data
NASA Astrophysics Data System (ADS)
Haddadi, H. R.; Shakal, A. F.; Stephens, C. D.; Oppenheimer, D. H.; Huang, M.; Leith, W. S.; Parrish, J. G.; Savage, W. U.
2010-12-01
The U.S. Geological Survey (USGS) and the California Geological Survey (CGS) established the Center for Engineering Strong-Motion Data (CESMD, Center) to provide a single access point for earthquake strong-motion records and station metadata from the U.S. and international strong-motion programs. The Center has operational facilities in Sacramento and Menlo Park, California, to receive, process, and disseminate records through the CESMD web site at www.strongmotioncenter.org. The Center is currently in the process of transitioning the COSMOS Virtual Data Center (VDC) to integrate its functions with those of the CESMD for improved efficiency of operations, and to provide all users with a more convenient one-stop portal to both U.S. and important international strong-motion records. The Center is working with COSMOS and international and U.S. data providers to improve the completeness of site and station information, which is needed to most effectively employ the recorded data. The goal of all these and other new developments is to continually improve access by the earthquake engineering community to strong-motion data and metadata world-wide. The CESMD and its Virtual Data Center (VDC) provide tools to map earthquakes and recording stations, to search raw and processed data, to view time histories and spectral plots, to convert data file formats, and to download data and a variety of information. The VDC is now being upgraded to convert the strong-motion data files from different seismic networks into a common standard tagged format in order to facilitate importing earthquake records and station metadata to the CESMD database. An important new feature being developed is the automatic posting of Internet Quick Reports at the CESMD web site. This feature will allow users, and emergency responders in particular, to view strong-motion waveforms and download records within a few minutes after an earthquake occurs.
Currently the CESMD and its Virtual Data Center provide selected strong-motion records from 17 countries. The Center has proved highly useful in providing data to scientists, engineers, policy makers, and emergency response teams around the world.
Losses to single-family housing from ground motions in the 1994 Northridge, California, earthquake
Wesson, R.L.; Perkins, D.M.; Leyendecker, E.V.; Roth, R.J.; Petersen, M.D.
2004-01-01
The distributions of insured losses to single-family housing following the 1994 Northridge, California, earthquake for 234 ZIP codes can be satisfactorily modeled with gamma distributions. Regressions of the parameters in the gamma distribution on estimates of ground motion, derived from ShakeMap estimates or from interpolated observations, provide a basis for developing curves of conditional probability of loss given a ground motion. Comparison of the resulting estimates of aggregate loss with the actual aggregate loss gives satisfactory agreement for several different ground-motion parameters. Estimates of loss based on a deterministic spatial model of the earthquake ground motion, using standard attenuation relationships and NEHRP soil factors, give satisfactory results for some ground-motion parameters if the input ground motions are increased about one and one-half standard deviations above the median, reflecting the fact that the ground motions for the Northridge earthquake tended to be higher than the median ground motion for other earthquakes with similar magnitude. The results give promise for making estimates of insured losses to a similar building stock under future earthquake loading. © 2004, Earthquake Engineering Research Institute.
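As a hedged illustration of the first step in such a study, gamma-distribution parameters can be estimated from a sample of losses by the method of moments; this simple estimator is an assumption for illustration, not the authors' regression procedure:

```python
def fit_gamma_moments(losses):
    """Method-of-moments estimates for the gamma shape k and scale theta.

    For a gamma distribution, mean = k * theta and variance = k * theta**2,
    so k = mean**2 / variance and theta = variance / mean.
    """
    n = len(losses)
    mean = sum(losses) / n
    var = sum((x - mean) ** 2 for x in losses) / n
    k = mean * mean / var
    theta = var / mean
    return k, theta
```

Fitting k and theta per ZIP code, then regressing them on a ground-motion measure, would mirror the structure of the analysis the abstract describes.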
Preparing for a "Big One": The great southern California shakeout
Jones, L.M.; Benthien, M.
2011-01-01
The Great Southern California ShakeOut was a week of special events featuring the largest earthquake drill in United States history. On November 13, 2008, over 5 million Southern Californians pretended that the magnitude-7.8 ShakeOut scenario earthquake was occurring and practiced actions derived from results of the ShakeOut Scenario to reduce the impact of a real San Andreas Fault event. The communications campaign was based on four principles: 1) consistent messaging from multiple sources; 2) visual reinforcement; 3) encouragement of "milling"; and 4) focus on concrete actions. The goals of the ShakeOut, established in spring 2008, were: 1) to register 5 million people to participate in the drill; 2) to change the culture of earthquake preparedness in Southern California; and 3) to reduce earthquake losses in Southern California. Over 90% of the registrants surveyed the next year reported improvement in earthquake preparedness at their organization as a result of the ShakeOut. © 2011, Earthquake Engineering Research Institute.
Buchanan-Banks, Jane M.; Collins, Donley S.
1994-01-01
The heavily populated Puget Sound region in the State of Washington has experienced moderate to large earthquakes in the recent past (Nuttli, 1952; Mullineaux and others, 1967). Maps showing the thickness of unconsolidated sedimentary deposits are useful aids in delineating areas where damage to engineered structures can result from increased shaking during these earthquakes. Basins containing thick deposits of unconsolidated materials can amplify earthquake waves and cause far more damage to structures than the same waves passing through bedrock (Singh and others, 1988; Algermissen and others, 1985). Configurations of deep sedimentary basins can also cause reflection and magnification of earthquake waves in ways still not fully understood and presently under investigation (Frankel and Vidale, 1992).
NASA Astrophysics Data System (ADS)
Haase, J. S.; Bock, Y.; Saunders, J. K.; Goldberg, D.; Restrepo, J. I.
2016-12-01
As part of an effort to promote the use of NASA-sponsored Earth science information for disaster risk reduction, real-time high-rate seismogeodetic data are being incorporated into early warning and structural monitoring systems. Seismogeodesy combines seismic acceleration and GPS displacement measurements using a tightly-coupled Kalman filter to provide absolute estimates of seismic acceleration, velocity, and displacement. Traditionally, the monitoring of earthquakes and tsunamis has been based on seismic networks for estimating earthquake magnitude and slip, and on tide gauges and deep-ocean buoys for direct measurement of tsunami waves. Real-time seismogeodetic observations at subduction zones allow for more robust and rapid magnitude and slip estimation that increases warning time in the near-source region. A NASA-funded effort to utilize GPS and seismogeodesy in NOAA's Tsunami Warning Centers in Alaska and Hawaii integrates new modules for picking, locating, and estimating magnitudes and moment tensors for earthquakes into the USGS Earthworm environment at the TWCs. In a related project, NASA supports the transition of this research to seismogeodetic tools for disaster preparedness, specifically by implementing GPS and low-cost MEMS accelerometers for structural monitoring in partnership with earthquake engineers. Real-time high-rate seismogeodetic structural monitoring has been implemented on two structures. The first is a parking garage at the Autonomous University of Baja California Faculty of Medicine in Mexicali, not far from the rupture of the 2010 Mw 7.2 El Mayor-Cucapah earthquake; this installation was enabled through a UC MEXUS collaboration. The second is the 8-story Geisel Library at the University of California, San Diego (UCSD). The system has also been installed for several proof-of-concept experiments at the UCSD Network for Earthquake Engineering Simulation (NEES) Large High Performance Outdoor Shake Table.
We present MEMS-based seismogeodetic observations of strong ground motion in the near field close to the San Jacinto fault from the 10 June 2016 Mw 5.2 Borrego Springs earthquake, as well as observations that show the response of the three-story parking garage. The occurrence of this recent earthquake provided a useful demonstration of structural monitoring applications with seismogeodesy.
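The tightly coupled filtering idea described above can be sketched in one dimension: the accelerometer drives the state prediction and the GPS displacement corrects it. The state layout, gains, and noise values below are illustrative; the operational filter is multivariate and considerably more elaborate:

```python
def seismogeodetic_kf(accel, gps_disp, dt, q=1e-4, r=1e-2):
    """Toy 1-D two-state Kalman filter. State x = [displacement, velocity];
    the accelerometer sample a is the control input, the GPS displacement z
    is the measurement. q and r are illustrative noise variances."""
    d, v = 0.0, 0.0
    P = [[1.0, 0.0], [0.0, 1.0]]              # state covariance
    out = []
    for a, z in zip(accel, gps_disp):
        # predict with F = [[1, dt], [0, 1]] and acceleration input
        d, v = d + v * dt + 0.5 * a * dt * dt, v + a * dt
        p00 = P[0][0] + dt * (P[1][0] + P[0][1]) + dt * dt * P[1][1] + q
        p01 = P[0][1] + dt * P[1][1]
        p10 = P[1][0] + dt * P[1][1]
        p11 = P[1][1] + q
        P = [[p00, p01], [p10, p11]]
        # update with GPS displacement z (measurement matrix H = [1, 0])
        s = P[0][0] + r
        k0, k1 = P[0][0] / s, P[1][0] / s
        resid = z - d
        d += k0 * resid
        v += k1 * resid
        P = [[(1 - k0) * P[0][0], (1 - k0) * P[0][1]],
             [P[1][0] - k1 * P[0][0], P[1][1] - k1 * P[0][1]]]
        out.append(d)
    return out
```

The key property the abstract emphasizes survives even in this sketch: the displacement estimate is absolute (anchored by GPS) while retaining the accelerometer's high-rate dynamics.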
Earthquake Education in Prime Time
NASA Astrophysics Data System (ADS)
de Groot, R.; Abbott, P.; Benthien, M.
2004-12-01
Since 2001, the Southern California Earthquake Center (SCEC) has collaborated on several video production projects that feature important topics related to earthquake science, engineering, and preparedness. These projects have also fostered many fruitful and sustained partnerships with a variety of organizations that have a stake in hazard education and preparedness. The Seismic Sleuths educational video first appeared in the spring season 2001 on Discovery Channel's Assignment Discovery. Seismic Sleuths is based on a highly successful curriculum package developed jointly by the American Geophysical Union and The Department of Homeland Security Federal Emergency Management Agency. The California Earthquake Authority (CEA) and the Institute for Business and Home Safety supported the video project. Summer Productions, a company with a reputation for quality science programming, produced the Seismic Sleuths program in close partnership with scientists, engineers, and preparedness experts. The program has aired on the National Geographic Channel as recently as Fall 2004. Currently, SCEC is collaborating with Pat Abbott, a geology professor at San Diego State University (SDSU) on the video project Written In Stone: Earthquake Country - Los Angeles. Partners on this project include the California Seismic Safety Commission, SDSU, SCEC, CEA, and the Insurance Information Network of California. This video incorporates live-action demonstrations, vivid animations, and a compelling host (Abbott) to tell the story about earthquakes in the Los Angeles region. The Written in Stone team has also developed a comprehensive educator package that includes the video, maps, lesson plans, and other supporting materials. We will present the process that facilitates the creation of visually effective, factually accurate, and entertaining video programs. 
We acknowledge the need to have a broad understanding of the literature related to communication, media studies, science education, and hazard response to create a program that is both educational and provides a public service. Seismic Sleuths and Written in Stone are the harbingers of a new genre of earthquake programs that are the antithesis of the 1974 film Earthquake and the 2004 miniseries 10.5. Film producers and those in the earthquake education community are demonstrating that it is possible to tell an exciting story, inspire awareness, and encourage empowerment without sensationalism.
Borcherdt, R.D.; Glassmoyer, Gary; Cranswick, Edward
1989-01-01
The earthquakes of December 7, 1988, near Spitak, Armenia SSR, serve as another grim reminder of the serious hazard that earthquakes pose throughout the world. We extend our heartfelt sympathies to the families of the earthquake victims and intend that our cooperative scientific endeavours will help reduce losses in future earthquakes. Only through a better understanding of earthquake hazards can earthquake losses be reduced for all peoples in seismically active regions of the world. The tragic consequences of these earthquakes remind scientists and public officials alike of their urgent responsibilities to understand and mitigate the effects of earthquakes. On behalf of the U.S. Geological Survey, I would like to express appreciation to our Soviet colleagues for their kind invitation to participate in joint scientific and engineering studies. Without their cooperation and generous assistance, the conduct of these studies would not have been possible. This report provides seismologic and geologic data collected during the time period December 21, 1988, through February 2, 1989. These data are presented in their entirety to expedite analysis of the data set for inferences regarding hazard mitigation actions, applicable not only in Armenia but also in other regions of the world exposed to high seismic risk.
Jaiswal, Kishor; Wald, David J.; Earle, Paul S.; Porter, Keith A.; Hearne, Mike
2011-01-01
Since the launch of the USGS’s Prompt Assessment of Global Earthquakes for Response (PAGER) system in fall of 2007, the time needed for the U.S. Geological Survey (USGS) to determine and comprehend the scope of any major earthquake disaster anywhere in the world has been dramatically reduced to less than 30 min. PAGER alerts consist of estimated shaking hazard from the ShakeMap system, estimates of population exposure at various shaking intensities, and a list of the most severely shaken cities in the epicentral area. These estimates help government, scientific, and relief agencies to guide their responses in the immediate aftermath of a significant earthquake. To account for wide variability and uncertainty associated with inventory, structural vulnerability and casualty data, PAGER employs three different global earthquake fatality/loss computation models. This article describes the development of the models and demonstrates the loss estimation capability for earthquakes that have occurred since 2007. The empirical model relies on country-specific earthquake loss data from past earthquakes and makes use of calibrated casualty rates for future prediction. The semi-empirical and analytical models are engineering-based and rely on complex datasets including building inventories, time-dependent population distributions within different occupancies, the vulnerability of regional building stocks, and casualty rates given structural collapse.
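The empirical model's published form expresses the country-specific fatality rate as a lognormal function of shaking intensity, with fatalities summed over the exposed population at each intensity level. A sketch of that structure follows; the theta and beta values here are illustrative placeholders, not PAGER's country-calibrated coefficients:

```python
import math

def fatality_rate(mmi, theta=13.0, beta=0.25):
    """PAGER-style empirical fatality rate: a lognormal CDF of shaking
    intensity S, rate = Phi(ln(S / theta) / beta). theta and beta are
    illustrative; PAGER calibrates them per country from past losses."""
    return 0.5 * (1.0 + math.erf(math.log(mmi / theta) / (beta * math.sqrt(2.0))))

def expected_fatalities(exposure_by_mmi):
    """Sum of rate(I) * population exposed at intensity I."""
    return sum(pop * fatality_rate(mmi) for mmi, pop in exposure_by_mmi.items())

# hypothetical exposure table: intensity -> exposed population
est = expected_fatalities({6.0: 1_000_000, 7.0: 200_000, 8.0: 50_000})
```

The semi-empirical and analytical models replace the single calibrated rate curve with engineering inputs (building inventories, occupancy patterns, collapse fragilities), but the outer sum over exposure has the same shape.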
Chapter C. The Loma Prieta, California, Earthquake of October 17, 1989 - Building Structures
Çelebi, Mehmet
1998-01-01
Several approaches are used to assess the performance of the built environment following an earthquake -- preliminary damage surveys conducted by professionals, detailed studies of individual structures, and statistical analyses of groups of structures. Reports of damage that are issued by many organizations immediately following an earthquake play a key role in directing subsequent detailed investigations. Detailed studies of individual structures and statistical analyses of groups of structures may be motivated by particularly good or bad performance during an earthquake. Beyond this, practicing engineers typically perform stress analyses to assess the response of a particular structure to the vibrational levels experienced during an earthquake. These levels may be determined from recorded or estimated ground motions; actual levels usually differ from design levels. If a structure has seismic instrumentation to record response data, the estimated and recorded response and behavior of the structure can be compared.
NASA Astrophysics Data System (ADS)
van der Lee, S.; Tekverk, K.; Rooney, K.; Boxerman, J.
2013-12-01
We designed and will present a lesson plan to teach students STEM concepts through seismology. The plan addresses the Next Generation Science Standards in the Framework for K-12 Science Education as well as the AAAS Benchmarks for Science Literacy. The plan can be executed on a field trip to a facility with a seismometer at a research institution or university, but it can also be used in a school setting with a school seismometer. Within the lesson plan, the students first use technology to obtain earthquake location data and map them. Next, the students learn about the science of earthquakes, which is followed by an engineering activity in which the students design a hypothetical seismometer and interact with the actual seismometer and live data display. Lastly, the students use mathematics to locate an earthquake through trilateration. The lesson plan has been fine-tuned through implementation with over 150 students from grades 3-12 in the Chicago area.
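The trilateration step in the lesson can be sketched as follows, assuming epicentral distances have already been estimated (for example from S-P times). The linear-algebra shortcut of subtracting one circle equation from the others is one standard classroom approach, not necessarily the lesson's own:

```python
def locate_earthquake(stations, distances):
    """Locate an epicenter from three (x, y) stations and epicentral distances.

    Each station defines a circle (x - xi)^2 + (y - yi)^2 = di^2. Subtracting
    the first circle equation from the others cancels the quadratic terms,
    leaving two linear equations a*x + b*y = c solved here by Cramer's rule."""
    (x0, y0), d0 = stations[0], distances[0]
    rows = []
    for (xi, yi), di in zip(stations[1:], distances[1:]):
        a = 2.0 * (xi - x0)
        b = 2.0 * (yi - y0)
        c = d0 * d0 - di * di + xi * xi - x0 * x0 + yi * yi - y0 * y0
        rows.append((a, b, c))
    (a1, b1, c1), (a2, b2, c2) = rows[0], rows[1]
    det = a1 * b2 - a2 * b1
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y
```

With more than three stations the same linear system can be solved in a least-squares sense, which also absorbs small errors in the distance estimates.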
NASA Astrophysics Data System (ADS)
Bhloscaidh, Mairead Nic; McCloskey, John; Pelling, Mark; Naylor, Mark
2013-04-01
Until expensive engineering solutions become more universally available, the objective targeting of resources at demonstrably effective, low-cost interventions might help reverse the trend of increasing mortality in earthquakes. Death tolls in earthquakes are the result of complex interactions between physical effects, such as the exposure of the population to strong shaking, and the resilience of the exposed population along with supporting critical infrastructures and institutions. The identification of socio-economic factors that contribute to earthquake mortality is crucial to identifying and developing successful risk management strategies. Here we develop a quantitative methodology to assess more objectively the ability of communities to withstand earthquake shaking, focusing in particular on those cases where risk management performance appears to exceed or fall below expectations based on economic status. Using only published estimates of the shaking intensity and population exposure for each earthquake, data that are available for earthquakes in countries irrespective of their level of economic development, we develop a model for mortality based on the contribution of population exposure to shaking only. This represents an attempt to remove, as far as possible, the physical causes of mortality from our analysis (where we consider earthquake engineering to reduce building collapse among the socio-economic influences). The systematic part of the variance with respect to this model can therefore be expected to be dominated by socio-economic factors. We find, as expected, that this purely physical analysis partitions countries in terms of basic socio-economic measures, for example GDP, focusing analytical attention on the power of economic measures to explain variance in observed distributions of earthquake risk.
The model allows the definition of a vulnerability index which, although broadly it demonstrates the expected income-dependence of vulnerability to strong shaking, also identifies both anomalously resilient and anomalously vulnerable countries. We argue that this approach has the potential to direct sociological investigations to expose the underlying causes of the observed non-economic differentiation of vulnerability. At one level, closer study of the earthquakes represented by these data points might expose local or national interventions which are increasing resilience of communities to strong shaking in the absence of major national investment. Ultimately it may contribute to the development of a quantitative evaluation of risk management effectiveness at the national level that can be used better to target and track risk management investments.
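The residual idea behind such a vulnerability index can be sketched as a log-log regression of deaths on exposure, with each event's residual measuring how far its mortality departs from what exposure alone predicts. The function and variable names are hypothetical, and this simple ordinary-least-squares form is an illustration, not the paper's model:

```python
import math

def vulnerability_index(exposure, deaths):
    """Regress log10(deaths) on log10(exposed population) by ordinary least
    squares. Each residual (observed minus predicted) serves as a crude
    vulnerability index: positive means more deaths than exposure predicts."""
    xs = [math.log10(e) for e in exposure]
    ys = [math.log10(d) for d in deaths]
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return [y - (intercept + slope * x) for x, y in zip(xs, ys)]
```

Grouping residuals by country would then surface the anomalously resilient (consistently negative) and anomalously vulnerable (consistently positive) cases the abstract describes.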
Stability assessment of structures under earthquake hazard through GRID technology
NASA Astrophysics Data System (ADS)
Prieto Castrillo, F.; Boton Fernandez, M.
2009-04-01
This work presents a GRID framework to estimate the vulnerability of structures under earthquake hazard. The tool has been designed to cover the needs of a typical earthquake engineering stability analysis: preparation of input data (pre-processing), response computation, and stability analysis (post-processing). In order to validate the application over GRID, a simplified model of a structure under artificially generated earthquake records has been implemented. To achieve this goal, the proposed scheme exploits the GRID technology and its main advantages (parallel intensive computing, huge storage capacity, and collaborative analysis among institutions) through intensive interaction among the GRID elements (Computing Element, Storage Element, LHC File Catalogue, federated database, etc.). The dynamical model is described by a set of ordinary differential equations (ODEs) and by a set of parameters. Both elements, along with the integration engine, are encapsulated into Java classes. With this high-level design, subsequent improvements/changes of the model can be addressed with little effort. In the procedure, an earthquake record database is prepared and stored (pre-processing) in the GRID Storage Element (SE). The Metadata of these records is also stored in the GRID federated database. This Metadata contains both relevant information about the earthquake (as is usual in a seismic repository) and the Logical File Name (LFN) of the record for its later retrieval. Then, from the available set of accelerograms in the SE, the user can specify a range of earthquake parameters to carry out a dynamic analysis. This way, a GRID job is created for each selected accelerogram in the database. At the GRID Computing Element (CE), displacements are then obtained by numerical integration of the ODEs over time. The resulting response for that configuration is stored in the GRID Storage Element (SE) and the maximum structure displacement is computed.
Then, the corresponding Metadata containing the response LFN, earthquake magnitude and maximum structure displacement is also stored. Finally, the displacements are post-processed through a statistically-based algorithm from the available Metadata to obtain the probability of collapse of the structure for different earthquake magnitudes. From this study, it is possible to build a vulnerability report for the structure type and seismic data. The proposed methodology can be combined with the on-going initiatives to build a European earthquake record database. In this context, Grid enables collaboration analysis over shared seismic data and results among different institutions.
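The per-record computation dispatched to each GRID job can be illustrated as numerical integration of a damped single-degree-of-freedom oscillator driven by an accelerogram, returning the peak displacement. The integration scheme and parameter values below are illustrative; the paper's own model and Java implementation are not reproduced:

```python
def max_displacement(accel, dt, omega=2.0 * 3.141592653589793, zeta=0.05):
    """Peak displacement of a damped SDOF oscillator,
        u'' + 2*zeta*omega*u' + omega**2 * u = -a_g(t),
    integrated with a simple semi-implicit Euler scheme (illustrative;
    omega = 2*pi rad/s i.e. a 1 Hz oscillator, 5% damping)."""
    u, v = 0.0, 0.0
    peak = 0.0
    for ag in accel:
        acc = -ag - 2.0 * zeta * omega * v - omega * omega * u
        v += acc * dt
        u += v * dt
        peak = max(peak, abs(u))
    return peak
```

Running this over every accelerogram in a database and collecting the peak displacements is the kind of embarrassingly parallel workload that maps naturally onto one GRID job per record.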
DOT National Transportation Integrated Search
2007-02-01
This document is the conference program of the 5th National Seismic Conference on Bridges and Highways. The conference was held in San Francisco on September 18-20, 2006 and attracted over 300 engineers, academician, and students from around the worl...
Hays, Walter W.
1979-01-01
In accordance with the provisions of the Earthquake Hazards Reduction Act of 1977 (Public Law 95-124), the U.S. Geological Survey has developed comprehensive plans for producing information needed to assess seismic hazards and risk on a national scale in fiscal years 1980-84. These plans are based on a review of the needs of Federal Government agencies, State and local government agencies, engineers and scientists engaged in consulting and research, professional organizations and societies, model code groups, and others. The Earthquake Hazards Reduction Act provided an unprecedented opportunity for participation in a national program by representatives of State and local governments, business and industry, the design professions, and the research community. The USGS and the NSF (National Science Foundation) have major roles in the national program. The ultimate goal of the program is to reduce losses from earthquakes. Implementation of USGS research in the Earthquake Hazards Reduction Program requires the close coordination of responsibility between Federal, State and local governments. The projected research plan in national seismic hazards and risk for fiscal years 1980-84 will be accomplished by USGS and non-USGS scientists and engineers. The latter group will participate through grants and contracts. The research plan calls for (1) national maps based on existing methods, (2) improved definition of earthquake source zones nationwide, (3) development of improved methodology, (4) regional maps based on the improved methodology, and (5) post-earthquake investigations. Maps and reports designed to meet the needs, priorities, concerns, and recommendations of various user groups will be the products of this research and provide the technical basis for improved implementation.
NASA Astrophysics Data System (ADS)
Wein, A. M.; Berryman, K. R.; Jolly, G. E.; Brackley, H. L.; Gledhill, K. R.
2015-12-01
The 2010-2011 Canterbury Earthquake Sequence began with the 4th September 2010 Darfield earthquake (Mw 7.1). Perhaps because there were no deaths, the mood of the city and the government was that New Zealand's high standards of earthquake engineering had protected us, and there was a confident attitude to response and recovery. Science and engineering information was of interest but not seen as crucial to policy, business or the public. The 22nd February 2011 Christchurch earthquake (Mw 6.2) changed all that: there was a significant death toll and many injuries. There was widespread collapse of older unreinforced masonry buildings and of two relatively modern multi-storey buildings, and major disruption to infrastructure. The contrast in the interest and relevance of the science could not have been greater compared with five months previously. Magnitude 5+ aftershocks over a 20-month period resulted in confusion, stress, an inability to define a recovery trajectory, major concerns about whether insurers and reinsurers would continue to provide cover, very high levels of media interest from New Zealand and around the world, and high levels of political risk. As the aftershocks continued there was widespread speculation as to what the future held. During the sequence, the science and engineering sector sought to coordinate and offer timely, integrated advice. However, other than GeoNet, the national geophysical monitoring network, there were few resources devoted to communication, with the result that it was almost always reactive. With hindsight we have identified the need to resource information gathering and synthesis, execute strategic assessments of stakeholder needs, undertake proactive communication, and develop specific information packages for the diversity of users. Overall this means substantially increased resources.
Planning is now underway for the science sector to adopt the New Zealand standardised CIMS (Coordinated Incident Management System) structure for management and communication during a crisis, which should help structure and resource the science response needs in future major events.
Creating a Global Building Inventory for Earthquake Loss Assessment and Risk Management
Jaiswal, Kishor; Wald, David J.
2008-01-01
Earthquakes have claimed approximately 8 million lives over the last 2,000 years (Dunbar, Lockridge and others, 1992), and fatality rates are likely to continue to rise with increasing population and urbanization, especially in developing countries. More than 75% of earthquake-related human casualties are caused by the collapse of buildings or structures (Coburn and Spence, 2002). It is disheartening to note that a large fraction of the world's population still resides in informal, poorly constructed, non-engineered dwellings that are highly susceptible to collapse during earthquakes. Moreover, with increasing urbanization half of the world's population now lives in urban areas (United Nations, 2001), and half of these urban centers are located in earthquake-prone regions (Bilham, 2004). The poor performance of most building stocks during earthquakes remains a primary societal concern. However, despite this dark history and bleak future trends, there are no comprehensive global building inventories of sufficient quality and coverage to adequately characterize future earthquake losses. Such an inventory is vital both for earthquake loss mitigation and for earthquake disaster response purposes. While the latter purpose is the motivation of this work, we hope that the global building inventory database described herein will find widespread use in other mitigation efforts as well. For a real-time earthquake impact alert system, such as the U.S. Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) (Wald, Earle and others, 2006), we seek to rapidly evaluate potential casualties associated with earthquake ground shaking for any region of the world. The casualty estimation is based primarily on (1) rapid estimation of the ground shaking hazard, (2) aggregation of the population exposure within different building types, and (3) estimation of the casualties from the collapse of vulnerable buildings.
Thus, the building stock, its relative vulnerability, and its distribution are vital components for determining the extent of casualties during an earthquake. It is evident from large deadly historical earthquakes that the distribution of vulnerable structures and their occupancy level during an earthquake control the severity of human losses. For example, though the number of strong earthquakes in California is comparable to that of Iran, the total earthquake-related casualties in California during the last 100 years are dramatically lower than the casualties from several individual Iranian earthquakes. The relatively low casualty count in California is attributed mainly to the fact that more than 90 percent of the building stock in California is made of wood and is designed to withstand moderate to large earthquakes (Kircher, Seligson and others, 2006). In contrast, the Iranian building stock, 80 percent adobe and/or non-engineered masonry with poor lateral-load-resisting systems, succumbs even to moderate levels of ground shaking. Consequently, the heavy death toll of the 2003 Bam, Iran earthquake, which claimed 31,828 lives (Ghafory-Ashtiany and Mousavi, 2005), is directly attributable to such poorly resistant construction, and future events will produce comparable losses unless practices change. Similarly, multistory, precast-concrete framed buildings caused heavy casualties in the 1988 Spitak, Armenia earthquake (Bertero, 1989); weak masonry and reinforced-concrete framed construction designed for gravity loads with soft first stories dominated losses in the Bhuj, India earthquake of 2001 (Madabhushi and Haigh, 2005); and adobe and weak masonry dwellings controlled the death toll in the Peru earthquake of 2007 (Taucer, J. and others, 2007).
Spence (2007), after conducting a brief survey of the most lethal earthquakes since 1960, found that building collapse remains a major cause of earthquake mortality and that unreinforced masonry buildings are among the most vulnerable building types.
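The three-step casualty estimate outlined in this abstract (shaking hazard, exposure aggregated by building type, casualties from collapse) reduces, at its simplest, to a sum over building types. The sketch below is purely illustrative: the building types, collapse probabilities, and fatality rates are hypothetical placeholders, not PAGER model values.

```python
# Illustrative sketch of a PAGER-style casualty estimate for one shaking
# intensity level. All numbers below are hypothetical placeholders.

# Step 2: population exposed at this intensity, split by building type
exposure = {"adobe": 50_000, "unreinforced_masonry": 30_000, "wood_frame": 20_000}

# Step 3 inputs: probability of collapse at this intensity, and the
# fatality rate given collapse, per building type (hypothetical values)
collapse_prob = {"adobe": 0.20, "unreinforced_masonry": 0.10, "wood_frame": 0.01}
fatality_rate = {"adobe": 0.15, "unreinforced_masonry": 0.10, "wood_frame": 0.02}

def expected_fatalities(exposure, collapse_prob, fatality_rate):
    """Sum over building types: exposed population x P(collapse) x P(death | collapse)."""
    return sum(exposure[b] * collapse_prob[b] * fatality_rate[b] for b in exposure)

print(round(expected_fatalities(exposure, collapse_prob, fatality_rate)))
```

In a full implementation this sum would be evaluated per intensity cell of a ShakeMap-style grid and accumulated, which is why the building inventory's spatial distribution matters as much as its composition.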
NASA Astrophysics Data System (ADS)
Yang, Kun; Xu, Quan-li; Peng, Shuang-yun; Cao, Yan-bo
2008-10-01
Based on a necessity analysis of GIS applications in earthquake disaster prevention, this paper discusses in depth a scheme for the spatial integration of urban earthquake disaster loss evaluation models and visualization technologies, using network development methods such as COM/DCOM, ActiveX and ASP, as well as spatial database development methods such as OO4O and ArcSDE based on the ArcGIS software packages. In accordance with software engineering principles, a solution for an urban earthquake emergency response decision support system based on GIS technologies is also proposed, covering the system's logical structure, technical route, implementation methods and function structure. Finally, the user interfaces of the test system are presented.
NASA Astrophysics Data System (ADS)
Perry, S.; Benthien, M.; Jordan, T. H.
2005-12-01
The SCEC/UseIT internship program is training the next generation of earthquake scientists, with methods that can be adapted to other disciplines. UseIT interns work collaboratively, in multi-disciplinary teams, conducting computer science research that is needed by earthquake scientists. Since 2002, the UseIT program has welcomed 64 students, in some two dozen majors, at all class levels, from schools around the nation. Each summer's work is posed as a "Grand Challenge." The students then organize themselves into project teams, decide how to proceed, and pool their diverse talents and backgrounds. They have traditional mentors, who provide advice and encouragement, but they also mentor one another, and this has proved to be a powerful relationship. Most begin fearing that their Grand Challenge is impossible, and end with excitement and pride in what they have accomplished. The 22 UseIT interns in summer 2005 were primarily computer science and engineering majors, with others in geology, mathematics, English, digital media design, physics, history, and cinema. The 2005 Grand Challenge was to "build an earthquake monitoring system" to aid scientists who must visualize rapidly evolving earthquake sequences and convey information to emergency personnel and the public. Most UseIT interns were engaged in software engineering, bringing new datasets and functionality to SCEC-VDO (Virtual Display of Objects), 3D visualization software prototyped by interns the previous year using Java3D and an extensible, plug-in architecture based on the Eclipse Integrated Development Environment. Other UseIT interns used SCEC-VDO to make animated movies and experimented with imagery in order to communicate concepts and events in earthquake science. One movie-making project included the creation of an assessment to test the effectiveness of the movie's educational message. Finally, one intern created an interactive, multimedia presentation of the UseIT program.
Tsunami Elevation Predictions for American Samoa.
1980-09-01
tide gauge of Pago Pago after the earthquake of May 13, 1953 in Costa Rica (Microfiche Collection of Tsunami Mareograms 1952-1975) July 13, 1952... Engineering Geology Case Histories, Geological Society of America, No. 8. Chandrasekhar, S. 1943. Reviews of Modern Physics, 15:1-89. Chen, H. S., and... Scientific abstracts and indexes relevant to earthquakes, tsunamis, and geology were also reviewed. Since there are no cumulative indexes available in most
Texas Should Require Homeland Security Standards for High-Speed Rail
2015-12-01
conditions. Japanese trains, engineered with earthquakes in mind, all came to a safe stop during the 2011 Fukushima disaster without loss of life or... devastated parts of Japan through immediate effects as well as the consequential breach of the Fukushima nuclear reactor... Ichiro Fujisaki, "Japan's Recovery Six Months after the Earthquake, Tsunami and Nuclear Crisis," Brookings Institution, last modified September
NASA Astrophysics Data System (ADS)
Pioldi, Fabio; Rizzi, Egidio
2017-07-01
Output-only structural identification is developed by a refined Frequency Domain Decomposition (rFDD) approach, towards assessing the current modal properties of heavily damped buildings (a challenging identification scenario) under strong ground motions. Structural responses from earthquake excitations are taken as input signals for the identification algorithm. A new dedicated computational procedure, based on coupled Chebyshev Type II bandpass filters, is outlined for the effective estimation of natural frequencies, mode shapes and modal damping ratios. The identification technique is also coupled with a Gabor Wavelet Transform, resulting in an effective and self-contained time-frequency analysis framework. Simulated response signals generated by shear-type frames (with variable structural features) are used as a necessary validation condition. In this context, use is made of a complete set of seismic records taken from the FEMA P695 database, i.e. all 44 "Far-Field" (22 NS, 22 WE) earthquake signals. The modal estimates are statistically compared to their target values, proving the accuracy of the developed algorithm in providing prompt and accurate estimates of the modal parameters under strong ground motion. At this stage, the analysis tool may be conveniently applied in the realm of Earthquake Engineering, towards potential Structural Health Monitoring and damage detection purposes.
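The Chebyshev Type II bandpass stage at the heart of the procedure can be illustrated with SciPy. This is a minimal sketch, not the authors' rFDD implementation; the sampling rate, filter order, band edges, and synthetic two-mode signal are all assumed for illustration.

```python
import numpy as np
from scipy import signal

fs = 200.0                          # sampling rate in Hz (illustrative)
rng = np.random.default_rng(0)
t = np.arange(0, 30, 1 / fs)

# Synthetic "structural response": two decaying modal components plus noise
x = (np.exp(-0.05 * 2 * np.pi * 1.5 * t) * np.sin(2 * np.pi * 1.5 * t)
     + 0.5 * np.exp(-0.05 * 2 * np.pi * 4.0 * t) * np.sin(2 * np.pi * 4.0 * t)
     + 0.1 * rng.standard_normal(t.size))

# Chebyshev Type II bandpass isolating the band around the first mode;
# rs is the stopband attenuation in dB
sos = signal.cheby2(N=8, rs=40, Wn=[1.0, 2.0], btype="bandpass",
                    output="sos", fs=fs)
y = signal.sosfiltfilt(sos, x)      # zero-phase filtering preserves modal phase

# The dominant frequency of the filtered signal sits near the 1.5 Hz mode
f = np.fft.rfftfreq(t.size, 1 / fs)
f_peak = f[np.argmax(np.abs(np.fft.rfft(y)))]
```

Repeating this band-by-band is what lets an FDD-type method attack one mode at a time even when high damping smears the modes together in the raw spectrum.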
3D Bedrock Structure of Bornova Plain and Its surroundings (İzmir/Western Turkey)
NASA Astrophysics Data System (ADS)
Pamuk, Eren; Gönenç, Tolga; Özdağ, Özkan Cevdet; Akgün, Mustafa
2018-01-01
An earthquake record on engineering bedrock is needed to perform soil deformation analysis. This record can be obtained in different ways (seismographs on engineering bedrock; by means of the soil transfer function; scenario earthquakes). The S-wave velocity (Vs) profile must be known at least down to engineering bedrock to calculate soil transfer functions correctly and completely. In addition, 2D or 3D soil and engineering/seismic bedrock models are needed for the soil response analyses to be carried out. These models are used to determine changes in the amplitude and frequency content of earthquake waves, depending on the seismic impedance from seismic bedrock to the ground surface, and the basin effects. In this context, it is important to use multiple in situ geophysical techniques to create the soil-bedrock models. In this study, 2D and 3D soil-bedrock models of Bornova plain and its surroundings (Western Turkey), an area of high seismic risk, were obtained by a combined survey of surface wave and microgravity methods. Results of the study show that the engineering bedrock depths in the middle part of Bornova plain range from 200 to 400 m, while the southern and northern parts, which are covered by limestone and andesite, exhibit engineering bedrock (Vs > 760 m/s) at the surface. In addition, seismic bedrock (Vs > 3000 m/s) depth changes from 550 to 1350 m. The predominant period values obtained from the single-station microtremor method change from 0.45 to 1.6 s and exceed 1 s in the middle part of Bornova plain, where the basin is deeper. Bornova plain has very thick sediment units with very low Vs values above the engineering bedrock. In addition, sudden changes are observed at layer interfaces in both horizontal and vertical directions.
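The soil transfer function referred to here can be sketched for the simplest possible case: a single uniform soil layer over an elastic halfspace, vertically incident SH waves, no damping (the standard textbook result). The layer thickness, velocities, and densities below are illustrative values, not results from this study.

```python
import numpy as np

def layer_amplification(f, vs_soil=300.0, h=50.0, rho_soil=1900.0,
                        vs_rock=1500.0, rho_rock=2400.0):
    """|T(f)| for vertically incident SH waves through one uniform soil
    layer over an elastic halfspace (linear elastic, no damping)."""
    alpha = (rho_soil * vs_soil) / (rho_rock * vs_rock)   # impedance ratio
    k_h = 2 * np.pi * f * h / vs_soil                     # dimensionless kH
    return 1.0 / np.sqrt(np.cos(k_h) ** 2 + (alpha * np.sin(k_h)) ** 2)

f = np.linspace(0.05, 3.0, 2000)
amp = layer_amplification(f)
f0 = f[np.argmax(amp)]   # fundamental site frequency, approx. vs_soil / (4 h)
```

The peak occurs near f0 = Vs/(4H) with amplification 1/alpha, which is why both the Vs profile and the impedance contrast down to bedrock must be known before a bedrock record can be propagated to the surface.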
NASA Astrophysics Data System (ADS)
Karabulut, Savas; Cinku, Mualla; Tezel, Okan; Hisarli, Mumtaz; Ozcep, Ferhat; Tun, Muammer; Avdan, Ugur; Ozel, Oguz; Acikca, Ahmet; Aygordu, Ozan; Benli, Aral; Kesisyan, Arda; Yilmaz, Hakan; Varici, Cagri; Ozturkan, Hasan; Ozcan, Cuneyt; Kivrak, Ali
2015-04-01
Social Responsibility Projects (SRP) are important tools for contributing to community development and applied educational science. Researchers dealing with engineering studies generally focus on technical specifications. However, when the subject is earthquakes, engineers should also consider the social and educational components besides the technical aspects. If scientific projects are carried out in collaboration with city municipalities, they can reach a wide range of people. Turkey is one of the most seismically active regions and has experienced destructive earthquakes. The 1999 Marmara earthquake was responsible for the loss of more than 18,000 people. The destructive damage occurred in buildings constructed on problematic soils, which is still one of the most important issues in Turkey that needs to be solved. In addition to the large earthquakes that have occurred along the major segments of the North and East Anatolian Fault Zones due to the westward excursion of Anatolia, the extensional regime in the Aegean region is also characterized by earthquakes associated with the movement of a number of strike-slip and normal faults. The Dikili village, within the Eastern Aegean extensional region, experienced a large earthquake in 1939 (M 6.8). Seismic activity remains at a high level and continues to be detected. Several settlements, such as the Kabakum village, were moved to their present locations after this earthquake. The probability of an earthquake hazard in Dikili is considerably high today. Therefore, it is very important to predict the soil behaviour and engineering problems in this area by using Geographic Information System (GIS) tools.
For this purpose we conducted a project in collaboration with the Dikili Municipality in İzmir (Turkey) to determine the following issues: a) possible disaster mitigation as a result of earthquake-soil-structure interaction, b) geo-engineering problems (i.e. soil liquefaction, soil settlement, soil bearing capacity, soil amplification), c) the basin structure and possible faults of the Dikili district, d) risk analysis of cultivated areas due to salty water injection, and e) the tectonic activity of the study area from the Miocene to the present. During this study a number of measurements were carried out to address the problems defined above. These measurements include single-station microtremor (H/V) measurements according to Nakamura's technique, applied at 222 points; the results provide maps of soil fundamental frequency, soil amplification and sedimentary thickness through developed empirical relationships. The Spatial Autocorrelation Technique (SPAC) was carried out at 11 sites with a Guralp CG-5 seismometer to predict the shear-wave velocity-depth model down to the seismological bedrock. Multi-channel Analysis of Surface Waves (MASW), the Microtremor Array Method (MAM) and the Seismic Refraction Method were applied at 121 sites with a SARA DoReMi seismograph. Soil liquefaction-induced settlements are determined within the frame of shallow soil engineering problems. Vertical Electrical Sounding (VES) was carried out with a Scintrex SARIS resistivity instrument to define the presence of salty, drinkable and hot/cold underground water, the location of possible faults, and the bedrock depth. To delineate the areas influenced by salty water, the induced polarization (IP) method was applied at 34 sites. The basin structure and the probable faults of the study area were determined by gravity measurements at 248 points with a CG-5 Autograv meter. Evaluation of the combined data is very important for producing microzonation maps.
We therefore integrated all of the data into the GIS database and prepared a large variety of maps.
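The single-station microtremor (H/V) method attributed to Nakamura can be sketched as the ratio of the geometric mean of the horizontal amplitude spectra to the vertical spectrum; the peak of the curve estimates the soil fundamental frequency. This is a bare-bones illustration: window selection, averaging over many windows, and the smoothing scheme of real H/V processing are omitted, and the synthetic 2 Hz resonance is assumed.

```python
import numpy as np

def hv_curve(ns, ew, ud, fs, smooth_bins=9):
    """Single-station H/V spectral ratio (Nakamura's technique), minimal sketch:
    geometric mean of the horizontal amplitude spectra over the vertical."""
    win = np.hanning(len(ns))
    box = np.ones(smooth_bins) / smooth_bins
    def spec(x):
        a = np.abs(np.fft.rfft((x - x.mean()) * win))
        return np.convolve(a, box, mode="same")   # light spectral smoothing
    h = np.sqrt(spec(ns) * spec(ew))              # geometric mean of horizontals
    v = spec(ud)
    f = np.fft.rfftfreq(len(ns), 1 / fs)
    return f, h / np.maximum(v, 1e-12)

# Synthetic ambient noise with a horizontal 2 Hz "site resonance"
fs = 100.0
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(1)
res = np.sin(2 * np.pi * 2.0 * t)
ns = 3 * res + 0.2 * rng.standard_normal(t.size)
ew = 3 * res + 0.2 * rng.standard_normal(t.size)
ud = 1 * res + 0.2 * rng.standard_normal(t.size)
f, hv = hv_curve(ns, ew, ud, fs)
```

The frequency at which the H/V curve peaks maps directly to the fundamental-frequency and, via empirical relationships, sedimentary-thickness maps described above.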
Earthquakes trigger the loss of groundwater biodiversity
NASA Astrophysics Data System (ADS)
Galassi, Diana M. P.; Lombardo, Paola; Fiasca, Barbara; di Cioccio, Alessia; di Lorenzo, Tiziana; Petitta, Marco; di Carlo, Piero
2014-09-01
Earthquakes are among the most destructive natural events. The 6 April 2009, 6.3-Mw earthquake in L'Aquila (Italy) markedly altered the karstic Gran Sasso Aquifer (GSA) hydrogeology and geochemistry. The GSA groundwater invertebrate community is mainly comprised of small-bodied, colourless, blind microcrustaceans. We compared abiotic and biotic data from two pre-earthquake and one post-earthquake complete but non-contiguous hydrological years to investigate the effects of the 2009 earthquake on the dominant copepod component of the obligate groundwater fauna. Our results suggest that the massive earthquake-induced aquifer strain biotriggered a flushing of groundwater fauna, with a dramatic decrease in subterranean species abundance. Population turnover rates appeared to have crashed, no longer replenishing the long-standing communities from aquifer fractures, and the aquifer became almost totally deprived of animal life. Groundwater communities are notorious for their low resilience. Therefore, any major disturbance that negatively impacts survival or reproduction may lead to local extinction of species, most of them being the only survivors of phylogenetic lineages extinct at the Earth surface. Given the ecological key role played by the subterranean fauna as decomposers of organic matter and "ecosystem engineers", we urge more detailed, long-term studies on the effect of major disturbances to groundwater ecosystems.
NASA Astrophysics Data System (ADS)
Norbeck, Jack H.; Horne, Roland N.
2018-05-01
The maximum expected earthquake magnitude is an important parameter in seismic hazard and risk analysis because of its strong influence on ground motion. In the context of injection-induced seismicity, the processes that control how large an earthquake will grow may be influenced by operational factors under engineering control as well as natural tectonic factors. Determining the relative influence of these effects on maximum magnitude will impact the design and implementation of induced seismicity management strategies. In this work, we apply a numerical model that considers the coupled interactions of fluid flow in faulted porous media and quasidynamic elasticity to investigate the earthquake nucleation, rupture, and arrest processes for cases of induced seismicity. We find that under certain conditions, earthquake ruptures are confined to a pressurized region along the fault with a length-scale that is set by injection operations. However, earthquakes are sometimes able to propagate as sustained ruptures outside of the zone that experienced a pressure perturbation. We propose a faulting criterion that depends primarily on the state of stress and the earthquake stress drop to characterize the transition between pressure-constrained and runaway rupture behavior.
Incorporating natural hazard assessments into municipal master-plans; case-studies from Israel
NASA Astrophysics Data System (ADS)
Katz, Oded
2010-05-01
The active Dead Sea Rift (DSR) runs along the length of Israel, making the entire state susceptible to earthquake-related hazards. Current building codes generally acknowledge seismic hazards and direct engineers towards earthquake-resistant structures. However, hazard mapping on a scale fit for municipal/governmental planning is subject to local initiative and is currently not mandatory, though it arguably should be. In the following, a few cases of seismic-hazard evaluation made by the Geological Survey of Israel are presented, emphasizing the reasons for their initiation and the way the results were incorporated (or not). The first case is a qualitative seismic-hazard micro-zonation commissioned by the municipality of Jerusalem as part of a new master plan. This work resulted in maps (1:50,000; GIS format) identifying (1) areas prone to amplification of seismic shaking due to site characteristics (outcrops of soft rocks or steep topography) and (2) sites with earthquake-induced landslide (EILS) hazard. Results were validated using reports from the 1927 M=6.2 earthquake that originated along the DSR about 30 km east of Jerusalem. Although the hazard maps were accepted by the municipal authorities, practical use by geotechnical engineers working within the frame of the new master plan was not significant. The main reason is apparently a difference of opinion between the city engineers responsible for implementing the new master plan and the geologists responsible for the hazard evaluation. The second case involves evaluation of EILS hazard for two towns located further north along the DSR, Zefat and Tiberias, both heavily damaged more than once by strong earthquakes in past centuries. The work was carried out as part of a governmental seismic-hazard reduction program. The results include maps (1:10,000 scale) of sites with high EILS hazard identified within the city limits. The maps (in GIS format) were sent to the city engineers with reports explaining the methods and results.
As far as we know, widespread implementation of the maps within municipal master plans never came about, and there was no open discussion between the city engineers and the Geological Survey. The main reasons are apparently (1) the lack, until recently, of mandatory building codes requiring incorporation of EILS hazard; (2) budget priorities; and (3) failure to involve municipality personnel in planning and executing the EILS hazard evaluation. These cases demonstrate that for seismic hazard data to be incorporated and implemented within municipal master plans there needs to be (1) active involvement of municipal officials and engineers from the early planning stages of the evaluation campaign, and (2) a priori dedication of funds towards implementation of the evaluation results.
2008 United States National Seismic Hazard Maps
Petersen, M.D.
2008-01-01
The U.S. Geological Survey recently updated the National Seismic Hazard Maps by incorporating new seismic, geologic, and geodetic information on earthquake rates and associated ground shaking. The 2008 versions supersede those released in 1996 and 2002. These maps are the basis for seismic design provisions of building codes, insurance rate structures, earthquake loss studies, retrofit priorities, and land-use planning. Their use in design of buildings, bridges, highways, and critical infrastructure allows structures to better withstand earthquake shaking, saving lives and reducing disruption to critical activities following a damaging event. The maps also help engineers avoid costs from over-design for unlikely levels of ground motion.
New Zealand’s deadliest quake sounds alarm for cities on fault lines
Kalkan, Erol
2012-01-01
The catastrophic Christchurch earthquake is a strong reminder to engineers and scientists of the hazards posed by fault lines, both mapped and unknown, near major cities. In February 2011, the relatively moderate earthquake that struck the cities of Christchurch and Lyttelton in the Canterbury region of New Zealand's South Island surprised many with its destructive power. The magnitude 6.2 temblor killed 181 people, 118 of whom died in the collapse of a single building in the city center. The quake damaged or destroyed more than 100,000 buildings. It was the deadliest quake to strike the nation in 80 years, since the 1931 earthquake that struck the Napier and Hastings area of the North Island. The Christchurch quake was part of the aftershock sequence following the September 2010 magnitude 7.1 earthquake near Darfield, 40 kilometers west of the city. The Darfield earthquake was in a sparsely populated area and caused no loss of life. By contrast, the Christchurch earthquake was generated on a fault in close proximity to the city.
"Did you feel it?" Intensity data: A surprisingly good measure of earthquake ground motion
Atkinson, G.M.; Wald, D.J.
2007-01-01
The U.S. Geological Survey is tapping a vast new source of engineering seismology data through its "Did You Feel It?" (DYFI) program, which collects online citizen responses to earthquakes. To date, more than 750,000 responses have been compiled in the United States alone. The DYFI data make up in quantity what they may lack in scientific quality and offer the potential to resolve longstanding issues in earthquake ground-motion science. Such issues have been difficult to address due to the paucity of instrumental ground-motion data in regions of low seismicity. In particular, DYFI data provide strong evidence that earthquake stress drops, which control the strength of high-frequency ground shaking, are higher in the central and eastern United States (CEUS) than in California. Higher earthquake stress drops, coupled with lower attenuation of shaking with distance, result in stronger overall shaking over a wider area and thus more potential damage for CEUS earthquakes in comparison to those of equal magnitude in California - a fact also definitively captured with these new DYFI data and maps.
Borcherdt, R.D.; Glassmoyer, Gary; Cranswick, Edward
1989-01-01
The earthquakes of December 7, 1988, near Spitak, Armenia SSR, serve as another grim reminder of the serious hazard that earthquakes pose throughout the world. We extend our heartfelt sympathies to the families of the earthquake victims and intend that our cooperative scientific endeavours will help reduce losses in future earthquakes. Only through a better understanding of earthquake hazards can earthquake losses be reduced for all peoples in seismically active regions of the world. The tragic consequences of these earthquakes remind scientists and public officials alike of their urgent responsibilities to understand and mitigate the effects of earthquakes. On behalf of the U.S. Geological Survey, I would like to express appreciation to our Soviet colleagues for their kind invitation to participate in joint scientific and engineering studies. Without their cooperation and generous assistance, the conduct of these studies would not have been possible. This report provides seismologic and geologic data collected during the period December 21, 1988, through February 2, 1989. These data are presented in their entirety to expedite analysis of the data set for inferences regarding hazard mitigation actions, applicable not only in Armenia but also in other regions of the world exposed to high seismic risk.
ERIC Educational Resources Information Center
Selin, Helaine
1993-01-01
Describes scientific and technical accomplishments of the Chinese in developing earthquake detection procedures, paper making, and medicine and of Islamic people in developing astronomy and mechanical engineering. (PR)
NASA Astrophysics Data System (ADS)
Kázmér, Miklós; Major, Balázs; Hariyadi, Agus; Pramumijoyo, Subagyo; Ditto Haryana, Yohanes
2010-05-01
Earthquakes are among the most terrifying natural events because of their unexpected occurrence, against which no spiritual means offer protection. The only way of preserving life and property is to apply earthquake-resistant construction methods. Ancient Greek architects of public buildings applied steel clamps, embedded in lead casing, to hold together columns and masonry walls during the frequent earthquakes of the Aegean region. Elastic steel provided strength, while the plastic lead casing absorbed minor shifts of blocks without fracturing the rigid stone. The Romans invented concrete and built buildings of all sizes as single, inflexible units. The masonry surrounding and decorating the concrete core of a wall did not bear load. Concrete resisted minor shaking, yielding only to forces exceeding its fracture limits. Roman building traditions survived the Dark Ages, and 12th-century Crusader castles erected in earthquake-prone Syria survive to this day in reasonably good condition. Concrete and steel clamping persisted side by side in the Roman Empire. Concrete was used for cheap construction compared with masonry. Applying lead-encased steel increased costs and was avoided whenever possible. Columns of the various forums in Italian Pompeii mostly lack steel fittings despite being situated in a well-known earthquake-prone area. Whether the frequent recurrence of earthquakes in the Naples region was known to the inhabitants of Pompeii may be a matter of debate. Seemingly the shock of the AD 62 earthquake was not enough to prompt the application of well-known protective engineering methods during the reconstruction of the city before the AD 79 volcanic catastrophe. An independent engineering tradition developed on the island of Java (Indonesia). The mortar-less construction technique of the 8th-9th-century Hindu masonry shrines around Yogyakarta would have allowed the scattering of blocks during earthquakes. To prevent dilapidation, an intricate mortise-and-tenon system was carved into the adjacent faces of blocks.
Only the outermost layer was treated this way; the core of the shrines was made of simple rectangular blocks. The system resisted both in-plane and out-of-plane shaking quite well, as proven by the survival of many shrines for more than a millennium and by the fracturing of blocks, rather than their displacement, during the 2006 Yogyakarta earthquake. Systematic use or disuse of known earthquake-resistant techniques in any society depends on the perception of earthquake risk and on available financial resources. Earthquake-resistant construction is significantly more expensive than regular construction. Perception is influenced mostly by short individual memory and longer social memory. If the earthquake recurrence time exceeds the span of social memory, so that damaging quakes fade into the past, societies repeat the same construction mistakes again and again. The length of that memory is roughly a generation's lifetime: events recurring less often than every 25-30 years are readily forgotten, and the risk of recurrence is considered negligible, not worth the cost of safe construction practices (as with recurring flash floods in Hungary). Frequent earthquakes maintain safe construction practices, as the Javanese masonry technique did for at least two centuries and as the Fachwerk tradition on modern Aegean Samos did through 500 years of political and technological development. (OTKA K67583)
Topographical and geological amplification: case studies and engineering implications
Celebi, M.
1991-01-01
Topographical and geological amplification during past earthquakes is quantified using spectral ratios of recorded motions. Several cases are presented from the 1985 Chilean and Mexican earthquakes as well as the 1983 Coalinga and 1987 Superstition Hills (California) earthquakes. The strong motions recorded in Mexico City during the 1985 Michoacan earthquake are supplemented by ambient motions recorded within Mexico City to quantify the now well-known resonant frequencies of the Mexico City lakebed. Topographical amplification in Canal Beagle (Chile), Coalinga, and Superstition Hills (California) is quantified using ratios derived from the aftershocks following the earthquakes; a special dense array was deployed to record the aftershocks in each case. The implications of both geological and topographical amplification are discussed in light of current code provisions. The observed geological amplification has already influenced code provisions, and it is suggested that the codes should include further provisions to take amplification due to topography into account. © 1991.
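The spectral-ratio technique this abstract relies on divides the Fourier amplitude spectrum of motion recorded at a site of interest by that of a nearby reference (firm-ground) site. A minimal sketch in Python with NumPy, using synthetic records and an assumed uniform amplification factor; all signals and values here are illustrative, not data from the study:

```python
import numpy as np

def spectral_ratio(site_record, reference_record, dt):
    """Amplitude-spectrum ratio of a site record to a reference-site record.

    Both records are sampled at the same interval dt (s). A small floor is
    applied to the reference spectrum to avoid division by zero.
    """
    n = min(len(site_record), len(reference_record))
    freqs = np.fft.rfftfreq(n, d=dt)
    site_spec = np.abs(np.fft.rfft(site_record[:n]))
    ref_spec = np.abs(np.fft.rfft(reference_record[:n]))
    ratio = site_spec / np.maximum(ref_spec, 1e-12)
    return freqs, ratio

# Synthetic example: the "site" motion is the reference motion amplified 3x,
# so the spectral ratio at the signal frequency recovers the factor of 3.
dt = 0.01
t = np.arange(0.0, 10.0, dt)
reference = np.sin(2 * np.pi * 2.0 * t)   # 2 Hz reference-site motion
site = 3.0 * reference                    # uniformly amplified site motion
freqs, ratio = spectral_ratio(site, reference, dt)
```

In practice the ratio is computed from aftershock or ambient recordings at many stations, and the frequency band where it peaks identifies the resonance (as with the Mexico City lakebed).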
Haeussler, Peter J.; Schwartz, D.P.; Dawson, T.E.; Stenner, Heidi D.; Lienkaemper, J.J.; Cinti, F.; Montone, Paola; Sherrod, B.; Craw, P.
2004-01-01
On 3 November 2002, an M7.9 earthquake produced 340 km of surface rupture on the Denali and two related faults in Alaska. The rupture proceeded from west to east and began with a 40-km-long break on a previously unknown thrust fault. Estimates of surface slip on this thrust are 3-6 m. Next came the principal surface break, along ≈218 km of the Denali fault. Right-lateral offsets averaged around 5 m and increased eastward to a maximum of nearly 9 m. The fault also ruptured beneath the trans-Alaska oil pipeline, which withstood almost 6 m of lateral offset. Finally, slip turned southeastward onto the Totschunda fault. Right-lateral offsets are up to 3 m, and the surface rupture is about 76 km long. This three-part rupture ranks among the longest strike-slip events of the past two centuries. The earthquake is typical when compared to other large earthquakes on major intracontinental strike-slip faults. © 2004, Earthquake Engineering Research Institute.
Why is Probabilistic Seismic Hazard Analysis (PSHA) still used?
NASA Astrophysics Data System (ADS)
Mulargia, Francesco; Stark, Philip B.; Geller, Robert J.
2017-03-01
Even though it has never been validated by objective testing, Probabilistic Seismic Hazard Analysis (PSHA) has been widely used for almost 50 years by governments and industry in applications with lives and property hanging in the balance, such as deciding safety criteria for nuclear power plants, making official national hazard maps, developing building code requirements, and determining earthquake insurance rates. PSHA rests on assumptions now known to conflict with earthquake physics; many damaging earthquakes, including the 1988 Spitak, Armenia, event and the 2011 Tohoku, Japan, event, have occurred in regions rated relatively low-risk by PSHA hazard maps. No extant method, including PSHA, produces reliable estimates of seismic hazard. Earthquake hazard mitigation should be recognized to be inherently political, involving a tradeoff between uncertain costs and uncertain risks. Earthquake scientists, engineers, and risk managers can make important contributions to the hard problem of allocating limited resources wisely, but government officials and stakeholders must take responsibility for the risks of accidents due to natural events that exceed the adopted safety criteria.
Saving lives through better design standards
Çelebi, Mehmet; Spudich, Paul A.; Page, Robert A.; Stauffer, Peter H.
1995-01-01
Over the past 30 years, scientists have put together a more complete picture of how the ground shakes during earthquakes. They have learned that shaking near the source of earthquakes is far more severe than once thought and that soft ground shakes more strongly than hard rock. This knowledge has enabled engineers to improve design standards so that structures are better able to survive strong earthquakes. When the 1989 Loma Prieta earthquake struck, 42 people tragically lost their lives in the collapse of a half-mile-long section of the Cypress structure, an elevated double-decker freeway in Oakland, California. Yet adjacent parts of this structure withstood the magnitude 6.9 temblor. Why? The part that collapsed was built on man-made fill over soft mud, whereas adjacent sections stood on older, firmer sand and gravel deposits. Following the collapse, scientists set out instruments in the area to record the earthquake's many strong aftershocks. These instruments showed that the softer ground shook more forcefully than the firmer material, even twice as violently.
NASA Astrophysics Data System (ADS)
Fang, Yi; Huang, Yahong
2017-12-01
Evaluating sand liquefaction on the basis of design codes is an important part of geotechnical design; the results, however, sometimes fail to match actual earthquake damage. Based on the damage from the Tangshan earthquake and on engineering geological conditions, three typical sites were chosen, the sand liquefaction probability at each was evaluated using the method in the Code for Seismic Design of Buildings, and the results were compared with the liquefaction actually observed in the earthquake. The comparison shows that the difference between code-based liquefaction assessments and actual earthquake damage arises mainly from two sources. The first is the disparity between the seismic fortification intensity and the actual ground shaking, together with changes in groundwater level, the thickness of the overlying non-liquefied soil layer, local site effects, and personal error. The second is the judgment methods in the codes themselves: although they have a certain universality, the limitations of their underlying data and the qualitative character of the judgment formulas also contribute to the difference.
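As a rough illustration of the kind of code-based judgment the abstract evaluates, the sketch below implements an SPT-style screening in which a layer is judged liquefiable when its measured blow count falls below a critical count that depends on sand depth and groundwater depth. The functional form is modeled on the Chinese seismic design code's judgment formula, but the constants n0 and beta are assumed placeholders, not values taken from the code or from the paper:

```python
import math

def liquefiable(n_measured, depth_s, depth_w, clay_pct, n0=10.0, beta=0.95):
    """Code-style SPT liquefaction screening (illustrative only).

    A layer screens as liquefiable when the measured SPT blow count
    n_measured falls below a critical count that grows with sand depth
    depth_s (m) and the reference count n0 for the design intensity, and
    shrinks with groundwater depth depth_w (m) and clay content clay_pct (%).
    n0 and beta are assumed placeholder values, not code values.
    """
    clay = max(clay_pct, 3.0)  # floor the clay content at 3%
    n_critical = (n0 * beta
                  * (math.log(0.6 * depth_s + 1.5) - 0.1 * depth_w)
                  * math.sqrt(3.0 / clay))
    return n_measured < n_critical

# A shallow, clean sand with a high water table and a low blow count screens
# as liquefiable; the same sand with a deep water table does not.
shallow_water = liquefiable(n_measured=5, depth_s=4.0, depth_w=1.0, clay_pct=3.0)
deep_water = liquefiable(n_measured=5, depth_s=4.0, depth_w=10.0, clay_pct=3.0)
```

The abstract's point is precisely that such screening formulas, however convenient, can disagree with observed damage when the design intensity, water table, or site conditions differ from the code's assumptions.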
Ji, C.; Helmberger, D.V.; Wald, D.J.
2004-01-01
Slip histories for the 2002 M7.9 Denali fault, Alaska, earthquake are derived rapidly from global teleseismic waveform data. Three models, developed in successive phases, progressively improve the fit to the waveform data and the recovery of rupture details. In the first model (Phase I), analogous to an automated solution, a simple fault plane is fixed based on the preliminary Harvard Centroid Moment Tensor mechanism and the epicenter provided by the Preliminary Determination of Epicenters. This model is then updated (Phase II) by implementing a more realistic fault geometry inferred from Digital Elevation Model topography, and refined further (Phase III) by using calibrated P-wave and SH-wave arrival times derived from modeling of the nearby 2002 M6.7 Nenana Mountain earthquake. These models are used to predict the peak ground velocity and the shaking intensity field in the fault vicinity. The procedure for estimating local strong motion could be automated and used for global real-time earthquake shaking and damage assessment. © 2004, Earthquake Engineering Research Institute.
Rapid earthquake hazard and loss assessment for Euro-Mediterranean region
NASA Astrophysics Data System (ADS)
Erdik, Mustafa; Sesetyan, Karin; Demircioglu, Mine; Hancilar, Ufuk; Zulfikar, Can; Cakti, Eser; Kamer, Yaver; Yenidogan, Cem; Tuzun, Cuneyt; Cagnan, Zehra; Harmandar, Ebru
2010-10-01
The near-real-time estimation of ground shaking and losses after a major earthquake in the Euro-Mediterranean region was performed in the framework of the Joint Research Activity 3 (JRA-3) component of the EU FP6 project "Network of Research Infrastructures for European Seismology" (NERIES). The approach consists of finding the most likely location of the earthquake source by estimating the fault rupture parameters through rapid inversion of data from on-line regional broadband stations, and of estimating the spatial distribution of selected site-specific ground motion parameters at engineering bedrock through region-specific ground motion prediction equations (GMPEs) or physical simulation of ground motion. Using the Earthquake Loss Estimation Routine (ELER) software, the multi-level methodology developed for real-time loss estimation is capable of incorporating regional variability and the sources of uncertainty stemming from GMPEs, fault finiteness, site modifications, the inventory of physical and social elements exposed to earthquake hazard, and the associated vulnerability relationships.
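The GMPE step in such a pipeline can be caricatured as a single magnitude-and-distance scaling law evaluated at each grid point. The sketch below is a generic, illustrative GMPE, not the region-specific equations used in ELER/NERIES; every coefficient is an assumption chosen only to show the structure:

```python
import math

def gmpe_pga(magnitude, r_epi_km, c0=-3.5, c1=1.0, c2=-1.3, h_km=6.0):
    """Generic ground-motion prediction equation (GMPE) sketch.

    Returns a median peak ground acceleration (in g) from a simple
    scaling law of the common form
        ln PGA = c0 + c1*M + c2*ln(sqrt(R^2 + h^2))
    where M is magnitude, R the epicentral distance (km), and h a fixed
    depth term. The coefficients c0, c1, c2, and h are illustrative
    placeholders, not fitted regional values.
    """
    r_eff = math.sqrt(r_epi_km ** 2 + h_km ** 2)
    return math.exp(c0 + c1 * magnitude + c2 * math.log(r_eff))

# Any sensible GMPE predicts shaking that decays with distance and grows
# with magnitude; checking those two monotonicities is a basic sanity test.
near = gmpe_pga(7.0, 10.0)
far = gmpe_pga(7.0, 100.0)
```

A real implementation would add site-condition terms, inter- and intra-event uncertainty, and would evaluate several spectral ordinates, which is where the "sources of uncertainty" the abstract lists enter the loss calculation.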
Lee, William H K.
2016-01-01
Rotational seismology is an emerging study of all aspects of rotational motions induced by earthquakes, explosions, and ambient vibrations. It is of interest to several disciplines, including seismology, earthquake engineering, geodesy, and earth-based detection of Einstein's gravitational waves. Rotational effects of seismic waves, together with rotations caused by soil-structure interaction, have been observed for centuries (e.g., rotated chimneys, monuments, and tombstones). Figure 1a shows the rotated monument to George Inglis observed after the 1897 Great Shillong earthquake. This monument had the form of an obelisk rising over 19 metres from a 4 metre base. During the earthquake, the top part broke off and the remnant of some 6 metres rotated about 15° relative to the base. The study of rotational seismology began only recently, when sensitive rotational sensors became available thanks to advances in aeronautical and astronomical instrumentation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shinozuka, M.; Rose, A.; Eguchi, R.T.
1998-12-31
This monograph examines the potential effects of a repeat of the New Madrid earthquake on the metropolitan Memphis area. The authors developed a case study of the impact of such an event on the electric power system and analyzed how this disruption would affect society. In nine chapters and 189 pages, the book traces the impacts of catastrophic earthquakes through a curtailment of utility lifeline services to the host regional economy and beyond. The monograph's chapters include: modeling the Memphis economy; seismic performance of electric power systems; spatial analysis techniques for linking physical damage to economic functions; earthquake vulnerability and emergency preparedness among businesses; direct economic impacts; regional economic impacts; socioeconomic and interregional impacts; lifeline risk reduction; and public policy formulation and implementation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Savich, A. I., E-mail: office@geodyn.ru; Burdina, N. A., E-mail: nina-burdina@mail.ru
Analysis of published data on the fundamental parameters of actual accelerograms of strong earthquakes, having peak ground acceleration A_max, predominant period T_pr, and duration τ_0.5 at 0.5·A_max, determined that, for earthquakes of intensity greater than 6.5-7.0, the relationship between these quantities is sufficiently well described by the parameters B = A·T·τ and C = A·τ·T^(−1.338), the former of which depends little on earthquake intensity I and is almost completely determined by the earthquake magnitude, while the latter, on the contrary, depends weakly on magnitude and is determined principally by I. Methods are proposed for using the parameters B and C to improve the reliability of determining the parameters of accelerograms used to calculate the seismic resistance of hydraulic engineering facilities.
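The combined parameters B and C follow directly from the three accelerogram quantities. A small helper using the definitions quoted in the abstract; the input values in the example are invented for illustration, and the units of B and C are whatever units A_max, T_pr, and τ_0.5 carry:

```python
def derived_params(a_max, t_pr, tau_half):
    """Compute the combined accelerogram parameters described above:

        B = A * T * tau            (magnitude-controlled)
        C = A * tau * T**(-1.338)  (intensity-controlled)

    where A is the peak ground acceleration a_max, T the predominant
    period t_pr (s), and tau the duration tau_half at 0.5*a_max (s).
    """
    b = a_max * t_pr * tau_half
    c = a_max * tau_half * t_pr ** (-1.338)
    return b, c

# Invented example record: a_max = 2.0 m/s^2, T_pr = 0.5 s, tau_0.5 = 10 s.
b, c = derived_params(2.0, 0.5, 10.0)
```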
Towards Resilience Science - Research on the Nankai trough seismogenic zone -
NASA Astrophysics Data System (ADS)
Kaneda, Yoshiyuki; Shiraki, Wataru; Fujisawa, Kazuhito; Tokozakura, Eiji
2017-04-01
For the last few decades, many destructive earthquakes and tsunamis have occurred around the world. Based on lessons learnt from the 2004 Sumatra earthquake/tsunami, the 2010 Chilean earthquake/tsunami, and the 2011 East Japan earthquake/tsunami, we recognized the importance of real-time monitoring of earthquakes and tsunamis for disaster mitigation. More recently, the Kumamoto earthquake occurred in 2016; this destructive earthquake showed that multiple strong motions, including a foreshock and the main shock, can severely damage buildings. Furthermore, we recognize that recovery and revival are very important and difficult: in the Tohoku area damaged by large tsunamis, recovery was still in progress more than five years after the 2011 Tohoku earthquake. We therefore have to prepare plans before the next destructive disaster, such as a Nankai trough megathrust earthquake. As one of the disaster countermeasures, we propose a Disaster Mitigation Science, which includes engineering, science, medicine, and social sciences such as sociology, informatics, law, literature, art, and psychology. For urgent evacuations, several kinds of real-time monitoring systems exist, such as DART buoys and ocean-floor networks. In particular, real-time monitoring using multiple kinds of sensors, such as accelerometers, broadband seismometers, pressure gauges, differential pressure gauges, hydrophones, and thermometers, is indispensable for earthquake/tsunami monitoring. Using multiple kinds of sensors, we can also analyze and estimate broadband crustal activity around megathrust earthquake seismogenic zones. We therefore deployed DONET1 and DONET2, dense ocean-floor networks around the Nankai trough, southwestern Japan. We will explain Resilience Science and the real-time monitoring systems around the Nankai trough seismogenic zone.
A Virtual Tour of the 1868 Hayward Earthquake in Google Earth™
NASA Astrophysics Data System (ADS)
Lackey, H. G.; Blair, J. L.; Boatwright, J.; Brocher, T.
2007-12-01
The 1868 Hayward earthquake has been overshadowed by the subsequent 1906 San Francisco earthquake that destroyed much of San Francisco. Nonetheless, a modern recurrence of the 1868 earthquake would cause widespread damage to the densely populated Bay Area, particularly in the east Bay communities that have grown up virtually on top of the Hayward fault. Our concern is heightened by paleoseismic studies suggesting that the recurrence interval for the past five earthquakes on the southern Hayward fault is 140 to 170 years. Our objective is to build an educational web site that illustrates the cause and effect of the 1868 earthquake drawing on scientific and historic information. We will use Google Earth™ software to visually illustrate complex scientific concepts in a way that is understandable to a non-scientific audience. This web site will lead the viewer from a regional summary of the plate tectonics and faulting system of western North America, to more specific information about the 1868 Hayward earthquake itself. Text and Google Earth™ layers will include modeled shaking of the earthquake, relocations of historic photographs, reconstruction of damaged buildings as 3-D models, and additional scientific data that may come from the many scientific studies conducted for the 140th anniversary of the event. Earthquake engineering concerns will be stressed, including population density, vulnerable infrastructure, and lifelines. We will also present detailed maps of the Hayward fault, measurements of fault creep, and geologic evidence of its recurrence. Understanding the science behind earthquake hazards is an important step in preparing for the next significant earthquake. We hope to communicate to the public and students of all ages, through visualizations, not only the cause and effect of the 1868 earthquake, but also modern seismic hazards of the San Francisco Bay region.
Assessing the Utility of and Improving USGS Earthquake Hazards Program Products
NASA Astrophysics Data System (ADS)
Gomberg, J. S.; Scott, M.; Weaver, C. S.; Sherrod, B. L.; Bailey, D.; Gibbons, D.
2010-12-01
A major focus of the USGS Earthquake Hazards Program (EHP) has been the development and implementation of products and information meant to improve earthquake hazard assessment, mitigation and response for a myriad of users. Many of these products rely on the data and efforts of the EHP and its partner scientists who are building the Advanced National Seismic System (ANSS). We report on a project meant to assess the utility of many of these products and information, conducted collaboratively by EHP scientists and Pierce County Department of Emergency Management staff. We have conducted focus group listening sessions with members of the engineering, business, medical, media, risk management, and emergency response communities as well as participated in the planning and implementation of earthquake exercises in the Pacific Northwest. Thus far we have learned that EHP and ANSS products satisfy many of the needs of engineers and some planners, and information is widely used by media and the general public. However, some important communities do not use these products despite their intended application for their purposes, particularly county and local emergency management and business communities. We have learned that products need to convey more clearly the impact of earthquakes, in everyday terms. Users also want products (e.g. maps, forecasts, etc.) that can be incorporated into tools and systems they use regularly. Rather than simply building products and posting them on websites, products need to be actively marketed and training provided. We suggest that engaging users prior to and during product development will enhance their usage and effectiveness.
DOT National Transportation Integrated Search
1995-01-01
In order to obtain regional perspective on the major problems and issues to be addressed, a series of nine regional round tables were convened across the nation. One of these was held in Norfolk, VA, on June 11, 1993. The primary focus of this meetin...
Changes in Science Teachers' Conceptions and Connections of STEM Concepts and Earthquake Engineering
ERIC Educational Resources Information Center
Cavlazoglu, Baki; Stuessy, Carol
2017-01-01
The authors find justification for integrating science, technology, engineering, and mathematics (STEM) in the complex problems that today's students will face as tomorrow's STEM professionals. Teachers with individual subject-area specialties in the STEM content areas have limited experience in integrating STEM. In this study, the authors…
Medical equipment donations in Haiti: flaws in the donation process.
Dzwonczyk, Roger; Riha, Chris
2012-04-01
The magnitude 7.0 earthquake that struck Haiti on 12 January 2010 devastated the capital city of Port-au-Prince and the surrounding area. The area's hospitals suffered major structural damage and material losses. Project HOPE sought to rebuild the medical equipment and clinical engineering capacity of the country. A team of clinical engineers from the United States of America and Haiti conducted an inventory and assessment of medical equipment at seven public hospitals affected by the earthquake. The team found that only 28% of the equipment was working properly and in use for patient care; another 28% was working, but lay idle for technical reasons; 30% was not working, but repairable; and 14% was beyond repair. The proportion of equipment in each condition category was similar regardless of whether the equipment was present prior to the earthquake or was donated afterwards. This assessment points out the flaws that existed in the medical equipment donation process and reemphasizes the importance of the factors, as delineated by the World Health Organization more than a decade ago, that constitute a complete medical equipment donation.
2D soil and engineering-seismic bedrock modeling of eastern part of Izmir inner bay/Turkey
NASA Astrophysics Data System (ADS)
Pamuk, Eren; Akgün, Mustafa; Özdağ, Özkan Cevdet; Gönenç, Tolga
2017-02-01
Soil-bedrock models serve as the basis for describing the combined behaviour of soil and earthquake shaking, and the medium described as bedrock is itself divided into engineering and seismic bedrock. S-wave velocity (Vs) is the basis of these definitions: media with Vs < 760 m/s are called soil and faster media are called bedrock; within bedrock, layers with Vs between 760 m/s and 3000 m/s are called engineering bedrock, and those above 3000 m/s seismic bedrock. The lateral topography of the interface between engineering and seismic bedrock affects how the earthquake effect changes at the soil surface, so 2D soil-bedrock models are needed to estimate the ground motion that could occur there. In this study, surface-wave methods and the microgravity method were used to build 2D soil-bedrock models in the eastern part of İzmir bay. First, velocity values were obtained from surface-wave surveys; density values were then calculated from these velocities using empirical relations, and 2D soil-bedrock models were constructed from both Vs and the density variation by using these densities in the microgravity model. The models show that the soil is 300-400 m thick and consists of more than one layer, especially in the parts closer to the bay, and that the soil thickness changes in the N-S direction. Geologically, the engineering bedrock in the study area is thought to consist of the Bornova melange and the seismic bedrock of the Menderes massif; the geophysical results also show that the Neogene limestone and andesite units at 200-400 m depth exhibit engineering-bedrock character.
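The Vs thresholds quoted above amount to a simple three-way classification, which can be stated as a tiny function. The handling of the exact boundary values 760 and 3000 m/s is an assumption, since the abstract does not specify it:

```python
def classify_medium(vs_m_per_s):
    """Classify a layer from its shear-wave velocity Vs, using the
    thresholds given in the abstract: soil below 760 m/s, engineering
    bedrock from 760 to 3000 m/s, seismic bedrock above 3000 m/s.
    Boundary values are assigned to the lower class by assumption."""
    if vs_m_per_s < 760.0:
        return "soil"
    if vs_m_per_s <= 3000.0:
        return "engineering bedrock"
    return "seismic bedrock"
```

Applied to a 1D velocity profile, such a classifier locates the soil/engineering-bedrock and engineering/seismic-bedrock interfaces whose 2D geometry the study maps with surface-wave and microgravity data.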
Earthquake and ambient vibration monitoring of the steel-frame UCLA factor building
Kohler, M.D.; Davis, P.M.; Safak, E.
2005-01-01
Dynamic property measurements of the moment-resisting steel-frame University of California, Los Angeles, Factor building are being made to assess how forces are distributed over the building. Fourier amplitude spectra have been calculated from several intervals of ambient vibrations, a 24-hour period of strong winds, and from the 28 March 2003 Encino, California (ML = 2.9), the 3 September 2002 Yorba Linda, California (ML = 4.7), and the 3 November 2002 Central Alaska (Mw = 7.9) earthquakes. Measurements made from the ambient vibration records show that the first-mode frequency of horizontal vibration is between 0.55 and 0.6 Hz. The second horizontal mode has a frequency between 1.6 and 1.9 Hz. In contrast, the first-mode frequencies measured from earthquake data are about 0.05 to 0.1 Hz lower than those corresponding to ambient vibration recordings, indicating softening of the soil-structure system as amplitudes become larger. The frequencies revert to pre-earthquake levels within five minutes of the Yorba Linda earthquake. Shaking due to strong winds that occurred during the Encino earthquake dominates the frequency decrease, which correlates in time with the duration of the strong winds. The first shear wave recorded from the Encino and Yorba Linda earthquakes takes about 0.4 sec to travel up the 17-story building. © 2005, Earthquake Engineering Research Institute.
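The first-mode frequencies quoted above come from picking peaks in Fourier amplitude spectra of the recordings. A minimal sketch of that identification step on a synthetic ambient-vibration record; the 0.57 Hz mode, noise level, and record length are invented for illustration:

```python
import numpy as np

def dominant_frequency(record, dt):
    """Return the frequency (Hz) of the largest Fourier amplitude peak,
    ignoring the zero-frequency term, as a crude first-mode estimate from
    a vibration record sampled at interval dt (s)."""
    spec = np.abs(np.fft.rfft(record))
    freqs = np.fft.rfftfreq(len(record), d=dt)
    return freqs[1 + int(np.argmax(spec[1:]))]

# Synthetic "building roof" record: a 0.57 Hz first mode plus weaker noise,
# 200 s long at 50 samples/s (frequency resolution 1/200 s = 0.005 Hz).
rng = np.random.default_rng(0)
dt = 0.02
t = np.arange(0.0, 200.0, dt)
record = np.sin(2 * np.pi * 0.57 * t) + 0.2 * rng.standard_normal(t.size)
f1 = dominant_frequency(record, dt)
```

Resolving the 0.05-0.1 Hz softening reported in the study requires a frequency resolution finer than that shift, i.e. record windows of tens of seconds or longer, which is why the analysis uses extended intervals of ambient and earthquake data.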
NASA Astrophysics Data System (ADS)
Sugimoto, Megumi
2015-04-01
The 11 March 2011 Tohoku earthquake and its tsunami killed 18,508 people, including the missing (National Police Agency report as of April 2014), and led to the Level 7 accident at TEPCO's Fukushima Dai-ichi nuclear power station in Japan. The problems revealed can be viewed as a combination of risk-management, risk-communication, and geoethics issues. Japan's preparations for earthquakes and tsunamis are based on the magnitude of the anticipated earthquake for each region. The government organization coordinating the estimation of anticipated earthquakes is the Headquarters for Earthquake Research Promotion (HERP), under the Ministry of Education, Culture, Sports, Science and Technology (MEXT). Japan's disaster mitigation system is depicted schematically as consisting of three layers: seismology, civil engineering, and disaster mitigation planning. This research argues that geoscience students should study geoethics as part of their education, in connection with the Tohoku earthquake and the Level 7 accident at TEPCO's Fukushima Dai-ichi nuclear power station. Only when they become practicing professionals will they be faced with real geoethical dilemmas. A crisis such as the 2011 earthquake, tsunami, and Fukushima Dai-ichi nuclear accident will force many geoscientists to suddenly confront previously unanticipated geoethics and risk-communication issues. One hopes that previous training will help them to make appropriate decisions under stress. We name this "decision science".
Boatwright, J.; Bundock, H.; Seekins, L.C.
2006-01-01
We derive and test relations between the Modified Mercalli Intensity (MMI) and the pseudo-acceleration response spectra at 1.0 and 0.3 s - SA(1.0 s) and SA(0.3 s) - in order to map response spectral ordinates for the 1906 San Francisco earthquake. Recent analyses of intensity have shown that MMI ≥ 6 correlates both with peak ground velocity and with response spectra for periods from 0.5 to 3.0 s. We use these recent results to derive a linear relation between MMI and log SA(1.0 s), and we refine this relation by comparing the SA(1.0 s) estimated from Boatwright and Bundock's (2005) MMI map for the 1906 earthquake to the SA(1.0 s) calculated from recordings of the 1989 Loma Prieta earthquake. South of San Jose, the intensity distributions for the 1906 and 1989 earthquakes are remarkably similar, despite the difference in magnitude and rupture extent between the two events. We use recent strong motion regressions to derive a relation between SA(1.0 s) and SA(0.3 s) for a M7.8 strike-slip earthquake that depends on soil type, acceleration level, and source distance. We test this relation by comparing SA(0.3 s) estimated for the 1906 earthquake to SA(0.3 s) calculated from recordings of both the 1989 Loma Prieta and 1994 Northridge earthquakes, as functions of distance from the fault. © 2006, Earthquake Engineering Research Institute.
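The paper's central tool is a linear relation between MMI and log SA(1.0 s), which is invertible so that an intensity map can be converted into spectral ordinates. A sketch of that structure with placeholder coefficients; the fitted values are in the paper and are not reproduced here:

```python
import math

def mmi_from_sa(sa_1s_g, a=2.0, b=7.0):
    """Linear intensity-log(spectral acceleration) relation of the kind
    the study derives: MMI = a*log10(SA(1.0 s)) + b, with SA in g.
    The coefficients a and b are placeholders, not the fitted values."""
    return a * math.log10(sa_1s_g) + b

def sa_from_mmi(mmi, a=2.0, b=7.0):
    """Inverse of mmi_from_sa: the SA(1.0 s) in g implied by an MMI value."""
    return 10.0 ** ((mmi - b) / a)
```

Because the relation is monotonic and invertible, an observed MMI contour maps to a unique SA(1.0 s) level, which is what allows a historical intensity map to be re-expressed as a response-spectral map.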
Marano, K.D.; Wald, D.J.; Allen, T.I.
2010-01-01
This study presents a quantitative and geospatial description of global losses due to earthquake-induced secondary effects, including landslide, liquefaction, tsunami, and fire for events during the past 40 years. These processes are of great importance to the US Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system, which is currently being developed to deliver rapid earthquake impact and loss assessments following large/significant global earthquakes. An important question is how dominant are losses due to secondary effects (and under what conditions, and in which regions)? Thus, which of these effects should receive higher priority research efforts in order to enhance PAGER's overall assessment of earthquakes losses and alerting for the likelihood of secondary impacts? We find that while 21.5% of fatal earthquakes have deaths due to secondary (non-shaking) causes, only rarely are secondary effects the main cause of fatalities. The recent 2004 Great Sumatra-Andaman Islands earthquake is a notable exception, with extraordinary losses due to tsunami. The potential for secondary hazards varies greatly, and systematically, due to regional geologic and geomorphic conditions. Based on our findings, we have built country-specific disclaimers for PAGER that address potential for each hazard (Earle et al., Proceedings of the 14th World Conference of the Earthquake Engineering, Beijing, China, 2008). We will now focus on ways to model casualties from secondary effects based on their relative importance as well as their general predictability. © Springer Science+Business Media B.V. 2009.
Mineral resources of the Cabinet Mountains Wilderness, Lincoln and Sanders Counties, Montana
Lindsey, David A.; Wells, J.D.; Van Loenen, R. E.; Banister, D.P.; Welded, R.D.; Zilka, N.T.; Schmauch, S.W.
1978-01-01
This report describes the differential array of seismometers recently installed at the Hollister, California, Municipal Airport. Such an array of relatively closely spaced seismometers has already been installed in El Centro, where it provided useful information for both engineering and seismological applications from the 1979 Imperial Valley earthquake. Differential ground motions, principally due to horizontally propagating surface waves, are important in determining the stresses in extended structures such as large mat foundations for nuclear power stations, dams, bridges, and pipelines. Further, analyses of the differential-array records of the 1979 Imperial Valley earthquake have demonstrated the utility of short-baseline array data in tracking the progress of an earthquake's rupture front.
1987-09-01
and Sponheuer, W., 1969. Scale of Seismic Intensity: Proc. Fourth World Conf. on Earthquake Engineering, Santiago, Chile. Murphy, J. R., and O'Brien, L... [garbled table of Modified Mercalli intensity, predominant periods, durations, and V/H ratios of acceleration, velocity, and displacement; values not recoverable]
Brown, R.D.
1990-01-01
The geologic limitations of some building sites can be overcome, in part, by skilled engineering and expensive construction practices. But the costs can be prohibitively high, and the solutions are not always completely effective. In "earthquake country," history has shown that costs are highest and risk factors most uncertain in a few easily recognized settings: unstable hill slopes, land at the edge of rapidly eroding sea cliffs, lowlands underlain by saturated estuarine mud or fill, and areas near faults capable of producing magnitude 7 or greater earthquakes. Safety immediately after an earthquake is also a concern in these places, for extreme damage and ground distortion may impede or prevent timely access by emergency equipment.
Examining Science Teachers' Argumentation in a Teacher Workshop on Earthquake Engineering
NASA Astrophysics Data System (ADS)
Cavlazoglu, Baki; Stuessy, Carol
2018-02-01
The purpose of this study was to examine changes in the quality of science teachers' argumentation as a result of their engagement in a teacher workshop on earthquake engineering emphasizing distributed learning approaches, which included concept mapping, collaborative game playing, and group lesson planning. The participants were ten high school science teachers from US high schools who elected to attend the workshop. At the beginning and end of the workshop, teachers engaged in small-group concept mapping exercises with other teachers. Researchers audio-recorded individual teachers' argumentative statements about the inclusion of earthquake engineering concepts in their concept maps, which were then analyzed to reveal the quality of teachers' argumentation. Toulmin's argumentation model formed the framework for designing a classification schema to analyze the quality of participants' argumentative statements. While the analysis of differences in pre- and post-workshop concept mapping exercises revealed that the number of argumentative statements did not change significantly, the quality of participants' argumentation did increase significantly. As these differences occurred concurrently with the distributed learning approaches used throughout the workshop, these results provide evidence for using distributed learning approaches in professional development workshop activities to increase the quality of science teachers' argumentation. Additionally, these results support the use of concept mapping as a cognitive scaffold to organize participants' knowledge, to facilitate the presentation of argumentation, and to serve as a research tool for providing evidence of teachers' argumentation skills.
Geological-Seismological Evaluation of Earthquake Hazards at J. Strom Thurmond Dam
1993-08-01
14-08-001-14553, 92 p. Talwani, P., Stevenson, D., Sauber, J., Rastogi, B.K., Drew, A., Chiang, J. and Amick, D., 1978. Seismicity studies at Lake...crust in the eastern United States, final report, Engineers International, Inc., prepared for U.S. Nuclear Regulatory Commission, 75 pp. Sauber, Jeanne...studies following the 2 August 1974 South Carolina earthquake, Earthquake Notes, 46, No. 4, 21-28. Talwani, P., D. Stevenson, J. Chiang, J. Sauber, and D. Amick
NASA Astrophysics Data System (ADS)
Afifuddin, M.; Panjaitan, M. A. R.; Ayuna, D.
2017-02-01
Earthquakes are among the most dangerous, destructive and unpredictable natural hazards, and can leave everything up to a few hundred kilometres away in complete destruction in seconds. Indonesia has a unique position as an earthquake-prone country: it is the place of interaction of three tectonic plates, namely the Indo-Australian, Eurasian and Pacific plates. Banda Aceh is one of the cities located in an earthquake-prone area. Due to the vulnerable conditions of Banda Aceh, some efforts have been exerted to reduce these unfavourable conditions. Many aspects have been addressed, from community awareness up to engineering solutions; one of them is that all buildings constructed in the city should be designed as earthquake-resistant buildings. The objectives of this research are to observe the response of a reinforced concrete structure to several types of earthquake load, and to assess the performance of the structure after the earthquake loads are applied. After the 2004 tsunami many buildings have been built; one of them is a hotel located at Simpang Lima. The hotel is a reinforced concrete structure with a height of 34.95 meters and a total building area of 8872.5 m2. At the time of the study, it was the tallest building in Banda Aceh.
Caribbean Engineer and Environmental Conference (CSL Issue Paper, Volume 14-08, November 2008)
2008-11-01
Leadership, conducted a successful four-day Engineer and Environment Conference between 2 and 5 September 2008 in San Juan, Puerto Rico. The purpose of...CONFERENCE SCHEDULE The Caribbean Engineer and Environmental Conference was conducted at the El San Juan Hotel in San Juan, Puerto...i.e. tsunamis, earthquakes, and volcanoes • Officials must develop ways to overcome communication problems between civil and military assets • Overall
An earthquake strength scale for the media and the public
Johnston, A.C.
1990-01-01
A local engineer, E.P. Hailey, pointed this problem out to me shortly after the Loma Prieta earthquake. He felt that three problems limited the usefulness of magnitude in describing an earthquake to the public: (1) most people don't understand that it is not a linear scale; (2) of those who do realize the scale is not linear, very few understand the difference of a factor of ten in ground motion and 32 in energy release between points on the scale; and (3) even those who understand the first two points have trouble putting a given magnitude value into terms they can relate to. In summary, Mr. Hailey wondered why seismologists can't come up with an earthquake scale that doesn't confuse everyone and that conveys a sense of true relative size. Here, then, is my attempt to construct such a scale.
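The two scale relationships Hailey's second point refers to can be made concrete in a few lines. This is a minimal sketch of the standard magnitude arithmetic (each unit of magnitude corresponds to a factor of 10 in ground-motion amplitude and, via log10 E ∝ 1.5 M, roughly 32 in energy), not code from the article:

```python
def amplitude_ratio(m1, m2):
    """Relative ground-motion amplitude: each magnitude unit is a factor of 10."""
    return 10 ** (m2 - m1)

def energy_ratio(m1, m2):
    """Relative energy release: log10(E) scales as 1.5*M, so ~31.6x per unit."""
    return 10 ** (1.5 * (m2 - m1))

# A magnitude 7 event versus a magnitude 5 event:
print(amplitude_ratio(5, 7))         # 100x the ground motion
print(round(energy_ratio(5, 7)))     # ~1000x the energy release
```

This is exactly why a one-unit difference "doesn't sound like much" to the public while representing an enormous difference in size.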
Geotechnical aspects of the January 2003 Tecomán, Mexico, earthquake
Wartman, Joseph; Rodriguez-Marek, Adrian; Macari, Emir J.; Deaton, Scott; Ramirez-Reynaga, Martín; Ochoa, Carlos N.; Callan, Sean; Keefer, David; Repetto, Pedro; Ovando-Shelley, Efraín
2005-01-01
Ground failure was the most prominent geotechnical engineering feature of the 21 January 2003 Mw 7.6 Tecomán earthquake. Ground failure impacted structures, industrial facilities, roads, water supply canals, and other critical infrastructure in the state of Colima and in parts of the neighboring states of Jalisco and Michoacán. Landslides and soil liquefaction were the most common types of ground failure, followed by seismic compression of unsaturated materials. Reinforced earth structures generally performed well during the earthquake, though some structures experienced permanent lateral deformations of up to 10 cm. Different ground improvement techniques had been used to enhance the liquefaction resistance of several sites in the region, all of which performed well and exhibited no signs of damage or significant ground deformation. Earth dams in the region experienced some degree of permanent deformation but remained fully functional after the earthquake.
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Rourke, M.J.; Ballantyne, D.
The document focuses on earthquake damage to water and oil pipelines, water supply, and water treatment following the 22 April 1991 Costa Rica Earthquake. The moment magnitude 7.5 earthquake occurred approximately 40 km south-southwest of Limon, and resulted in a coseismic uplift of up to 1.5 meters along Costa Rica's east coast. The report also provides an overview of the engineering aspects of the event and recovery activities. Turbidity in the watershed which provides Limon's primary water supply increased to as high as 2.4 percent solids, making it extremely difficult to treat. In addition, the water treatment plant was damaged by the earthquake. Cast iron, ductile iron and reinforced concrete cylinder pipe water transmission lines were damaged by both wave propagation and permanent ground deformation. Water distribution piping, also including PVC and galvanized iron, was similarly impacted. Documentation and evaluation of that damage is described, and compared with empirical estimates from previous earthquakes. Twin 150 mm (6 in), 100 km long oil transmission lines suffered only a single failure from wrinkling. A description of the pipelines and the failure is provided.
Comparisons of the NGA ground-motion relations
Abrahamson, N.; Atkinson, G.; Boore, D.; Bozorgnia, Y.; Campbell, K.; Chiou, B.; Idriss, I.M.; Silva, W.; Youngs, R.
2008-01-01
The data sets, model parameterizations, and results from the five NGA models for shallow crustal earthquakes in active tectonic regions are compared. A key difference in the data sets is the inclusion or exclusion of aftershocks. A comparison of the median spectral values for strike-slip earthquakes shows that they are within a factor of 1.5 for magnitudes between 6.0 and 7.0 for distances less than 100 km. The differences increase to a factor of 2 for M5 and M8 earthquakes, for buried ruptures, and for distances greater than 100 km. For soil sites, the differences in the modeling of soil/sediment depth effects increase the range in the median long-period spectral values for M7 strike-slip earthquakes to a factor of 3. The five models have similar standard deviations for M6.5-M7.5 earthquakes for rock sites and for soil sites at distances greater than 50 km. Differences in the standard deviations of up to 0.2 natural log units for moderate magnitudes at all distances and for large magnitudes at short distances result from the treatment of the magnitude dependence and the effects of nonlinear site response on the standard deviation. © 2008, Earthquake Engineering Research Institute.
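The "factor of 1.5" and "0.2 natural log units" comparisons above are straightforward to compute. A small sketch with invented median values (illustrative only, not the actual NGA model outputs):

```python
import math

# Hypothetical median spectral accelerations (g) from five GMPEs for one
# M6.5 strike-slip scenario at 20 km (made-up values for illustration).
medians = [0.21, 0.18, 0.24, 0.20, 0.27]

# Model-to-model spread, expressed as a max/min factor as in the abstract:
spread = max(medians) / min(medians)
print(f"max/min spread: {spread:.2f}")   # within the factor-of-1.5 band

# A 0.2 natural-log-unit difference in sigma corresponds to this factor
# in ground-motion units:
print(f"{math.exp(0.2):.2f}")
```

exp(0.2) ≈ 1.22, which is why even a modest difference in log-unit sigma matters for hazard estimates at low exceedance probabilities.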
Wald, D.; Lin, K.-W.; Porter, K.; Turner, Loren
2008-01-01
When a potentially damaging earthquake occurs, utility and other lifeline managers, emergency responders, and other critical users have an urgent need for information about the impact on their particular facilities so they can make appropriate decisions and take quick actions to ensure safety and restore system functionality. ShakeMap, a tool used to portray the extent of potentially damaging shaking following an earthquake, on its own can be useful for emergency response, loss estimation, and public information. However, to take full advantage of the potential of ShakeMap, we introduce ShakeCast. ShakeCast facilitates the complicated assessment of potential damage to a user's widely distributed facilities by comparing the complex shaking distribution with the potentially highly variable damageability of their inventory to provide a simple, hierarchical list and maps of structures or facilities most likely impacted. ShakeCast is a freely available, post-earthquake situational awareness application that automatically retrieves earthquake shaking data from ShakeMap, compares intensity measures against users' facilities, sends notifications of potential damage to responsible parties, and generates facility damage maps and other Web-based products for both public and private emergency managers and responders. © 2008, Earthquake Engineering Research Institute.
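The core screening step ShakeCast performs, comparing an intensity measure at each facility against that facility's damage-state thresholds and ranking the results, can be sketched in a few lines. The facility names, MMI thresholds, and alert levels below are hypothetical, not ShakeCast's actual data model:

```python
# Hypothetical facility inventory: name plus MMI thresholds for the
# (yellow, orange, red) alert states (illustration only).
FACILITIES = [
    ("Bridge A", (5.0, 6.5, 7.5)),
    ("Pump station B", (6.0, 7.0, 8.0)),
    ("Substation C", (4.5, 6.0, 7.0)),
]

def alert_state(mmi, thresholds):
    """Return the highest alert level whose threshold the intensity reaches."""
    state = "green"
    for level, t in zip(("yellow", "orange", "red"), thresholds):
        if mmi >= t:
            state = level
    return state

def screen(shaking):
    """shaking: {facility name: interpolated MMI at the facility site}."""
    ranked = [(name, shaking[name], alert_state(shaking[name], th))
              for name, th in FACILITIES]
    # Most likely impacted first, as in ShakeCast's hierarchical list
    return sorted(ranked, key=lambda r: r[1], reverse=True)

for name, mmi, state in screen({"Bridge A": 7.8,
                                "Pump station B": 6.2,
                                "Substation C": 5.1}):
    print(f"{name}: MMI {mmi} -> {state}")
```

The real system adds fragility-based damage probabilities, notifications, and map products on top of this comparison, but the ranking idea is the same.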
Making the Handoff from Earthquake Hazard Assessments to Effective Mitigation Measures (Invited)
NASA Astrophysics Data System (ADS)
Applegate, D.
2010-12-01
This year has witnessed a barrage of large earthquakes worldwide with the resulting damages ranging from inconsequential to truly catastrophic. We cannot predict when earthquakes will strike, but we can build communities that are resilient to strong shaking as well as to secondary hazards such as landslides and liquefaction. The contrasting impacts of the magnitude-7 earthquake that struck Haiti in January and the magnitude-8.8 event that struck Chile in February underscore the difference that mitigation and preparedness can make. In both cases, millions of people were exposed to severe shaking, but deaths in Chile were measured in the hundreds rather than the hundreds of thousands that perished in Haiti. Numerous factors contributed to these disparate outcomes, but the most significant is the presence of strong building codes in Chile and their total absence in Haiti. The financial cost of the Chilean earthquake still represents an unacceptably high percentage of that nation's gross domestic product, a reminder that life safety is the paramount, but not the only, goal of disaster risk reduction measures. For building codes to be effective, both in terms of lives saved and economic cost, they need to reflect the hazard as accurately as possible. As one of four federal agencies that make up the congressionally mandated National Earthquake Hazards Reduction Program (NEHRP), the U.S. Geological Survey (USGS) develops national seismic hazard maps that form the basis for seismic provisions in model building codes through the Federal Emergency Management Agency and private-sector practitioners. This cooperation is central to NEHRP, which both fosters earthquake research and establishes pathways to translate research results into implementation measures. That translation depends on the ability of hazard-focused scientists to interact and develop mutual trust with risk-focused engineers and planners.
Strengthening that interaction is an opportunity for the next generation of earthquake scientists and engineers. In addition to the national maps, the USGS produces more detailed urban seismic hazard maps that communities have used to prioritize retrofits and design critical infrastructure that can withstand large earthquakes. At a regional scale, the USGS and its partners in California have developed a time-dependent earthquake rupture forecast that is being used by the insurance sector, which can serve to distribute risk and foster mitigation if the right incentives are in place. What the USGS and partners are doing at the urban, regional, and national scales, the Global Earthquake Model project is seeking to do for the world. A significant challenge for engaging the public to prepare for earthquakes is making low-probability, high-consequence events real enough to merit personal action. Scenarios help by starting with the hazard posed by a specific earthquake and then exploring the fragility of the built environment, cascading failures, and the real-life consequences for the public. To generate such a complete picture takes multiple disciplines working together. Earthquake scenarios are being used both for emergency management exercises and much broader public preparedness efforts like the Great California ShakeOut, which engaged nearly 7 million people.
Building Models to Better Understand the Importance of Cost versus Safety in Engineering
ERIC Educational Resources Information Center
Sumrall, William; Mott, Michael
2010-01-01
While some disasters involving engineered structures are due to events in nature (e.g., tornadoes, hurricanes, earthquakes), others may be caused by inadequate materials, design flaws, and poor maintenance. These catastrophes result in the loss of human lives and cost billions of dollars. In the set of lessons described here, students design a…
Events | Pacific Earthquake Engineering Research Center
NASA Astrophysics Data System (ADS)
Pearson, J. K.; Noriega, G.; Benthien, M. L.
2017-12-01
The Undergraduate Studies in Earthquake Information Technology (USEIT) is an REU internship program focused on multi-disciplinary, collaborative research offered through the Southern California Earthquake Center (SCEC), a research consortium focused on earthquake science. USEIT is an 8-week intensive undergraduate research program. The program is designed for interns to work as a collaborative engine to solve an overarching real-world earthquake problem referred to as the "Grand Challenge". The interns are organized in teams and paired with mentors who have expertise in their specific task in the Grand Challenge. The program is focused on earthquake system science, where students have the opportunity to use supercomputers, programming platforms, geographic information systems, and internally designed and developed visualization software. The goal of the USEIT program is to motivate undergraduates from diverse backgrounds towards careers in science and engineering through team-based research in the field of earthquake information technology. Efforts are made to recruit students with diverse backgrounds, taking into consideration gender, ethnic background, socioeconomic standing, major, college year, and institution type (2-year and 4-year colleges). USEIT has a partnership with two local community colleges to recruit underserved students. Our emphasis is to attract students who would 1) grow and develop technical skills, soft skills, and confidence from the program, and 2) provide perspective and innovation to the program. USEIT offers on-campus housing to provide an immersive learning environment, recruits diverse majors to foster interdisciplinary collaboration, maintains a full-time in-lab mentor for day-to-day intern needs, takes students on field trips to provide context to their research, and plans activities and field trips for team building and morale.
Each year metrics are collected through exit surveys, personal statements, and intern experience statements. We highlight lessons learned, including a need for pre-program engagement to ensure student success.
Geomodels of coseismic landslides environments in Central Chile.
NASA Astrophysics Data System (ADS)
Serey, A.; Sepulveda, S. A.; Murphy, W.; Petley, D. N.
2017-12-01
Landslides are a major source of fatalities and damage during strong earthquakes in mountainous areas. Detailed geomodels of coseismic landslide environments are essential parts of seismic landslide hazard analyses. The development of a site-specific geological model is required, based on consideration of the regional and local geological and geomorphological history and the current ground surface conditions. An engineering geological model is any approximation of the geological conditions, at varying scales, created for the purpose of solving an engineering problem. In our case, the objective is the development of a methodology for earthquake-induced landslide hazard assessment applicable to urban/territorial planning and disaster prevention strategies at a regional scale, adapted to Chilean tectonic conditions. We have developed the only two complete inventories of landslides triggered by earthquakes in Chile: the first from the Mw 6.2 shallow crustal Aysén earthquake of 2007, and the second from the Mw 8.8 megathrust subduction Maule earthquake of 2010. From the comparison of these two inventories with others from abroad, as well as analysis of large prehistoric landslide inventories proposed as likely induced by seismic activity, we have determined the topographic, geomorphological, geological and seismic factors controlling the occurrence of earthquake-triggered landslides. With the information collected, we have defined different environments for the generation of coseismic landslides based on the construction of geomodels. As a result we have built several geomodels of the Santiago Cordillera in central Chile (33°S), based upon the San Ramón Fault, a west-vergent reverse fault that outcrops at the edge of the Santiago basin, recently found to be active and a likely source of seismic activity in the future, with the potential of triggering landslides along the Santiago mountain front as well as inland in the Mapocho and Maipo Cordilleran valleys.
In conclusion these geomodels are a powerful tool for earthquake-induced landslide hazard assessment. As an implication we can identify landslide-prone areas, distinguish different seismic scenarios and describe related potential hazards, including burial and river damming by large rock slides and rock avalanches.
NASA Astrophysics Data System (ADS)
Silva, F.; Maechling, P. J.; Goulet, C. A.; Somerville, P.; Jordan, T. H.
2014-12-01
The Southern California Earthquake Center (SCEC) Broadband Platform is a collaborative software development project involving geoscientists, earthquake engineers, graduate students, and the SCEC Community Modeling Environment. The SCEC Broadband Platform (BBP) is open-source scientific software that can generate broadband (0-100Hz) ground motions for earthquakes, integrating complex scientific modules that implement rupture generation, low and high-frequency seismogram synthesis, non-linear site effects calculation, and visualization into a software system that supports easy on-demand computation of seismograms. The Broadband Platform operates in two primary modes: validation simulations and scenario simulations. In validation mode, the Platform runs earthquake rupture and wave propagation modeling software to calculate seismograms for a well-observed historical earthquake. Then, the BBP calculates a number of goodness of fit measurements that quantify how well the model-based broadband seismograms match the observed seismograms for a certain event. Based on these results, the Platform can be used to tune and validate different numerical modeling techniques. In scenario mode, the Broadband Platform can run simulations for hypothetical (scenario) earthquakes. In this mode, users input an earthquake description, a list of station names and locations, and a 1D velocity model for their region of interest, and the Broadband Platform software then calculates ground motions for the specified stations. Working in close collaboration with scientists and research engineers, the SCEC software development group continues to add new capabilities to the Broadband Platform and to release new versions as open-source scientific software distributions that can be compiled and run on many Linux computer systems. 
Our latest release includes 5 simulation methods, 7 simulation regions covering California, Japan, and Eastern North America, the ability to compare simulation results against GMPEs, and several new data products, such as map and distance-based goodness of fit plots. As the number and complexity of scenarios simulated using the Broadband Platform increases, we have added batching utilities to substantially improve support for running large-scale simulations on computing clusters.
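The validation mode described above reduces, at its core, to quantifying misfit between simulated and observed seismograms. A minimal sketch of one such measure, a mean natural-log residual (illustrative only; the Broadband Platform defines and implements its own goodness-of-fit metrics):

```python
import math

def log_bias(observed, simulated):
    """Mean natural-log residual ln(obs/sim) across stations.

    A positive value means the model systematically under-predicts the
    observations; near zero means little systematic bias. Inputs are
    spectral amplitudes (e.g. in g) at one period, one value per station.
    """
    residuals = [math.log(o / s) for o, s in zip(observed, simulated)]
    return sum(residuals) / len(residuals)

# Hypothetical spectral accelerations at one period for four stations:
obs = [0.12, 0.30, 0.08, 0.21]
sim = [0.10, 0.33, 0.09, 0.20]
print(f"model bias: {log_bias(obs, sim):+.3f}")  # near zero here
```

Computing such residuals period by period, and their mean and scatter across stations, is the general pattern behind the map- and distance-based goodness-of-fit plots mentioned in the release notes.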
Multidisciplinary Geo-scientific Hazard Analyses: Istanbul Microzonation Projects
NASA Astrophysics Data System (ADS)
Kara, Sema; Baş, Mahmut; Kılıç, Osman; Tarih, Ahmet; Yahya Menteşe, Emin; Duran, Kemal
2017-04-01
Istanbul (Turkey) is located on the western edge of the North Anatolian Fault and hence is an earthquake-prone city with a population that exceeds 15 million people. In addition, the city is still growing as a center of commerce, tourism and culture, which increases the exposure more and more. During the last decade, although Istanbul grew faster than ever in its history, precautions against a possible earthquake have also increased steadily. The two big earthquakes (in Kocaeli and Duzce Provinces) that occurred in 1999 alongside Istanbul became the trigger events that accelerated disaster risk reduction activities in Istanbul. Following a loss estimation study carried out by the Japan International Cooperation Agency (JICA) in 2001 and the Istanbul Earthquake Master Plan prepared by researchers from four major universities in 2003, it was concluded that understanding and analyzing the geological structure of Istanbul was the main concern. Thereafter, Istanbul Metropolitan Municipality's Directorate of Earthquake and Ground Research (DEGRE) carried out two major geo-scientific studies called "microzonation studies" covering 650 km2 of Istanbul's urbanized areas between 2006 and 2009. The studies were called "microzonation" because the analysis resolution was as dense as 250 m grids and included various assessments of hazards such as ground shaking, liquefaction, karstification, landslide, flooding, and surface faulting. After the evaluation of geological, geotechnical and geophysical measurements, earthquake and tsunami hazard maps for all of Istanbul, and slope, engineering geology, ground water level, faulting, ground shaking, inundation, shear wave velocity and soil classification maps for the project areas were obtained. In the end, "Land Suitability Maps" were derived from the combination of inputs using a multi-hazard approach. As a result, microzonation is a tool for risk-oriented urban planning, consisting of interdisciplinary multi-hazard risk analyses.
The outputs of microzonation are used in land development/use plans, hazard identification in urban transformation, and determination of the routes and characteristics of various types of engineering structures such as highways, tunnels, bridges, railroads, viaducts and ports. Hence, through the use of detailed geo-scientific analyses, the basis of earthquake-resilient urbanization is secured.
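The "combination of inputs using a multi-hazard approach" described above is, in GIS terms, a weighted overlay: each grid cell carries a normalized score per hazard layer, and a weighted sum classifies the cell. A minimal sketch with hypothetical weights, hazard layers, and class breaks (not the actual Istanbul microzonation scheme):

```python
# Hypothetical layer weights for one 250 m grid cell (illustration only).
WEIGHTS = {"shaking": 0.4, "liquefaction": 0.2, "landslide": 0.2,
           "flooding": 0.1, "karst": 0.1}

def suitability(cell):
    """Combine normalized hazard scores in [0, 1] into a suitability class."""
    score = sum(WEIGHTS[h] * cell[h] for h in WEIGHTS)
    if score < 0.3:
        return "suitable"
    if score < 0.6:
        return "suitable with precautions"
    return "detailed study required"

cell = {"shaking": 0.7, "liquefaction": 0.9, "landslide": 0.1,
        "flooding": 0.2, "karst": 0.0}
print(suitability(cell))  # weighted score 0.50 -> "suitable with precautions"
```

Running this per cell over the whole grid yields a classed raster of the kind a Land Suitability Map presents.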
billion in direct damage. The event spurred important changes to the current practice of earthquake engineering and risk mitigation worldwide, including changes to building codes for steel structures and multi
76 FR 42750 - National Science Board: Sunshine Act Meetings; Notice
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-19
...) Update NSB Information Item: Network for Earthquake Engineering Simulation (NEES) Update NSB Information... Teleconference Discussion on the Timeline, Process and Procedures for Evaluating Nominees Update on Committee...
NASA Astrophysics Data System (ADS)
So, E.
2010-12-01
Earthquake casualty loss estimation, which depends primarily on building-specific casualty rates, has long suffered from a lack of cross-disciplinary collaboration in post-earthquake data gathering. An increase in our understanding of what contributes to casualties in earthquakes involves coordinated data-gathering efforts among disciplines; these are essential for improved global casualty estimation models. It is evident from examining past casualty loss models and reviewing field data collected from recent events that generalized casualty rates cannot be applied globally for different building types, even within individual countries. For a particular structure type, regional and topographic building design effects, combined with variable material and workmanship quality, all contribute to this multi-variant outcome. In addition, social factors affect building-specific casualty rates, including social status and education levels, and human behaviors in general, in that they modify egress and survivability rates. Without considering complex physical pathways, loss models purely based on historic casualty data, or even worse, rates derived from other countries, will be of very limited value. What's more, as the world's population, housing stock, and living and cultural environments change, methods of loss modeling must accommodate these variables, especially when considering casualties. To truly take advantage of observed earthquake losses, not only do damage surveys need better coordination of international and national reconnaissance teams, but these teams must integrate different areas of expertise, including engineering, public health and medicine. Research is needed to find methods to achieve consistent and practical ways of collecting and modeling casualties in earthquakes. International collaboration will also be necessary to transfer such expertise and resources to the communities in the cities which most need it.
Coupling the theories and findings from the field surveys with experiments would also be advantageous, as it is not always possible to validate theories and models with actual earthquake data. In addition, colleagues in other disciplines will benefit from being introduced to the loss algorithms, methodologies and advances familiar to the engineering community, to help dissemination in earthquake mitigation and preparedness programs. It follows that new approaches to loss estimation must include a progressive assessment of what contributes to the final casualty value. In analyzing recent earthquakes, testing common hypotheses, talking to local and international researchers in the field, interviewing search and rescue and medical personnel, and comparing notes with colleagues who have visited other events, the author has developed a list of contributory factors to formulate fatality rates for use in earthquake loss estimation models. In this presentation, we will first look at the current state of data collection and assessment in casualty loss estimation. Then, the analyses of recent earthquake field data, which provide important insights into the contributory factors of fatalities in earthquakes, will be explored. The benefits of a multi-disciplinary approach in deriving fatality rates for masonry buildings will then be examined in detail.
GIS-based seismic shaking slope vulnerability map of Sicily (Central Mediterranean)
NASA Astrophysics Data System (ADS)
Nigro, Fabrizio; Arisco, Giuseppe; Perricone, Marcella; Renda, Pietro; Favara, Rocco
2010-05-01
Earthquakes often represent very dangerous natural events in terms of human life and economic losses, and their damage effects are amplified by the synchronous occurrence of seismically induced ground-shaking failures in wide regions around the seismogenic source. The shaking associated with big earthquakes triggers extensive landsliding, sometimes at distances of more than 100 km from the epicenter. The active tectonics and the geomorphic/morphodynamic pattern of the regions affected by earthquakes contribute to the instability tendency of slopes: earthquake-induced ground-motion loading activates inertial forces within slopes that, combined with the intrinsic pre-existing static forces, reduce slope stability towards failure. Basically, under zero shear-stress reversal conditions, a catastrophic failure will take place if the earthquake-induced shear displacement exceeds the critical level of undrained shear strength at a value equal to the gravitational shear stress. However, seismic stability analyses carried out for various infinite slopes using existing Newmark-like methods reveal that estimated permanent displacements smaller than the critical value should also be regarded as dangerous for post-earthquake slope safety, in terms of use for human activities. Earthquake-induced (often high-speed) landslides are among the most destructive phenomena related to slope failure during earthquakes; damage from earthquake-induced landslides (and other ground failures) sometimes exceeds the building/infrastructure damage directly related to ground shaking or fault rupture. For this reason, several earthquake-related slope-failure methods have been developed for the evaluation of the combined hazard represented by seismically induced landslides.
The analysis of engineering seismic risk related to slope instability processes is often achieved through the evaluation of the permanent displacement potentially induced by a seismic scenario. Such methodologies are founded on the consideration that the conditions of seismic stability and the post-seismic functionality of engineering structures are tightly related to the magnitude of the permanent deformations that an earthquake can induce. Among the existing simplified slope stability models, Newmark's model is often used to derive indications about slope instabilities due to earthquakes. In this way, we have evaluated the seismically induced landslide hazard in Sicily (Central Mediterranean) using a Newmark-like model. In order to determine the map distribution of seismic ground acceleration for an earthquake scenario, the attenuation law of Sabetta & Pugliese has been used, analyzing seismic recordings from Italy. Also, in evaluating permanent displacements, the correlation of Ambraseys & Menu has been assumed. The seismic shaking slope vulnerability map of Sicily has been produced using a GIS application, also considering the distribution of peak seismic ground acceleration (in terms of exceedance probability for a fixed time), slope acclivity, and the cohesion/angle of internal friction of outcropping rocks, allowing the zoning of slopes unstable under seismic forces.
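The Newmark-type calculation underlying such hazard maps can be sketched as a rigid sliding-block analysis: the portion of ground acceleration exceeding a critical (yield) acceleration is integrated twice to give a permanent displacement. A minimal, deliberately one-directional illustration using a synthetic pulse rather than a real accelerogram (not the authors' implementation):

```python
import math

def newmark_displacement(accel, dt, a_crit):
    """Permanent displacement (m) of a rigid block on a slope.

    The block starts sliding when ground acceleration exceeds the critical
    acceleration a_crit, keeps sliding while it has relative velocity, and
    stops when that velocity returns to zero. accel in m/s^2, dt in s.
    Simplified: sliding is assumed one-directional (downslope only).
    """
    v = 0.0  # relative sliding velocity (m/s)
    d = 0.0  # accumulated permanent displacement (m)
    for a in accel:
        if a > a_crit or v > 0.0:
            v = max(0.0, v + (a - a_crit) * dt)
        d += v * dt
    return d

dt = 0.01
# Synthetic 2-second, 1 Hz sine pulse peaking at 4 m/s^2:
accel = [4.0 * math.sin(2 * math.pi * i * dt) for i in range(200)]
print(f"{newmark_displacement(accel, dt, a_crit=1.5):.3f} m")
```

In the mapping application, a_crit comes from slope angle and shear strength of the outcropping rocks, the input motion from the scenario attenuation law, and the resulting displacement is compared against a damage threshold (as in the Ambraseys & Menu correlation) to class each cell.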
NASA Astrophysics Data System (ADS)
Bostenaru Dan, M.
2009-04-01
This special issue includes selected papers on the topic of earthquake impact from the sessions held in 2004 in Nice, France, and in 2005 in Vienna, Austria, at the first and second European Geosciences Union General Assemblies, respectively. Since the session series started in 1999 in The Hague, Netherlands, the hazard of earthquakes has been its most popular topic. The 2004 call read: Nature's forces, including earthquakes, floods, landslides, high winds and volcanic eruptions, can inflict losses on urban settlements and man-made structures such as infrastructure. In Europe, recent years have seen significant losses from earthquakes in southern and south-eastern Europe, floods in central Europe, and wind storms in western Europe. Meanwhile, significant progress has been made in understanding disasters. Several scientific fields contribute to a holistic approach in the evaluation of capacities, vulnerabilities and hazards, the main factors in mitigating urban disasters due to natural hazards. An important part of the session is devoted to the assessment of earthquake shaking and loss scenarios, including both physical damage and human casualties. Early warning and rapid damage evaluation are of utmost importance for the safety of many essential facilities, for the emergency management of events and for disaster response. When an earthquake occurs, strong-motion networks, data processing and interpretation lead to preliminary estimates (scenarios) of the geographical distribution of damage. Factual information on inflicted damage, such as that obtained from shaking maps or aerial imagery, permits a comparison with simulated damage maps in order to define a more accurate picture of the overall losses.
The most recent developments towards quantitative and qualitative simulation of natural hazard impacts on urban areas, which provide decision-making support for urban disaster management, and success stories of and lessons learned from disaster mitigation will be presented. The session includes contributions showing methodological and modelling approaches from scientists in geophysics/seismology, hydrology, remote sensing, civil engineering, insurance, and urbanism, amongst other fields, as well as presentations from practitioners working on specific case studies, regarding both the analysis of recent events and their impact on cities and the re-evaluation of past events from the point of view of long-term recovery. The 2005 call read: Most strategies for both preparedness and emergency management in disaster mitigation are related to urban planning. While the natural, engineering and social sciences contribute to the evaluation of the impact of earthquakes and their secondary events (including tsunamis, earthquake-triggered landslides, or fire), floods, landslides, high winds, and volcanic eruptions on urban areas, it is the instruments of urban planning that are to be employed both for visualisation and for the development and implementation of strategic concepts for pre- and post-disaster intervention. The evolution of natural systems towards extreme conditions is taken into consideration insofar as it concerns the damaging impact on urban areas and infrastructure, and the impact on the natural environment of interventions intended to reduce such damage.
Uncertainties in evaluation of hazard and seismic risk
NASA Astrophysics Data System (ADS)
Marmureanu, Gheorghe; Marmureanu, Alexandru; Ortanza Cioflan, Carmen; Manea, Elena-Florinela
2015-04-01
Two methods are commonly used for seismic hazard assessment: probabilistic (PSHA) and deterministic (DSHA) seismic hazard analysis. Selection of a ground motion for engineering design requires a clear understanding of seismic hazard and risk among stakeholders, seismologists and engineers. What is wrong with traditional PSHA or DSHA? PSHA as commonly used in engineering rests on four assumptions developed by Cornell in 1968: (1) a constant-in-time average occurrence rate of earthquakes; (2) a single point source; (3) variability of ground motion at a site that is independent; (4) Poisson (or "memory-less") behavior of earthquake occurrences. It is a probabilistic method, and "when the causality dies, its place is taken by probability, a prestigious term meant to define our inability to predict the course of nature" (Niels Bohr). DSHA was used for the original design of Fukushima Daiichi, but the Japanese authorities moved to probabilistic assessment methods, under which the probability of exceeding the design-basis acceleration was expected to be 10^-4 to 10^-6; it was nevertheless exceeded, a violation of the principles of deterministic hazard analysis (ignoring historical events) (Klügel, J. U., EGU, 2014, ISSO). PSHA was developed from mathematical statistics and is not based on earthquake science (invalid physical models: point source and Poisson distribution; invalid mathematics; misinterpretation of annual probability of exceedance or return period, etc.), and has become a pure numerical "creation" (Wang, PAGEOPH 168 (2011), 11-25). A key component of seismic hazard assessment, in both PSHA and DSHA, is the ground-motion attenuation relationship, the so-called ground motion prediction equation (GMPE), which relates a ground-motion parameter (e.g., PGA, MMI), earthquake magnitude M, source-to-site distance R, and an uncertainty term. So far, no one is taking into consideration the strongly nonlinear behavior of soils during strong earthquakes.
But how many cities, villages, and metropolitan areas in seismic regions are constructed on rock? Most of them are located on soil deposits. A soil is of a basic type such as sand or gravel (termed coarse soils) or silt or clay (termed fine soils). The effect of nonlinearity is very large. For example, if we maintain the same spectral amplification factor (SAF = 5.8942) as for the relatively strong earthquake of May 3, 1990 (MW = 6.4), then at the Bacǎu seismic station the predicted peak acceleration for the Vrancea earthquake of May 30, 1990 (MW = 6.9) would be a*max = 0.154g, while the actual recorded value was only amax = 0.135g (-14.16%). Likewise, for the Vrancea earthquake of August 30, 1986 (MW = 7.1), the predicted peak acceleration would be a*max = 0.107g, against a recorded value of 0.0736g (-45.57%). Similar data exist for more than 60 seismic stations. There is a strong nonlinear dependence of the SAF on earthquake magnitude at each site. The authors propose an alternative approach called "real spectral amplification factors", instead of GMPEs, for the whole extra-Carpathian area, where all cities and villages are located on soil deposits. Key words: Probabilistic Seismic Hazard; Uncertainties; Nonlinear seismology; Spectral amplification factors (SAF).
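The discrepancies quoted above can be checked arithmetically: holding the SAF constant overpredicts the recorded peak acceleration, and the overprediction grows with magnitude. A minimal check, using the acceleration values given in the abstract and computing the percentage relative to the recorded value (which reproduces the quoted figures to within rounding):

```python
def overprediction_pct(predicted_g, recorded_g):
    """Percent by which a constant-SAF prediction exceeds the recorded PGA."""
    return (predicted_g - recorded_g) / recorded_g * 100.0

# Bacau station, Vrancea earthquakes (values from the abstract)
mw69 = overprediction_pct(0.154, 0.135)   # May 30, 1990, MW 6.9
mw71 = overprediction_pct(0.107, 0.0736)  # August 30, 1986, MW 7.1
print(f"MW 6.9: +{mw69:.1f}%   MW 7.1: +{mw71:.1f}%")
```

The growth of the overprediction from about 14% to about 45% as magnitude increases from 6.9 to 7.1 is the nonlinear soil effect the authors describe.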
The Scientific Contribution by Arturo Danusso after the 1908 Messina-Reggio Calabria Earthquake
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sorrentino, Luigi
2008-07-08
After the 1908 Messina and Reggio Calabria earthquake, the Italian scientific community produced a considerable effort, partly as a consequence of two design competitions. In both contests a Piedmontese engineer, Arturo Danusso, prevailed. In this paper, emphasis is put on the building layouts and construction details he suggested, which were far ahead of their time. Moreover, the dynamic analysis of linear elastic undamped single- and two-degree-of-freedom systems, proposed by Danusso and already discussed elsewhere, is briefly mentioned.
Seismic hazard, risk, and design for South America
Petersen, Mark D.; Harmsen, Stephen; Jaiswal, Kishor; Rukstales, Kenneth S.; Luco, Nicolas; Haller, Kathleen; Mueller, Charles; Shumway, Allison
2018-01-01
We calculate seismic hazard, risk, and design criteria across South America using the latest data, models, and methods to support public officials, scientists, and engineers in earthquake risk mitigation efforts. Updated continental-scale seismic hazard models are based on a new seismicity catalog, seismicity rate models, evaluation of earthquake sizes, fault geometry and rate parameters, and ground-motion models. The resulting probabilistic seismic hazard maps show peak ground acceleration, modified Mercalli intensity, and spectral accelerations at 0.2 and 1 s periods for 2%, 10%, and 50% probabilities of exceedance in 50 years. Ground-shaking soil amplification at each site is calculated by considering the uniform soil that is applied in modern building codes or by applying site-specific factors based on VS30 shear-wave velocities determined through a simple topographic proxy technique. We use these hazard models in conjunction with the Prompt Assessment of Global Earthquakes for Response (PAGER) model to calculate economic and casualty risk. Risk is computed by incorporating the new hazard values amplified by soil, PAGER fragility/vulnerability equations, and LandScan 2012 estimates of population exposure. We also calculate building design values using the guidelines established in the building code provisions. The resulting hazard and associated risk are high along the northern and western coasts of South America, reaching damaging levels of ground shaking in Chile, western Argentina, western Bolivia, Peru, Ecuador, Colombia, Venezuela, and in localized areas distributed across the rest of the continent where historical earthquakes have occurred. Constructing buildings and other structures to account for strong shaking in these regions of high hazard and risk should mitigate losses and reduce casualties from the effects of future earthquake strong ground shaking. National models should be developed by scientists and engineers in each country using the best available science.
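The probability-of-exceedance levels quoted above map to return periods through the standard Poisson relation P = 1 − exp(−t/T); this conversion is a general property of such hazard maps, not specific to this study:

```python
import math

def return_period(p_exceed, t_years=50.0):
    """Return period T (years) such that the probability of at least one
    exceedance in t_years equals p_exceed, under a Poisson occurrence model:
    p = 1 - exp(-t / T)  =>  T = -t / ln(1 - p)."""
    return -t_years / math.log(1.0 - p_exceed)

for p in (0.02, 0.10, 0.50):
    print(f"{p:.0%} in 50 yr  ->  T = {return_period(p):.0f} yr")
# -> roughly 2475, 475, and 72 years
```

The 2%-in-50-years level, for instance, corresponds to the familiar ~2475-year return period used in many building-code provisions.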
A reliable simultaneous representation of seismic hazard and of ground shaking recurrence
NASA Astrophysics Data System (ADS)
Peresan, A.; Panza, G. F.; Magrin, A.; Vaccari, F.
2015-12-01
Different earthquake hazard maps may be appropriate for different purposes, such as emergency management, insurance and engineering design. Accounting for the lower occurrence rate of larger sporadic earthquakes may allow cost-effective policies to be formulated in some specific applications, provided that statistically sound recurrence estimates are used, which is not typically the case in PSHA (Probabilistic Seismic Hazard Assessment). We illustrate the procedure for associating the expected ground motions from Neo-deterministic Seismic Hazard Assessment (NDSHA) with an estimate of their recurrence. Neo-deterministic refers to a scenario-based approach, which allows for the construction of a broad range of earthquake scenarios via full-waveform modeling. From the synthetic seismograms, estimates of peak ground acceleration, velocity and displacement, or of any other parameter relevant to seismic engineering, can be extracted. NDSHA, in its standard form, defines the hazard computed from a wide set of scenario earthquakes (including the largest deterministically or historically defined credible earthquake, MCE) and does not supply the frequency of occurrence of the expected ground shaking. A recent enhanced variant of NDSHA that reliably accounts for recurrence has been developed and is applied here to the Italian territory. The characterization of the frequency-magnitude relation can be performed by any statistically sound method supported by the data (e.g. a multi-scale seismicity model), so that a recurrence estimate is associated with each of the pertinent sources. In this way a standard NDSHA map of ground shaking is obtained simultaneously with the map of the corresponding recurrences. The introduction of recurrence estimates into NDSHA naturally allows for the generation of ground-shaking maps at specified return periods, permitting a straightforward comparison between NDSHA and PSHA maps.
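A common statistically sound characterization of the frequency-magnitude relation mentioned above is the Gutenberg-Richter law, log10 N(M) = a − bM. The sketch below illustrates how such a relation yields a recurrence estimate per source; the a and b values are illustrative, not taken from the study:

```python
def annual_rate(m, a=4.0, b=1.0):
    """Annual rate of events with magnitude >= m under the
    Gutenberg-Richter relation log10 N = a - b*m (a, b illustrative)."""
    return 10.0 ** (a - b * m)

# recurrence interval (years) for two magnitude thresholds
for m in (6.0, 7.0):
    rate = annual_rate(m)
    print(f"M >= {m}: rate {rate:.3f}/yr, recurrence ~{1.0 / rate:.0f} yr")
```

With a = 4 and b = 1, an M ≥ 6 event recurs about every 100 years and an M ≥ 7 about every 1000 years; in practice a and b are fit to a declustered catalog for each source zone.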
NASA Astrophysics Data System (ADS)
Aloisi, Marco; Briffa, Emanuela; Cannata, Andrea; Cannavò, Flavio; Gambino, Salvatore; Maiolino, Vincenza; Maugeri, Roberto; Palano, Mimmo; Privitera, Eugenio; Scaltrito, Antonio; Spampinato, Salvatore; Ursino, Andrea; Velardita, Rosanna
2015-04-01
The seismic events caused by human engineering activities are commonly termed "triggered" or "induced". This class of earthquakes, though characterized by low-to-moderate magnitude, has significant social and economic implications, since such events occur close to the engineering activity responsible for triggering/inducing them, can be felt by the inhabitants living nearby, and may even produce damage. One of the first well-documented examples of induced seismicity was observed in 1932 in Algeria, when a shallow magnitude 3.0 earthquake occurred close to the Oued Fodda Dam. With the continuous improvement of seismic monitoring networks worldwide, numerous other examples of human-induced earthquakes have been identified. Induced earthquakes occur at shallow depths and are related to a number of human activities, such as fluid injection under high pressure (e.g. waste-water disposal in deep wells, hydrofracturing in enhanced geothermal systems and oil recovery, shale-gas fracking, natural gas and CO2 storage), hydrocarbon exploitation, groundwater extraction, deep underground mining, large water impoundments and underground nuclear tests. In Italy, induced/triggered seismicity is suspected to have contributed to the Vajont dam disaster of 1963. Despite this suspected case and the presence on Italian territory of a large number of engineering activities "capable" of inducing seismicity, no extensive research on this topic has been conducted to date. Hence, in order to improve knowledge and correctly assess the potential hazard at specific locations in the future, we have started a preliminary study of the entire range of engineering activities currently located in Sicily (Southern Italy) which may "potentially" induce seismicity.
To this end, we performed: • a preliminary census of all engineering activities located in the study area, collecting all the useful information available in on-line catalogues; • a detailed compilation of instrumental and historical seismicity, focal mechanism solutions, multidisciplinary stress indicators, the GPS-based ground deformation field, mapped faults, etc., merging data from on-line catalogues with those reported in the literature. Finally, for each individual site, we analysed: i) the long-term statistical behaviour of instrumental seismicity (magnitude of completeness, seismic release above a threshold magnitude, depth distribution, focal plane solutions); ii) the long-term statistical behaviour of historical seismicity (maximum magnitude estimation, recurrence time interval, etc.); iii) the properties and orientation of faults (length, estimated geological slip, kinematics, etc.); iv) the regional stress field (from borehole, seismological and geological observations) and strain field (from GPS-based observations).
NASA Astrophysics Data System (ADS)
Khalil, Amin E.; Abdel Hafiez, H. E.; Girgis, Milad; Taha, M. A.
2017-06-01
Strong ground shaking during earthquakes can severely affect ancient monuments and thereby destroy part of the human heritage. On October 12th, 1992, a moderate earthquake (Ms = 5.8) shook the greater Cairo area, causing widespread damage. Unfortunately, the focus of that earthquake was located about 14 km south of the Zoser pyramid. After the earthquake, the Egyptian Supreme Council of Antiquities warned that the Zoser pyramid had partially collapsed, and international and national efforts were exerted to restore this important monument of human heritage, built about 4000 years ago. Engineering and geophysical work is thus needed for the restoration process. The definition of the strong-motion parameters is one of the required studies, since a seismically active zone lies in its near vicinity. The present study adopted the stochastic method to determine the peak ground motion (acceleration, velocity and displacement) for the three largest earthquakes in Egypt's seismological history: the Shedwan earthquake (Ms = 6.9), the Aqaba earthquake (Mw = 7.2) and the Cairo (Dahshour) earthquake (Ms = 5.8). The former two major earthquakes took place a few hundred kilometers away. It is logical to expect the predominant effects to come from the epicentral location of the Cairo earthquake; however, the authors also wanted to test the long-period effects expected from the two distant earthquakes under consideration. In addition, the dynamic site response was studied using the horizontal-to-vertical spectral ratio (HVSR) technique. HVSR can successfully constrain the fundamental frequency, although its amplification estimates are not considered reliable. The results, expressed either as peak ground-motion parameters or as response spectra, indicate that the effects from the Cairo earthquake epicenter are the largest for all periods considered in the present study.
The level of strong motion, as indicated by peak ground acceleration, reaches 250 gal, which is considerably high. Finally, it is worth mentioning that the information resulting from the present work may be useful for the planned restoration of the Zoser pyramid site.
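The HVSR technique mentioned above takes the ratio of the horizontal to the vertical amplitude spectra of three-component records; the frequency of the peak approximates the site's fundamental frequency. A minimal numpy sketch on synthetic records (the signal content is invented; real applications use tapered windows and Konno-Ohmachi smoothing rather than the simple moving average used here):

```python
import numpy as np

def hvsr(ns, ew, ud, dt, smooth=16):
    """Horizontal-to-vertical spectral ratio from three-component records,
    with simple moving-average smoothing of the amplitude spectra."""
    win = np.ones(smooth) / smooth
    spec = lambda x: np.convolve(np.abs(np.fft.rfft(x)), win, mode="same")
    h = np.sqrt(spec(ns) ** 2 + spec(ew) ** 2) / np.sqrt(2.0)  # mean horizontal
    v = spec(ud)
    freqs = np.fft.rfftfreq(len(ud), dt)
    return freqs, h / v

# synthetic records: horizontals resonate at 2 Hz, vertical is broadband noise
dt, n = 0.01, 4096
t = np.arange(n) * dt
rng = np.random.default_rng(0)
ns = np.sin(2 * np.pi * 2.0 * t) + 0.1 * rng.standard_normal(n)
ew = np.cos(2 * np.pi * 2.0 * t) + 0.1 * rng.standard_normal(n)
ud = rng.standard_normal(n)
freqs, ratio = hvsr(ns, ew, ud, dt)
f0 = freqs[np.argmax(ratio)]
print(f"estimated fundamental frequency: {f0:.2f} Hz")
```

The peak of the ratio recovers the 2 Hz resonance built into the horizontal components; as the abstract notes, the amplitude of the HVSR peak is a much less reliable estimate of site amplification than its frequency.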
Documentation for the 2008 Update of the United States National Seismic Hazard Maps
Petersen, Mark D.; Frankel, Arthur D.; Harmsen, Stephen C.; Mueller, Charles S.; Haller, Kathleen M.; Wheeler, Russell L.; Wesson, Robert L.; Zeng, Yuehua; Boyd, Oliver S.; Perkins, David M.; Luco, Nicolas; Field, Edward H.; Wills, Chris J.; Rukstales, Kenneth S.
2008-01-01
The 2008 U.S. Geological Survey (USGS) National Seismic Hazard Maps display earthquake ground motions for various probability levels across the United States and are applied in seismic provisions of building codes, insurance rate structures, risk assessments, and other public policy. This update of the maps incorporates new findings on earthquake ground shaking, faults, seismicity, and geodesy. The resulting maps are derived from seismic hazard curves calculated on a grid of sites across the United States that describe the frequency of exceeding a set of ground motions. The USGS National Seismic Hazard Mapping Project developed these maps by incorporating information on potential earthquakes and associated ground shaking obtained from interaction in science and engineering workshops involving hundreds of participants, review by several science organizations and State surveys, and advice from two expert panels. The National Seismic Hazard Maps represent our assessment of the 'best available science' in earthquake hazards estimation for the United States (maps of Alaska and Hawaii as well as further information on hazard across the United States are available on our Web site at http://earthquake.usgs.gov/research/hazmaps/).
Seismic Velocity Measurements at Expanded Seismic Network Sites
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woolery, Edward W; Wang, Zhenming
2005-01-01
Structures at the Paducah Gaseous Diffusion Plant (PGDP), as well as at other locations in the northern Jackson Purchase of western Kentucky, may be subjected to large far-field earthquake ground motions from the New Madrid seismic zone, as well as to those from small and moderate-sized local events. The ground motion to which a particular structure is exposed during such an event will be a consequence of the earthquake magnitude, the structure's proximity to the event, and the dynamic and geometrical characteristics of the thick soils upon which it is, of necessity, constructed. This investigation evaluated the latter. Downhole and surface (i.e., refraction and reflection) seismic velocity data were collected at the Kentucky Seismic and Strong-Motion Network expansion sites in the vicinity of the PGDP to define the dynamic properties of the deep sediment overburden, which can modify earthquake waves. These effects are manifested as modifications of the waves' amplitude, frequency, and duration. Each of these three ground-motion characteristics is also fundamental to the assessment of secondary earthquake engineering hazards such as liquefaction.
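Downhole velocity profiles such as those collected here are commonly condensed into the time-averaged shear-wave velocity of the top 30 m, Vs30 = 30 / Σ(hᵢ/vᵢ), the parameter used by building-code site-class schemes. A small sketch with an invented layer profile (not site-specific data from this study):

```python
def vs30(layers):
    """Time-averaged shear-wave velocity over the top 30 m.
    layers: list of (thickness_m, vs_m_per_s) tuples, top-down;
    the profile must extend to at least 30 m depth."""
    depth, travel_time = 0.0, 0.0
    for h, vs in layers:
        use = min(h, 30.0 - depth)   # clip the bottom layer at 30 m
        travel_time += use / vs
        depth += use
        if depth >= 30.0:
            break
    return 30.0 / travel_time

# hypothetical soft-soil profile: 5 m at 180 m/s, 10 m at 250 m/s, then 400 m/s
profile = [(5.0, 180.0), (10.0, 250.0), (25.0, 400.0)]
print(f"Vs30 = {vs30(profile):.0f} m/s")
```

Because Vs30 is a travel-time average, the slow shallow layers dominate: the result here (~285 m/s) sits well below the thickness-weighted mean of the layer velocities.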
Earthquake Damage Assessment Using Very High Resolution Satelliteimagery
NASA Astrophysics Data System (ADS)
Chiroiu, L.; André, G.; Bahoken, F.; Guillande, R.
Various studies using satellite imagery have been carried out in recent years to assess damage from natural hazards, most of them analyzing floods, hurricanes or landslides. For earthquakes, the medium- or low-resolution data available until recently did not allow reliable identification of damage, because the elements at risk (e.g. buildings or other structures) are too small compared with the pixel size. Recent progress in remote sensing, in terms of spatial resolution and data processing, makes reliable damage detection for the elements at risk possible. Remote sensing techniques applied to IKONOS (1 m resolution) and IRS (5 m resolution) imagery were used to evaluate seismic vulnerability and post-earthquake damage. A fast estimation of losses was performed using a multidisciplinary approach based on earthquake engineering and geospatial analysis. The results, integrated into a GIS database, could be transferred via satellite networks to the rescue teams deployed in the affected zone, in order to better coordinate the emergency operations. The methodology was applied to the cities of Bhuj and Anjar after the 2001 Gujarat (India) earthquake.
ShakeMap manual: technical manual, user's guide, and software guide
Wald, David J.; Worden, Bruce C.; Quitoriano, Vincent; Pankow, Kris L.
2005-01-01
ShakeMap (http://earthquake.usgs.gov/shakemap) --rapidly, automatically generated shaking and intensity maps--combines instrumental measurements of shaking with information about local geology and earthquake location and magnitude to estimate shaking variations throughout a geographic area. The results are rapidly available via the Web through a variety of map formats, including Geographic Information System (GIS) coverages. These maps have become a valuable tool for emergency response, public information, loss estimation, earthquake planning, and post-earthquake engineering and scientific analyses. With the adoption of ShakeMap as a standard tool for a wide array of users and uses came an impressive demand for up-to-date technical documentation and more general guidelines for users and software developers. This manual is meant to address this need. ShakeMap, and associated Web and data products, are rapidly evolving as new advances in communications, earthquake science, and user needs drive improvements. As such, this documentation is organic in nature. We will make every effort to keep it current, but undoubtedly necessary changes in operational systems take precedence over producing and making documentation publishable.
NASA Astrophysics Data System (ADS)
Loyd, R.; Walter, S.; Fenton, J.; Tubbesing, S.; Greene, M.
2008-12-01
In the rush to remove debris after a damaging earthquake, perishable data related to a wide range of impacts on the physical, built and social environments can be lost. The California Post-Earthquake Information Clearinghouse is intended to prevent this data loss by supporting the earth scientists, engineers, and social and policy researchers who will conduct fieldwork in the affected areas in the hours and days following the earthquake to study these effects. First called for by Governor Ronald Reagan following the destructive M6.5 San Fernando earthquake in 1971, the concept of the Clearinghouse has since been incorporated into the response plans of the National Earthquake Hazard Reduction Program (USGS Circular 1242). This presentation is intended to acquaint scientists with the purpose, functions, and services of the Clearinghouse. Typically, the Clearinghouse is set up in the vicinity of the earthquake within 24 hours of the mainshock and is maintained for several days to several weeks. It provides a location where field researchers can assemble to share and discuss their observations, plan and coordinate subsequent field work, and communicate significant findings directly to the emergency responders and to the public through press conferences. As the immediate response effort winds down, the Clearinghouse will ensure that collected data are archived and made available through "lessons learned" reports and publications that follow significant earthquakes. Participants in the quarterly meetings of the Clearinghouse include representatives from state and federal agencies, universities, NGOs and other private groups. Overall management of the Clearinghouse is delegated to the agencies represented by the authors above.
POST Earthquake Debris Management — AN Overview
NASA Astrophysics Data System (ADS)
Sarkar, Raju
Every year natural disasters, such as fires, floods, earthquakes, hurricanes, landslides, tsunamis, and tornadoes, challenge various communities of the world. Earthquakes strike with varying degrees of severity and pose both short- and long-term challenges to public service providers. Earthquakes generate shock waves and displace the ground along fault lines. These seismic forces can bring down buildings and bridges in a localized area and damage buildings and other structures over a far wider area. Secondary damage from fires, explosions, and localized flooding from broken water pipes can increase the amount of debris. Earthquake debris includes building materials, personal property, and sediment from landslides. The management of this debris, as well as of the waste generated during reconstruction, can place significant demands on national and local capacities. Debris removal is a major component of every post-earthquake recovery operation. Much of the debris generated by an earthquake is not hazardous: soil, building material, and green waste, such as trees and shrubs, make up most of its volume. These wastes not only create significant health problems and a very unpleasant living environment if not disposed of safely and appropriately, but can also impose economic burdens on the reconstruction phase. In practice, most of the debris may be disposed of at landfill sites, reused as construction material, or recycled into useful commodities. Therefore, the debris clearance operation should adopt a geotechnical engineering approach, as an important post-earthquake issue, to control the quality of the incoming flow of potential soil materials.
In this paper, the importance of an emergency management perspective in this geotechnical approach that takes into account the different criteria related to the operation execution is proposed by highlighting the key issues concerning the handling of the construction and demolition debris following an earthquake.
NASA Astrophysics Data System (ADS)
Kasahara, K.; Nakagawa, S.; Sakai, S.; Nanjo, K.; Panayotopoulos, Y.; Morita, Y.; Tsuruoka, H.; Kurashimo, E.; Obara, K.; Hirata, N.; Aketagawa, T.; Kimura, H.
2011-12-01
In April 2007, we launched the special project for earthquake disaster mitigation in the Tokyo metropolitan area (fiscal 2007-2011). As part of this project, construction of the MeSO-net (Metropolitan Seismic Observation network) has been completed, with about 300 stations deployed, mainly at elementary and junior-high schools, at a spacing of about 5 km. The result is a highly dense network covering the metropolitan area. To achieve stable seismic observation with lower ground noise than a surface installation, the sensors of all stations were installed in boreholes at a depth of about 20 m. The sensors have a wide dynamic range (135 dB) and a wide frequency band (DC to 80 Hz). Data are digitized at a 200 Hz sampling rate and telemetered to the Earthquake Research Institute, University of Tokyo. The MeSO-net, which can detect and locate most earthquakes with magnitudes above 2.5, provides a unique baseline for scientific and engineering research on the Tokyo metropolitan area. One of its main contributions is to greatly improve the image of the Philippine Sea plate (PSP) (Nakagawa et al., 2010) and to provide an accurate estimate of the plate boundaries between the PSP and the Pacific plate, allowing a clearer understanding of the relation between PSP deformation and the generation of M7+ intra-slab earthquakes. The latest version of the plate model for the metropolitan area, proposed by our project, also attracts various researchers, who compare it with highly accurate fault mechanism solutions, repeating earthquakes, etc. Moreover, long-period ground motions generated by the 2011 off the Pacific coast of Tohoku earthquake (Mw 9.0) were observed by the MeSO-net and analyzed to obtain an array back-projection image of this event (Honda et al., 2011).
As a result, the overall pattern of the imaged asperities coincides well with the slip distribution determined by other waveform inversion methods. The data contribute to solving the so-called "long-period ground motion hazard problem", an engineering problem concerning urban earthquake disaster prevention (Koketsu et al., 2008). Moreover, we could determine the detailed distributions of the dominant periods of H/V spectral ratios and of the ground responses excited in the metropolitan area (Tsuno et al., 2011). The overall results obtained under our project will contribute directly to the next assessment of seismic hazard in the Tokyo metropolitan area.
NASA Astrophysics Data System (ADS)
Craifaleanu, Iolanda-Gabriela; Georgescu, Emil-Sever; Dragomir, Claudiu-Sorin
2016-10-01
Almost four decades after the catastrophic MG-R = 7.2 (Mw = 7.4) earthquake of March 4, 1977 hit Romania, the population fears a new strong earthquake; however, awareness of preparedness and mitigation measures is rather low. As the last Mw > 6 event occurred in 1990, an increasing share of the young population has not yet witnessed a strong earthquake and has a rather fuzzy picture of urban and geological earthquake effects. After each strong seismic event in the past, owing to its specific responsibilities, the National Institute for Building Research, INCERC, collected a considerable amount of information about earthquake effects on the built environment and lifelines, geological effects, etc. To this, the institute's specialists added information from various documentary sources about damage caused by historic earthquakes. Stored today in the archives of the National Institute for Research and Development in Construction, Urban Planning and Sustainable Spatial Development, “URBAN-INCERC”, INCERC Bucharest Branch, this information is invaluable for evaluating the present and future seismic risk of the country. It could also serve as an essential educational resource for university students and young professionals in civil engineering, seismology, geology, economics, sociology, history, etc., and for raising public awareness of seismic risk mitigation measures. The paper presents new approaches for the dissemination and re-evaluation of the March 4, 1977 earthquake data from the perspective of present scientific knowledge.
Directivity in NGA earthquake ground motions: Analysis using isochrone theory
Spudich, P.; Chiou, B.S.J.
2008-01-01
We present correction factors that may be applied to the ground motion prediction relations of Abrahamson and Silva, Boore and Atkinson, Campbell and Bozorgnia, and Chiou and Youngs (all in this volume) to model the azimuthally varying distribution of the GMRotI50 component of ground motion (commonly called 'directivity') around earthquakes. Our correction factors may be used for planar or nonplanar faults having any dip or slip rake (faulting mechanism). Our correction factors predict directivity-induced variations of spectral acceleration that are roughly half of the strike-slip variations predicted by Somerville et al. (1997), and use of our factors reduces record-to-record sigma by about 2-20% at periods of 5 s or greater. © 2008, Earthquake Engineering Research Institute.
Geological Investigation and analysis in response to Earthquake Induced Landslide in West Sumatra
NASA Astrophysics Data System (ADS)
Karnawati, D.; Wilopo, W.; Salahudin, S.; Sudarno, I.; Burton, P.
2009-12-01
Substantial socio-economic losses occurred in the September 30, 2009 West Sumatra earthquake, which had a magnitude of 7.6. Damage to houses and engineered structures occurred mostly in the lowland alluvium sediments due to ground amplification, whilst in the highland mountain slopes several villages were buried by massive debris of rocks and soils. It was recorded that 1,115 people died in this disaster. A series of geological investigations was carried out by the Geological Engineering Department of Gadjah Mada University to support the rehabilitation program. This preliminary investigation identified that most of the damage to houses and engineered structures on the alluvial deposits was due mainly to the poor quality of those structures, which poorly resist ground amplification, rather than to the control of geological conditions. On the other hand, the existence and distribution of geological structures (faults and joints) in the mountainous regions significantly control the distribution of landslides, of the rock fall, debris flow and debris fall types. Despite the landslide susceptibility mapping conducted by the Geological Survey of Indonesia, more detailed investigation is required in the region surrounding Maninjau Lake in order to provide safer places for village relocation. Accordingly, Gadjah Mada University, in collaboration with the local university (Andalas University), the local government of Agam Regency and the Geological Survey of Indonesia, undertook a mission to conduct more detailed geological and landslide investigations. It is also crucial that surveys and mapping of the social perceptions and expectations of local people living in this landslide-susceptible area be carried out, to support mitigation of any future earthquake-induced landslides.
Seismic Barrier Protection of Critical Infrastructure
2017-05-14
where collapsing buildings claim by far most lives. Moreover, in recent events, industry activity of oil extraction and wastewater reinjection are...engineering building structural designs and materials have evolved over many years to minimize the destructive effects of seismic surface waves. However...Rayleigh, Love, shear). To protect against them, a large body of earthquake engineering has been developed, and effective building practices are
Geoengineering and seismological aspects of the Niigata-Ken Chuetsu-Oki earthquake of 16 July 2007
Kayen, R.; Brandenberg, S.J.; CoIlins, B.D.; Dickenson, S.; Ashford, S.; Kawamata, Y.; Tanaka, Y.; Koumoto, H.; Abrahamson, N.; Cluff, L.; Tokimatsu, K.
2009-01-01
The M6.6 Niigata-Ken Chuetsu-Oki earthquake of 16 July 2007 occurred off the west coast of Japan with a focal depth of 10 km, immediately west of Kashiwazaki City and Kariwa Village in southern Niigata Prefecture. Peak horizontal ground accelerations of 0.68 g were measured in Kashiwazaki City, as well as at the reactor floor level of the world's largest nuclear reactor, located on the coast at Kariwa Village. Liquefaction of historic and modern river deposits, aeolian dune sand, and manmade fill was widespread in the coastal region nearest the epicenter and caused ground deformations that damaged bridges, embankments, roadways, buildings, ports, railways and utilities. Landslides along the coast of southern Niigata Prefecture and in mountainous regions inland of Kashiwazaki were also widespread, affecting transportation infrastructure. Liquefaction and a landslide also damaged the nuclear power plant sites. This paper, along with a companion digital map database available at http://walrus.wr.usgs.gov/infobank/n/nii07jp/html/n-ii-07-jp.sites.kmz, describes the seismological and geo-engineering aspects of the event. © 2009, Earthquake Engineering Research Institute.
Trends and opportunities in seismology [Asilomar, California, January 3-9, 1976]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1977-01-01
Thirty-five experts in the fields of geology, geophysics, and engineering, from academia, government, and industry, were invited to participate in a workshop and address the many problems of national and global concern that require seismological expertise for their solutions. This report reviews the history, accomplishments, and status of seismology; assesses changing trends in seismological research and applications; and recommends future directions in the light of these changes and of the growing needs of society in areas in which seismology can make significant contributions. The first part of the volume discusses areas of opportunity (understanding earthquakes and reducing their hazards; exploration, energy, and resources; understanding the earth and planets) and realizing the benefits (the roles of Federal, state, and local governments, industry, and universities). The second part, Background and Progress, briefly considers each of the following topics: the birth and early growth of seismology, nuclear test monitoring and its scientific ramifications, instrumentation and data processing, geodynamics and plate tectonics, theoretical seismology, structure and composition of the earth, exploration seismology, seismic exploration for minerals, earthquake source mechanism studies, engineering seismology, strong ground motion and related earthquake hazards, volcanoes, tsunamis, planetary seismology, and international aspects of seismology. 26 figures. (RWR)
Earthquake Intensity and Strong Motion Analysis Within SEISCOMP3
NASA Astrophysics Data System (ADS)
Becker, J.; Weber, B.; Ghasemi, H.; Cummins, P. R.; Murjaya, J.; Rudyanto, A.; Rößler, D.
2017-12-01
Measuring and predicting ground-motion parameters, including seismic intensities, for earthquakes is crucial and the subject of recent research in engineering seismology. gempa has developed the new SIGMA module for Seismic Intensity and Ground Motion Analysis. The module is based on the SeisComP3 framework, extending it in the field of seismic hazard assessment and engineering seismology. SIGMA may work with or independently of SeisComP3 by supporting FDSN Web services for importing earthquake or station information and waveforms. It provides a user-friendly and modern graphical interface for semi-automatic and interactive strong-motion data processing. SIGMA provides intensity and (P)SA maps based on GMPEs or recorded data. It calculates the most common strong-motion parameters, e.g. PGA/PGV/PGD, Arias intensity and duration, Tp, Tm, CAV, SED, and Fourier, power and response spectra. GMPEs are configurable. Supporting C++ and Python plug-ins, standard and customized GMPEs, including the OpenQuake Hazard Library, can be easily integrated and compared. Originally tailored to specifications by Geoscience Australia and BMKG (Indonesia), SIGMA has become a popular tool among SeisComP3 users concerned with seismic hazard and strong-motion seismology.
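As a rough illustration of two of the strong-motion parameters SIGMA reports, the sketch below computes PGA and Arias intensity from an acceleration record. These are the generic textbook definitions, not SIGMA's actual implementation:

```python
import numpy as np

def pga(acc):
    """Peak ground acceleration: largest absolute value of the record."""
    return np.max(np.abs(acc))

def arias_intensity(acc, dt, g=9.81):
    """Arias intensity Ia = pi/(2g) * integral a(t)^2 dt, in m/s
    for acc in m/s^2 and dt in seconds (rectangle-rule integration)."""
    return np.pi / (2.0 * g) * np.sum(acc ** 2) * dt
```

For a constant 1 m/s² record lasting 10 s, `arias_intensity` returns π/(2·9.81)·10 ≈ 1.60 m/s, which is a quick sanity check on the units.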
Boore, D.M.; Atkinson, G.M.
2008-01-01
This paper contains ground-motion prediction equations (GMPEs) for average horizontal-component ground motions as a function of earthquake magnitude, distance from source to site, local average shear-wave velocity, and fault type. Our equations are for peak ground acceleration (PGA), peak ground velocity (PGV), and 5%-damped pseudo-absolute-acceleration spectra (PSA) at periods between 0.01 s and 10 s. They were derived by empirical regression of an extensive strong-motion database compiled by the 'PEER NGA' (Pacific Earthquake Engineering Research Center's Next Generation Attenuation) project. For periods less than 1 s, the analysis used 1,574 records from 58 mainshocks in the distance range from 0 km to 400 km (the number of available data decreased as period increased). The primary predictor variables are moment magnitude (M), closest horizontal distance to the surface projection of the fault plane (RJB), and the time-averaged shear-wave velocity from the surface to 30 m (VS30). The equations are applicable for M=5-8, RJB<200 km, and VS30=180-1300 m/s. © 2008, Earthquake Engineering Research Institute.
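The general shape of such a GMPE can be sketched schematically. The toy model below uses the same predictor variables (M, RJB, VS30) but entirely hypothetical coefficients; it only illustrates the structure of the regression (magnitude term, distance term, site term), not the published Boore-Atkinson equations:

```python
import math

# Schematic GMPE with hypothetical coefficients -- NOT the published values.
def ln_pga_sketch(M, Rjb, Vs30, c0=-0.5, c1=0.8, c2=-1.1, c3=0.4,
                  h=6.0, Vref=760.0):
    r = math.sqrt(Rjb ** 2 + h ** 2)       # distance with fictitious depth h (km)
    f_mag = c0 + c1 * (M - 6.0)            # magnitude scaling
    f_dist = c2 * math.log(r)              # geometric spreading with distance
    f_site = c3 * math.log(Vref / Vs30)    # linear site term: softer soil amplifies
    return f_mag + f_dist + f_site         # ln of median PGA (arbitrary units)
```

The qualitative behavior is the point: predicted motion grows with magnitude, decays with distance, and increases as VS30 drops below the reference velocity.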
NASA Astrophysics Data System (ADS)
Mourhatch, Ramses
This thesis examines the collapse risk of tall steel braced-frame buildings using rupture-to-rafters simulations of a suite of San Andreas earthquakes. Two key advancements in this work are the development of (i) a rational methodology for assigning scenario earthquake probabilities and (ii) an artificial correction-free approach to broadband ground motion simulation. The work can be divided into the following sections: earthquake source modeling, earthquake probability calculations, ground motion simulations, building response, and performance analysis. As a first step, kinematic source inversions of past earthquakes in the magnitude range 6-8 are used to simulate 60 scenario earthquakes on the San Andreas fault. For each scenario earthquake a 30-year occurrence probability is calculated, and we present a rational method to redistribute the forecast earthquake probabilities from UCERF to the simulated scenario earthquakes, illustrating the inner workings of the method through an example involving earthquakes on the San Andreas fault in southern California. Next, three-component broadband ground motion histories are computed at 636 sites in the greater Los Angeles metropolitan area by superposing short-period (0.2 s-2.0 s) empirical Green's function synthetics on long-period (> 2.0 s) synthetics computed from kinematic source models using the spectral element method, producing broadband seismograms. Using the ground motions at the 636 sites for the 60 scenario earthquakes, 3-D nonlinear analyses of several variants of an 18-story steel braced-frame building, designed for three soil types using the 1994 and 1997 Uniform Building Code provisions, are conducted. Model performance is classified into one of five performance levels: Immediate Occupancy, Life Safety, Collapse Prevention, Red-Tagged, and Model Collapse.
The results are combined with the 30-year probability of occurrence of the San Andreas scenario earthquakes using the PEER performance based earthquake engineering framework to determine the probability of exceedance of these limit states over the next 30 years.
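The hybrid broadband combination step described above (empirical Green's function synthetics above the ~2 s crossover period, spectral element synthetics below it) can be sketched as a frequency-domain crossover. The filter shape and the 0.5 Hz crossover frequency here are illustrative choices, not the thesis's actual matching scheme:

```python
import numpy as np

def hybrid_broadband(lf_syn, hf_syn, dt, fc=0.5, order=4):
    """Merge a long-period synthetic (lf_syn) and a short-period synthetic
    (hf_syn) with complementary spectral weights around fc (Hz)."""
    n = len(lf_syn)
    freqs = np.fft.rfftfreq(n, dt)
    w = 1.0 / (1.0 + (freqs / fc) ** (2 * order))   # smooth low-pass weight
    spec = w * np.fft.rfft(lf_syn) + (1.0 - w) * np.fft.rfft(hf_syn)
    return np.fft.irfft(spec, n)
```

Because the two weights sum to one at every frequency, feeding the same record into both inputs returns that record unchanged, which is a convenient consistency check on the crossover.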
THE GREAT SOUTHERN CALIFORNIA SHAKEOUT: Earthquake Science for 22 Million People
NASA Astrophysics Data System (ADS)
Jones, L.; Cox, D.; Perry, S.; Hudnut, K.; Benthien, M.; Bwarie, J.; Vinci, M.; Buchanan, M.; Long, K.; Sinha, S.; Collins, L.
2008-12-01
Earthquake science is being communicated to and used by the 22 million residents of southern California to improve resiliency to future earthquakes through the Great Southern California ShakeOut. The ShakeOut began when the USGS partnered with the California Geological Survey, the Southern California Earthquake Center and many other organizations to bring 300 scientists and engineers together to formulate a comprehensive description of a plausible major earthquake, released in May 2008 as the ShakeOut Scenario, a description of the impacts and consequences of a M7.8 earthquake on the Southern San Andreas Fault (USGS OFR2008-1150). The Great Southern California ShakeOut was a week of special events featuring the largest earthquake drill in United States history. The ShakeOut drill occurred in houses, businesses, and public spaces throughout southern California at 10 AM on November 13, 2008, when southern Californians were asked to pretend that the M7.8 scenario earthquake had occurred and to practice actions that could reduce its impact on their lives. Residents, organizations, schools and businesses registered to participate in the drill through www.shakeout.org, where they could get accessible information about the scenario earthquake and share ideas for better preparation. As of September 8, 2008, over 2.7 million confirmed participants had been registered. The primary message of the ShakeOut is that what we do now, before a big earthquake, will determine what our lives will be like after. The goal of the ShakeOut has been to change the culture of earthquake preparedness in southern California, making earthquakes a regularly discussed reality. This implements the sociological finding that 'milling,' discussing a problem with loved ones, is a prerequisite to taking action. ShakeOut milling is taking place at all levels, from individuals and families to corporations and governments.
Actions taken as a result of the ShakeOut include the adoption of earthquake response technologies by Los Angeles Unified School District and a top to bottom examination of Los Angeles County Fire Department's earthquake response strategies.
Loss modeling for pricing catastrophic bonds.
DOT National Transportation Integrated Search
2008-12-01
In this research, a loss estimation framework is presented that directly relates seismic hazard to seismic response to damage and hence to losses. A Performance-Based Earthquake Engineering (PBEE) approach towards assessing the seismic vulnerabili...
Cramer, C.H.; Kumar, A.
2003-01-01
Engineering seismoscope data collected at distances less than 300 km for the M 7.7 Bhuj, India, mainshock are compatible with ground-motion attenuation in eastern North America (ENA). The mainshock ground-motion data have been corrected to a common geological site condition using the factors of Joyner and Boore (2000) and a classification scheme of Quaternary or Tertiary sediments or rock. We then compare these data to ENA ground-motion attenuation relations. Despite uncertainties in recording method, geological site corrections, common tectonic setting, and the amount of regional seismic attenuation, the corrected Bhuj dataset agrees with the collective predictions by ENA ground-motion attenuation relations within a factor of 2. This level of agreement is within the dataset uncertainties and the normal variance for recorded earthquake ground motions.
Performance of tensor decomposition-based modal identification under nonstationary vibration
NASA Astrophysics Data System (ADS)
Friesen, P.; Sadhu, A.
2017-03-01
Health monitoring of civil engineering structures is of paramount importance when they are subjected to natural hazards or extreme climatic events such as earthquakes, strong wind gusts or man-made excitations. Most traditional modal identification methods rely on a stationarity assumption for the vibration response and pose difficulties when analyzing nonstationary vibration (e.g. earthquake- or human-induced vibration). Recently, tensor decomposition-based methods have emerged as powerful yet generic blind (i.e. not requiring knowledge of the input characteristics) signal decomposition tools for structural modal identification. In this paper, a tensor decomposition-based system identification method is further explored to estimate modal parameters using nonstationary vibration generated by either earthquake or pedestrian-induced excitation of a structure. The effects of lag parameters and sensor densities on the tensor decomposition are studied with respect to the extent of nonstationarity of the responses, characterized by the stationary duration and peak ground acceleration of the earthquake. A suite of more than 1,400 earthquakes is used to investigate the performance of the proposed method under a wide variety of ground motions, utilizing both complete and partial measurements of a high-rise building model. Apart from earthquakes, human-induced nonstationary vibration of a real-life pedestrian bridge is also used to verify the accuracy of the proposed method.
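A much-simplified, matrix-based illustration of the underlying idea: stack time-lagged covariance matrices of the multi-sensor response and extract the dominant mode shape from the unfolded stack. The paper's method uses a full tensor (CP/PARAFAC) decomposition; a plain SVD of the unfolding is substituted here only to keep the sketch short:

```python
import numpy as np

def dominant_mode_shape(X, lags):
    """X: (n_sensors, n_samples) response matrix. Builds lagged covariance
    matrices R(k), unfolds their stack along the sensor dimension, and
    returns the leading left singular vector as a mode-shape estimate."""
    n = X.shape[1]
    R = [X[:, :n - k] @ X[:, k:n].T / (n - k) for k in lags]
    unfolded = np.hstack(R)                     # n_sensors x (n_sensors*len(lags))
    U, _, _ = np.linalg.svd(unfolded, full_matrices=False)
    return U[:, 0]
```

On a synthetic two-mode response with one dominant mode, the leading singular vector lines up closely with the dominant mode shape (up to sign), which is the property the full tensor method exploits more systematically.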
Bolton, Patricia A.
1993-01-01
Major earthquakes provide seismologists and engineers an opportunity to examine the performance of the Earth and the man-made structures in response to the forces of the quake. So, too, do they provide social scientists an opportunity to delve into human responses evoked by the ground shaking and its physical consequences. The findings from such research can serve to guide the development and application of programs and practices designed to reduce death, injury, property losses, and social disruption in subsequent earthquakes. This chapter contains findings from studies focused mainly on public response to the Loma Prieta earthquake; that is, on the behavior and perceptions of the general population rather than on the activities of specific organizations or on the impact on procedures or policies. A major feature of several of these studies is that the information was collected from the population throughout the Bay area, not just from persons in the most badly damaged communities or who had suffered the greatest losses. This wide range serves to provide comparisons of behavior for those most directly affected by the earthquake with others who were less directly affected by it but still had to consider it very “close to home.”
The Role of Science and Engineering in Rebuilding a More Resilient Haiti (Invited)
NASA Astrophysics Data System (ADS)
Applegate, D.
2010-12-01
Rebuilding a more disaster-resilient Haiti is the defining challenge in the wake of the devastating magnitude-7 earthquake that struck in January. The contrasting experience of Chile, which weathered a magnitude-8.8 earthquake in April with casualties in the hundreds, teaches us that building resilience is an achievable and desirable goal given suitable investments and governance. Scientists and engineers have much to contribute, but doing so requires effective mechanisms to enable them to inform the rebuilding process. The international donor community has been a key point of engagement since their funds provide the opportunity to build new schools, hospitals, critical infrastructure and housing that will not fail in the next disaster. In advance of a gathering of international donors at the end of March, the U.S. National Science and Technology Council’s interagency Subcommittee on Disaster Reduction convened a workshop that brought together over 100 scientists, engineers, planners, and policymakers, including a delegation of Haitian government officials and academics. Hosted by the University of Miami and organized by the Incorporated Research Institutions for Seismology, the workshop was co-sponsored by the U.S. Department of State, U.S. Agency for International Development (USAID), and United Nations International Strategy for Disaster Reduction with support from NASA, the National Science Foundation, and the U.S. Geological Survey (USGS). Key findings from the workshop covered the need to adopt and enforce international building codes, to use hazard assessments for earthquakes, inland flooding, and landslides in the planning process, and the central importance of long-term capacity building. 
As an example of one science agency’s contributions, the USGS informed the initial response by rapidly characterizing the earthquake and delivering estimates of population exposure to strong shaking that were used by humanitarian organizations, aid agencies, and the Haitians themselves. In the ensuing weeks, the USGS tracked aftershocks and issued statements with probabilities of future earthquakes. Early on, the U.S. Southern Command made it possible to put an advance team of engineers and a USGS seismologist on the ground in Haiti. That initial team was followed by the first major deployment of a USGS/USAID Earthquake Disaster Assistance Team, which evolved from the long-standing partnership between these two agencies. EDAT activities included field assessment of faulting, coastal uplift, and landslides; seismometer deployments for aftershock recording and characterization of ground shaking amplification; and development of a probabilistic seismic hazard map for Haiti and the whole island of Hispaniola. The team’s efforts benefited greatly from collaboration with Haitian colleagues with knowledge transfer occurring in both directions. The effort also benefited from significant remote sensing acquisitions, which helped to target field activities and constrain fault rupture patterns. Although the products have been put to use in Haiti, it still remains to turn hazard assessments into tools that can be used for effective planning, building code development and land-use decisions.
Proceedings of Conference XIII, evaluation of regional seismic hazards and risk
Charonnat, Barbara B.
1981-01-01
The participants in the conference concluded that a great deal of useful research has been performed in the national Earthquake Hazards Reduction Program by USGS and non-USGS scientists and engineers, and that the state of knowledge concerning the evaluation of seismic hazards and risk has been advanced substantially. Many of the technical issues raised during the conference are less controversial now because of new information and insights gained during the first three years of the expanded research program conducted under the Earthquake Hazards Reduction Act. Utilization of research results by many groups of users has also improved during this period, and further improvement appears likely. Additional research is still required to resolve more completely the many complex technical issues summarized above and described in the papers contained in the proceedings. Improved certainty of research results on the evaluation of regional seismic hazards and risk is required before full use can be made of them by state and local governments, who deal with people frequently having a different perception of the hazard and its risk to them than that perceived by scientists or engineers. Each of the papers contained in the proceedings contains thoughtful recommendations for improving the state of knowledge. Two papers in particular focused on this theme. The first was presented by Lynn Sykes in the Geologic Keynote Address. He identified geographic areas throughout the world which may be considered counterparts or analogues of seismic zones in the United States. He concluded that much can be learned about prediction, tectonic settings, earthquake hazards, and earthquake risk for sites in the United States by studying their tectonic analogues in other countries. The second paper was presented by John Blume in the Engineering Keynote Address.
He suggested 20 specific research topics that, in his opinion, will significantly advance the state of the art in earthquake-resistant design. The papers by Sykes and Blume are presented at the front of the proceedings.
Dealing with Natural Disasters: Preparedness versus Post-Event Response
NASA Astrophysics Data System (ADS)
Sitar, N.
2015-12-01
Management or mitigation of natural disasters comprises two distinct elements: disaster preparedness and disaster response. Fundamentally, disasters fall into two categories: 1) those whose timing can be predicted and evaluated in advance, such as hurricanes, floods, tsunamis, or even sea level rise; and 2) those that can be anticipated based on analysis but whose exact timing is unknown, such as earthquakes and landslides. Consequently, the type of response and the options available for scientific and engineering consultation are fundamentally different. The common aspect of all natural disasters is that there is evidence of past events, either historical or geologic, or both. Thus, given past evidence, scientists and engineers have an opportunity to recommend and guide the development and implementation of long-term or permanent mitigation measures, such as improving the resiliency of the infrastructure and emergency preparedness. However, the appropriate mitigation measures are very much a function of the type of event. Severe atmospheric events, such as hurricanes, typically can be predicted several days in advance, and scientists and engineers have a role in guiding the preparation of specific additional, temporary mitigation measures and selective evacuation, as appropriate. In contrast, while the earthquake potential of a given region may be well recognized, the actual timing of the event is unknown, and consequently the primary defense is in developing sufficiently resilient infrastructure, which can be enhanced with early warning systems. Similarly, the type of damage caused by flooding, e.g. hurricane and tsunami, differs significantly from that caused by an earthquake: flood damage is pervasive, affecting large contiguous areas and wiping out all infrastructure, whereas earthquake or landslide damage tends to be clustered, with many elements of the infrastructure remaining fully or somewhat operable.
This distinction is very important when it comes to the type of technical guidance that is needed following such events. This presentation highlights lessons learned from post-event reconnaissance as a part of the NSF-funded Geotechnical Extreme Event Reconnaissance (GEER) over the last two decades.
Chapter A. The Loma Prieta, California, Earthquake of October 17, 1989 - Lifelines
Schiff, Anshel J.
1998-01-01
To the general public who had their televisions tuned to watch the World Series, the 1989 Loma Prieta earthquake was a lifelines earthquake. It was the images seen around the world of the collapsed Cypress Street viaduct, with the frantic and heroic efforts to pull survivors from the structure that was billowing smoke; the collapsed section of the San Francisco-Oakland Bay Bridge and subsequent home video of a car plunging off the open span; and the spectacular fire in the Marina District of San Francisco fed by a broken gas line. To many of the residents of the San Francisco Bay region, the relation of lifelines to the earthquake was characterized by sitting in the dark because of power outages, the inability to make telephone calls because of network congestion, and the slow and snarled traffic. Had the public been aware of the actions of the engineers and tradespeople working for the utilities and other lifeline organizations on the emergency response and restoration of lifelines, the lifeline characteristics of this earthquake would have been even more significant. Unobserved by the public were the warlike devastation in several electrical-power substations, the 13 miles of gas-distribution lines that had to be replaced in several communities, and the more than 1,200 leaks and breaks in water mains and service connections that had to be excavated and repaired. Like the 1971 San Fernando, Calif., earthquake, which was a seminal event for activity to improve the earthquake performance of lifelines, the 1989 Loma Prieta earthquake demonstrated that the tasks of preparing lifelines in 'earthquake country' were incomplete; indeed, new lessons had to be learned.
New "Risk-Targeted" Seismic Maps Introduced into Building Codes
Luco, Nicholas; Garrett, B.; Hayes, J.
2012-01-01
Throughout most municipalities of the United States, structural engineers design new buildings using the U.S.-focused International Building Code (IBC). Updated editions of the IBC are published every 3 years. The latest edition (2012) contains new "risk-targeted maximum considered earthquake" (MCER) ground motion maps, which are enabling engineers to incorporate a more consistent and better defined level of seismic safety into their building designs.
The Active Fault Parameters for Time-Dependent Earthquake Hazard Assessment in Taiwan
NASA Astrophysics Data System (ADS)
Lee, Y.; Cheng, C.; Lin, P.; Shao, K.; Wu, Y.; Shih, C.
2011-12-01
Taiwan is located at the boundary between the Philippine Sea Plate and the Eurasian Plate, with a convergence rate of ~80 mm/yr in a ~N118E direction. The plate motion is so active that earthquakes are very frequent. In the Taiwan area, disaster-inducing earthquakes often result from active faults, so it is important to understand the activity and hazard of active faults. The active faults in Taiwan are mainly located in the Western Foothills and the eastern Longitudinal Valley. The active fault distribution map published by the Central Geological Survey (CGS) in 2010 shows that there are 31 active faults on the island of Taiwan, some of which are related to past earthquakes. Many researchers have investigated these active faults and continuously updated the data and results, but few have integrated them for time-dependent earthquake hazard assessment. In this study, we gather previous research and field work results and integrate these data into an active fault parameter table for time-dependent earthquake hazard assessment. We gather the seismic profiles or relocated earthquakes of a fault and combine them with the fault trace on land to establish a 3D fault geometry model in a GIS system. We collect studies of fault source scaling in Taiwan and estimate the maximum magnitude from fault length or fault area. We use the characteristic earthquake model to evaluate the recurrence interval of active fault earthquakes. For the remaining parameters, we collect previous studies or historical references to complete our parameter table of active faults in Taiwan. WG08 performed a time-dependent earthquake hazard assessment of active faults in California: they established fault models, deformation models, earthquake rate models, and probability models, and then computed the probabilities of faults in California.
Following these steps, we obtain preliminary probability estimates of earthquake-related hazard for certain faults in Taiwan. By completing the active fault parameter table for Taiwan, we can apply it in time-dependent earthquake hazard assessment. The results can also give engineers a reference for design, and can be applied in seismic hazard maps to mitigate disasters.
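The renewal-model probability at the heart of such time-dependent assessments can be sketched as follows. This is a minimal illustration assuming a lognormal recurrence-interval distribution and invented parameter values; WGCEP-style models often use the Brownian passage time distribution instead.

```python
import math
from scipy import stats

def conditional_probability(t_elapsed, dt, mean_ri, cov=0.5):
    """P(rupture within the next dt years | no rupture for t_elapsed years),
    for a lognormal renewal model of the fault's recurrence interval."""
    # Lognormal parameters from the mean recurrence and coefficient of variation.
    sigma = math.sqrt(math.log(1.0 + cov**2))
    mu = math.log(mean_ri) - 0.5 * sigma**2
    dist = stats.lognorm(s=sigma, scale=math.exp(mu))
    survival = 1.0 - dist.cdf(t_elapsed)          # still quiet after t_elapsed
    return (dist.cdf(t_elapsed + dt) - dist.cdf(t_elapsed)) / survival

# Hypothetical fault: mean recurrence 150 yr, 100 yr elapsed since last rupture.
p_30yr = conditional_probability(t_elapsed=100.0, dt=30.0, mean_ri=150.0)
```

A longer exposure window necessarily raises the conditional probability, which is a useful sanity check on any implementation.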
Structure-specific scalar intensity measures for near-source and ordinary earthquake ground motions
Luco, N.; Cornell, C.A.
2007-01-01
Introduced in this paper are several alternative ground-motion intensity measures (IMs) that are intended for use in assessing the seismic performance of a structure at a site susceptible to near-source and/or ordinary ground motions. A comparison of such IMs is facilitated by defining the "efficiency" and "sufficiency" of an IM, both of which are criteria necessary for ensuring the accuracy of the structural performance assessment. The efficiency and sufficiency of each alternative IM, which are quantified via (i) nonlinear dynamic analyses of the structure under a suite of earthquake records and (ii) linear regression analysis, are demonstrated for the drift response of three different moderate- to long-period buildings subjected to suites of ordinary and of near-source earthquake records. One of the alternative IMs in particular is found to be relatively efficient and sufficient for the range of buildings considered and for both the near-source and ordinary ground motions. © 2007, Earthquake Engineering Research Institute.
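The regression-based quantification of efficiency and sufficiency can be illustrated on synthetic data; all numbers below are invented, and the paper's actual IMs and structural analyses are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "record suite": ln(IM) and ln(peak drift) for one structure,
# plus each record's magnitude (all values illustrative only).
ln_im = rng.normal(0.0, 0.5, size=40)
ln_drift = 1.0 * ln_im + rng.normal(0.0, 0.3, size=40)
mags = rng.uniform(6.0, 7.5, size=40)

# Efficiency: dispersion of residuals of ln(drift) regressed on ln(IM);
# a smaller dispersion means fewer records are needed for a given precision.
b, a = np.polyfit(ln_im, ln_drift, 1)
residuals = ln_drift - (a + b * ln_im)
dispersion = residuals.std(ddof=2)

# Sufficiency: residuals should show no trend with magnitude (or distance),
# so that the IM alone carries the record information that matters.
trend_with_mag, _ = np.polyfit(mags, residuals, 1)
```

In practice the residuals come from nonlinear dynamic analyses rather than a synthetic linear model, but the two diagnostics are computed the same way.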
Research on the spatial analysis method of seismic hazard for island
NASA Astrophysics Data System (ADS)
Jia, Jing; Jiang, Jitong; Zheng, Qiuhong; Gao, Huiying
2017-05-01
Seismic hazard analysis (SHA) is a key component of earthquake disaster prevention for island engineering. Its results provide parameters for seismic design at the micro scale, and are also prerequisite work for the island conservation planning's earthquake and comprehensive disaster prevention planning at the macro scale, in the exploitation and construction of both inhabited and uninhabited islands. The existing seismic hazard analysis methods are compared in their application, and their applicability and limitations for islands are analysed. A specialized spatial analysis method of seismic hazard for islands (SAMSHI) is then given to support further earthquake disaster prevention planning, based on spatial analysis tools in GIS and a fuzzy comprehensive evaluation model. The basic spatial database of SAMSHI includes fault data, historical earthquake records, geological data, and Bouguer gravity anomaly data, which are the data sources for the 11 indices of the fuzzy comprehensive evaluation model; these indices are calculated by a spatial analysis model constructed on ArcGIS's ModelBuilder platform.
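The fuzzy comprehensive evaluation step can be sketched as a weighted composition of index membership vectors. The weights, index names, and membership values below are purely illustrative, using only three indices rather than the paper's eleven.

```python
import numpy as np

# Hypothetical weights for three hazard indices (e.g. fault density,
# historical seismicity, gravity-anomaly gradient); values are invented.
weights = np.array([0.5, 0.3, 0.2])

# Membership degree of each index in three hazard grades (low, medium, high).
membership = np.array([
    [0.1, 0.3, 0.6],
    [0.2, 0.5, 0.3],
    [0.4, 0.4, 0.2],
])

grades = weights @ membership        # weighted-average fuzzy composition
grades = grades / grades.sum()       # normalize to a membership vector
hazard_grade = ["low", "medium", "high"][int(grades.argmax())]
```

In a GIS workflow this evaluation runs per cell, with each index raster supplying the membership degrees.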
Jibson, R.W.; Harp, E.L.; Schulz, W.; Keefer, D.K.
2004-01-01
The 2002 M7.9 Denali fault, Alaska, earthquake triggered thousands of landslides, primarily rock falls and rock slides, that ranged in volume from rock falls of a few cubic meters to rock avalanches having volumes as great as 15 × 10^6 m^3. The pattern of landsliding was unusual; the number of slides was less than expected for an earthquake of this magnitude, and the landslides were concentrated in a narrow zone 30-km wide that straddled the fault rupture over its entire 300-km length. The large rock avalanches all clustered along the western third of the rupture zone where acceleration levels and ground-shaking frequencies are thought to have been the highest. Inferences about near-field strong shaking characteristics drawn from the interpretation of the landslide distribution are consistent with results of recent inversion modeling that indicate high-frequency energy generation was greatest in the western part of the fault rupture zone and decreased markedly to the east. © 2004, Earthquake Engineering Research Institute.
Predicted liquefaction of East Bay fills during a repeat of the 1906 San Francisco earthquake
Holzer, T.L.; Blair, J.L.; Noce, T.E.; Bennett, M.J.
2006-01-01
Predicted conditional probabilities of surface manifestations of liquefaction during a repeat of the 1906 San Francisco (M7.8) earthquake range from 0.54 to 0.79 in the area underlain by the sandy artificial fills along the eastern shore of San Francisco Bay near Oakland, California. Despite widespread liquefaction in 1906 of sandy fills in San Francisco, most of the East Bay fills were emplaced after 1906 without soil improvement to increase their liquefaction resistance. They have yet to be shaken strongly. Probabilities are based on the liquefaction potential index computed from 82 CPT soundings using median (50th percentile) estimates of PGA based on a ground-motion prediction equation. Shaking estimates consider both distance from the San Andreas Fault and local site conditions. The high probabilities indicate extensive and damaging liquefaction will occur in East Bay fills during the next M ≥ 7.8 earthquake on the northern San Andreas Fault. © 2006, Earthquake Engineering Research Institute.
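The liquefaction potential index underlying such probabilities is commonly computed with Iwasaki's depth-weighted integral, LPI = ∫ F(z) w(z) dz with w(z) = 10 − 0.5z over the top 20 m. A minimal sketch follows; the factor-of-safety profile here is invented, whereas the actual study derives it from the CPT soundings and PGA estimates.

```python
import numpy as np

def liquefaction_potential_index(depths_m, factor_of_safety):
    """Iwasaki-style LPI from a factor-of-safety-against-liquefaction profile."""
    z = np.asarray(depths_m, dtype=float)
    fs = np.asarray(factor_of_safety, dtype=float)
    severity = np.where(fs < 1.0, 1.0 - fs, 0.0)   # F(z): 0 where FS >= 1
    weight = np.clip(10.0 - 0.5 * z, 0.0, None)    # w(z): zero below 20 m
    y = severity * weight
    # Trapezoidal integration over depth.
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(z)))

z = np.linspace(0.0, 20.0, 41)
fs = np.where(z < 8.0, 0.7, 1.2)   # hypothetical liquefiable layer in top 8 m
lpi = liquefaction_potential_index(z, fs)
```

Values of LPI above about 15 are conventionally read as high liquefaction potential, which is the regime this hypothetical profile lands in.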
Developing ShakeCast statistical fragility analysis framework for rapid post-earthquake assessment
Lin, K.-W.; Wald, D.J.
2012-01-01
When an earthquake occurs, the U. S. Geological Survey (USGS) ShakeMap estimates the extent of potentially damaging shaking and provides overall information regarding the affected areas. The USGS ShakeCast system is a freely-available, post-earthquake situational awareness application that automatically retrieves earthquake shaking data from ShakeMap, compares intensity measures against users’ facilities, sends notifications of potential damage to responsible parties, and generates facility damage assessment maps and other web-based products for emergency managers and responders. We describe notable improvements of the ShakeMap and the ShakeCast applications. We present a design for comprehensive fragility implementation, integrating spatially-varying ground-motion uncertainties into fragility curves for ShakeCast operations. For each facility, an overall inspection priority (or damage assessment) is assigned on the basis of combined component-based fragility curves using pre-defined logic. While regular ShakeCast users receive overall inspection priority designations for each facility, engineers can access the full fragility analyses for further evaluation.
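Folding ground-motion uncertainty into a lognormal fragility curve can be sketched by widening the curve's log-standard deviation. This is a common approximation, not necessarily ShakeCast's exact scheme, and all parameter values below are invented.

```python
import math

def damage_probability(im, median_capacity, beta_capacity, beta_ground_motion=0.0):
    """P(damage state exceeded | estimated shaking level im), lognormal fragility.
    Ground-motion uncertainty is combined in quadrature with capacity dispersion."""
    beta = math.sqrt(beta_capacity**2 + beta_ground_motion**2)
    x = math.log(im / median_capacity) / beta
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))   # standard normal CDF

# A facility with median capacity 0.4 g, shaken at an estimated 0.3 g:
p_no_gm_unc = damage_probability(0.30, 0.40, 0.4)
p_with_gm_unc = damage_probability(0.30, 0.40, 0.4, beta_ground_motion=0.3)
```

Below the median capacity, adding ground-motion uncertainty raises the assessed probability, which is why ignoring ShakeMap uncertainty can understate inspection priority.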
Developing a Scenario for widespread use: Best practices, lessons learned
Perry, S.; Jones, L.; Cox, D.
2011-01-01
The ShakeOut Scenario is probably the most widely known and used earthquake scenario created to date. Much of the credit for its widespread dissemination and application lies with scenario development criteria that focused on the needs and involvement of end users and with a suite of products that tailored communication of the results to varied end users, who ranged from emergency managers to the general public, from corporations to grassroots organizations. Products were most effective when they were highly visual, when they emphasized the findings of social scientists, and when they communicated the experience of living through the earthquake. This paper summarizes the development criteria and the products that made the ShakeOut Scenario so widely known and used, and it provides some suggestions for future improvements. © 2011, Earthquake Engineering Research Institute.
Performance of San Fernando dams during 1994 Northridge earthquake
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bardet, J.P.; Davis, C.A.
1996-07-01
The 1994 Northridge and 1971 San Fernando Earthquakes subjected the Lower and Upper San Fernando Dams of the Van Norman Complex in the San Fernando Valley, Calif., to strong near-source ground motions. In 1994, these earth dams, which were out of service and retained only a few meters of water, extensively cracked and settled due to the liquefaction of their hydraulic fill. The Lower San Fernando Dam moved over 15 cm upstream as the hydraulic fill liquefied beneath its upstream slope. The Upper San Fernando Dam moved even more and deformed in a complicated three-dimensional pattern. The responses of the Lower and Upper San Fernando Dams during the 1994 Northridge Earthquake, although less significant than in 1971, provide the geotechnical engineering community with two useful case histories.
Chapter F. The Loma Prieta, California, Earthquake of October 17, 1989 - Marina District
O'Rourke, Thomas D.
1992-01-01
During the earthquake, a total land area of about 4,300 km2 was shaken with seismic intensities that can cause significant damage to structures. The area of the Marina District of San Francisco is only 4.0 km2--less than 0.1 percent of the area most strongly affected by the earthquake--but its significance with respect to engineering, seismology, and planning far outstrips its proportion of shaken terrain and makes it a centerpiece for lessons learned from the earthquake. The Marina District provides perhaps the most comprehensive case history of seismic effects at a specific site developed for any earthquake. The reports assembled in this chapter, which provide an account of these seismic effects, constitute a unique collection of studies of site, infrastructure, and societal response that cover virtually all aspects of the earthquake, ranging from incoming ground waves to the outgoing airwaves used for emergency communication. The Marina District encompasses the area bounded by San Francisco Bay on the north, the Presidio on the west, and Lombard Street and Van Ness Avenue on the south and east, respectively. Nearly all of the earthquake damage in the Marina District, however, occurred within a considerably smaller area of about 0.75 km2, bounded by San Francisco Bay and Baker, Chestnut, and Buchanan Streets. At least five major aspects of earthquake response in the Marina District are covered by the reports in this chapter: (1) dynamic site response, (2) soil liquefaction, (3) lifeline performance, (4) building performance, and (5) emergency services.
Principles for selecting earthquake motions in engineering design of large dams
Krinitzsky, E.L.; Marcuson, William F.
1983-01-01
This report gives a synopsis of the various tools and techniques used in selecting earthquake ground motion parameters for large dams. It presents 18 charts giving newly developed relations for acceleration, velocity, and duration versus site earthquake intensity for near- and far-field hard and soft sites and earthquakes having magnitudes above and below 7. The material for this report is based on procedures developed at the Waterways Experiment Station. Although these procedures are suggested primarily for large dams, they may also be applicable for other facilities. Because no standard procedure exists for selecting earthquake motions in engineering design of large dams, a number of precautions are presented to guide users. The selection of earthquake motions depends on which of two types of engineering analysis is performed. A pseudostatic analysis uses a coefficient usually obtained from an appropriate contour map, whereas a dynamic analysis uses either accelerograms assigned to a site or specified response spectra. Each type of analysis requires significantly different input motions. All selections of design motions must allow for the lack of representative strong motion records, especially near-field motions from earthquakes of magnitude 7 and greater, as well as an enormous spread in the available data. Limited data must be projected and its spread bracketed in order to fill in the gaps and to assure that there will be no surprises. Because each site may have differing special characteristics in its geology, seismic history, attenuation, recurrence, interpreted maximum events, etc., an integrated approach gives best results. Each part of the site investigation requires a number of decisions. In some cases, the decision to use a 'least work' approach may be suitable, simply assuming the worst of several possibilities and testing for it. Because there are no standard procedures to follow, multiple approaches are useful.
For example, peak motions at a site may be obtained from several methods that involve magnitude of earthquake, distance from source, and corresponding motions; or, alternately, peak motions may be assigned from other correlations based on earthquake intensity. Various interpretations exist to account for duration, recurrence, effects of site conditions, etc. Comparison of the various interpretations can be very useful. Probabilities can be assigned; however, they can present very serious problems unless appropriate care is taken when data are extrapolated beyond their data base. In making deterministic judgments, probabilistic data can provide useful guidance in estimating the uncertainties of the decision. The selection of a design ground motion for large dams is based in the end on subjective judgments which should depend, to an important extent, on the consequences of failure. Usually, use of a design value of ground motion representing a mean plus one standard deviation of possible variation in the mean of the data puts one in a conservative position. If failure presents no hazard to life, lower values of design ground motion may be justified, providing there are cost benefits and the risk is acceptable to the owner. Where a large hazard to life exists (i.e., a dam above an urbanized area) one may wish to use values of design ground motion that approximate the very worst case. The selection of a design ground motion must be appropriate for its particular set of circumstances.
NASA Astrophysics Data System (ADS)
Singh, R. P.; Ahmad, R.
2015-12-01
A comparison of observed ground motion parameters of the recent Gorkha, Nepal earthquake of 25 April 2015 (Mw 7.8) with the parameters predicted using existing attenuation relations for the Himalayan region will be presented. The earthquake took about 8,000 lives and destroyed thousands of poor-quality buildings, and it was felt by millions of people living in Nepal, China, India, Bangladesh, and Bhutan. Knowledge of ground motion parameters is very important in developing seismic codes for seismically prone regions like the Himalaya for better design of buildings. The ground motion parameters recorded in the mainshock and aftershocks are compared with attenuation relations for the Himalayan region; the predicted parameters show good correlation with the observed ones. The results will be of great use to civil engineers in updating existing building codes in the Himalayan and surrounding regions, and also for the evaluation of seismic hazards. The results clearly show that only attenuation relations developed for the Himalayan region should be used; relations based on other regions fail to provide good estimates of the observed ground motion parameters.
Updating the USGS seismic hazard maps for Alaska
Mueller, Charles; Briggs, Richard; Wesson, Robert L.; Petersen, Mark D.
2015-01-01
The U.S. Geological Survey makes probabilistic seismic hazard maps and engineering design maps for building codes, emergency planning, risk management, and many other applications. The methodology considers all known earthquake sources with their associated magnitude and rate distributions. Specific faults can be modeled if slip-rate or recurrence information is available. Otherwise, areal sources are developed from earthquake catalogs or GPS data. Sources are combined with ground-motion estimates to compute the hazard. The current maps for Alaska were developed in 2007, and included modeled sources for the Alaska-Aleutian megathrust, a few crustal faults, and areal seismicity sources. The megathrust was modeled as a segmented dipping plane with segmentation largely derived from the slip patches of past earthquakes. Some megathrust deformation is aseismic, so recurrence was estimated from seismic history rather than plate rates. Crustal faults included the Fairweather-Queen Charlotte system, the Denali–Totschunda system, the Castle Mountain fault, two faults on Kodiak Island, and the Transition fault, with recurrence estimated from geologic data. Areal seismicity sources were developed for Benioff-zone earthquakes and for crustal earthquakes not associated with modeled faults. We review the current state of knowledge in Alaska from a seismic-hazard perspective, in anticipation of future updates of the maps. Updated source models will consider revised seismicity catalogs, new information on crustal faults, new GPS data, and new thinking on megathrust recurrence, segmentation, and geometry. Revised ground-motion models will provide up-to-date shaking estimates for crustal earthquakes and subduction earthquakes in Alaska.
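The core probabilistic hazard computation, combining source rates with ground-motion exceedance probabilities, can be sketched as follows. The source rates, median PGAs, and scatter value are all invented; real maps integrate over full magnitude-distance distributions rather than three point sources.

```python
import math

def p_exceed(pga_target, median_pga, sigma_ln=0.6):
    """P(PGA > target) for one source, assuming lognormal ground-motion scatter."""
    x = (math.log(pga_target) - math.log(median_pga)) / sigma_ln
    return 0.5 * math.erfc(x / math.sqrt(2.0))

# Hypothetical sources: (annual rate, median PGA in g at the site from a GMPE).
sources = [(0.01, 0.25), (0.002, 0.45), (0.05, 0.08)]

target = 0.3  # g
annual_rate = sum(rate * p_exceed(target, med) for rate, med in sources)
prob_50yr = 1.0 - math.exp(-annual_rate * 50.0)   # Poisson occurrence assumption
```

Repeating this for a range of target levels traces out the hazard curve from which map values (e.g., 2% in 50 years) are read.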
Evolution of Dynamic Analysis in Geotechnical Earthquake Engineering
DOT National Transportation Integrated Search
1995-02-01
The Intermodal Surface Transportation Efficiency Act (ISTEA) of 1991 calls for a study of U.S. international border crossings. The objective of the study is to identify existing and emerging trade corridors and transportation subsystems that facilita...
ASBESTOS RELEASE DURING BUILDING DEMOLITION ACTIVITIES
The U.S. Environmental Protection Agency's (EPA) Risk Reduction Engineering Laboratory (RREL) monitored block-wide building demolition and debris disposal activities at Santa Cruz and Watsonsville, California following the 1989 earthquake; an implosion demolition of a 26-story bu...
Running SW4 On New Commodity Technology Systems (CTS-1) Platform
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rodgers, Arthur J.; Petersson, N. Anders; Pitarka, Arben
We have recently been running earthquake ground motion simulations with SW4 on the new capacity computing systems, called the Commodity Technology Systems - 1 (CTS-1), at Lawrence Livermore National Laboratory (LLNL). SW4 is a fourth-order time domain finite difference code developed by LLNL and distributed by the Computational Infrastructure for Geodynamics (CIG). SW4 simulates seismic wave propagation in complex three-dimensional Earth models including anelasticity and surface topography. We are modeling near-fault earthquake strong ground motions for the purposes of evaluating the response of engineered structures, such as nuclear power plants and other critical infrastructure. Engineering analysis of structures requires the inclusion of high frequencies which can cause damage, but these are often difficult to include in simulations because of the large memory needed to model fine grid spacing on large domains.
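The memory pressure comes from the grid-spacing requirement h ≈ v_min / (points per wavelength × f_max): halving the minimum velocity or doubling the target frequency multiplies the point count by eight. A rough estimate for a generic uniform finite-difference grid follows; SW4's mesh refinement and per-point storage differ in detail, and every number here is an assumption.

```python
def fd_memory_estimate(domain_km=(100.0, 100.0, 40.0), vmin_ms=500.0,
                       fmax_hz=5.0, points_per_wavelength=8,
                       bytes_per_point=160):
    """Rough (spacing, point count, GB) for a uniform finite-difference grid."""
    h = vmin_ms / (points_per_wavelength * fmax_hz)        # grid spacing, m
    nx, ny, nz = (int(d * 1000.0 / h) + 1 for d in domain_km)
    n_points = nx * ny * nz
    return h, n_points, n_points * bytes_per_point / 1e9   # GB

h, npts, gb = fd_memory_estimate()
```

With these assumed values the spacing is 12.5 m and the memory estimate runs to tens of terabytes, which is why such runs need large distributed-memory machines.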
Strong-Motion Program report, January-December 1985
Porcella, R. L.
1989-01-01
This Program Report contains preliminary information on the nature and availability of strong-motion data recorded by the U.S. Geological Survey (USGS). The Strong-Motion Program is operated by the USGS in cooperation with numerous Federal, State, and local agencies and private organizations. Major objectives of this program are to record both strong ground motion and the response of various types of engineered structures during earthquakes, and to disseminate this information and data to the international earthquake-engineering research and design community. This volume contains a summary of the accelerograms recovered from the USGS National Strong-Motion Instrumentation Network during 1985, summaries of recent strong-motion publications, notes on the availability of digitized data, and general information related to the USGS and other strong-motion programs. The data summary in table 1 contains information on all USGS accelerograms recovered (though not necessarily recorded) during 1985; event data are taken from "Preliminary Determination of Epicenters," published by the USGS.
Superelastic SMA U-shaped dampers with self-centering functions
NASA Astrophysics Data System (ADS)
Wang, Bin; Zhu, Songye
2018-05-01
As high-performance metallic materials, shape memory alloys (SMAs) have been investigated increasingly by the earthquake engineering community in recent years, because of their remarkable self-centering (SC) and energy-dissipating capabilities. This paper systematically presents an experimental study on a novel superelastic SMA U-shaped damper (SMA-UD) with SC function under cyclic loading. The mechanical properties, including strength, SC ability, and energy-dissipating capability with varying loading amplitudes and strain rates are evaluated. Test results show that excellent and stable flag-shaped hysteresis loops are exhibited in multiple loading cycles. Strain rate has a negligible effect on the cyclic behavior of the SMA-UD within the dynamic frequency range of typical interest in earthquake engineering. Furthermore, a numerical investigation is performed to understand the mechanical behavior of the SMA-UD. The numerical model is calibrated against the experimental results with reasonable accuracy. Then, the stress–strain states with different phase transformations are also discussed.
NASA Astrophysics Data System (ADS)
Cramer, C. H.; Kutliroff, J.; Dangkua, D.
2011-12-01
The M5.8 Mineral, Virginia earthquake of August 23, 2011 is the largest instrumentally recorded earthquake in eastern North America since the 1988 M5.9 Saguenay, Canada earthquake. Historically, a similar magnitude earthquake occurred on May 31, 1897 at 18:58 UTC in western Virginia, west of Roanoke. Paleoseismic evidence for larger magnitude earthquakes has also been found in the central Virginia region. The Next Generation Attenuation (NGA) East project to develop new ground motion prediction equations for stable continental regions (SCRs), including eastern North America (ENA), is ongoing at the Pacific Earthquake Engineering Research Center, funded by the U.S. Nuclear Regulatory Commission, the U.S. Geological Survey, the Electric Power Research Institute, and the U.S. Department of Energy. The available recordings from the M5.8 Virginia earthquake are being added to the NGA East ground motion database. Close-in (less than 100 km) strong motion recordings are particularly interesting for both ground motion and stress drop estimates, as most close-in broadband seismometers clipped on the mainshock. A preliminary estimate of the corner frequency for the M5.8 Virginia earthquake, ~0.7 Hz, has been obtained from a strong motion record 57 km from the mainshock epicenter. For an M5.8 earthquake, this suggests a Brune stress drop of ~300 bars for the Virginia event. Very preliminary comparisons using accelerometer data suggest the ground motions from the M5.8 Virginia earthquake agree well with current ENA ground motion prediction equations (GMPEs) at short periods (PGA, 0.2 s) and are below the GMPEs at longer periods (1.0 s), which is the same relationship seen from other recent M5 ENA earthquakes. We will present observed versus GMPE ground motion comparisons for all the ground motion observations and stress drop estimates from strong motion recordings at distances less than 100 km. A review of the completed NGA East ENA ground motion database will also be provided.
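The corner-frequency-to-stress-drop step follows the standard Brune source relations: source radius r = 2.34β/(2πf_c) and Δσ = (7/16)·M0/r³. A sketch follows, where the shear-wave velocity and the moment-magnitude conversion constant are typical assumed values rather than those of the study.

```python
import math

def brune_stress_drop_bars(mw, corner_freq_hz, beta_ms=3500.0):
    """Brune stress drop (bars) from moment magnitude and corner frequency."""
    m0 = 10.0 ** (1.5 * mw + 9.05)                         # seismic moment, N*m
    r = 2.34 * beta_ms / (2.0 * math.pi * corner_freq_hz)  # source radius, m
    dsigma_pa = (7.0 / 16.0) * m0 / r**3
    return dsigma_pa * 1e-5                                # 1 bar = 1e5 Pa

stress_bars = brune_stress_drop_bars(5.8, 0.7)
```

With these assumptions the result lands in the few-hundred-bar range, the same order as the ~300 bars quoted above; the Δσ ∝ f_c³ dependence makes the estimate quite sensitive to the corner-frequency pick.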
PAGER-CAT: A composite earthquake catalog for calibrating global fatality models
Allen, T.I.; Marano, K.D.; Earle, P.S.; Wald, D.J.
2009-01-01
We have described the compilation and contents of PAGER-CAT, an earthquake catalog developed principally for calibrating earthquake fatality models. It brings together information from a range of sources in a comprehensive, easy to use digital format. Earthquake source information (e.g., origin time, hypocenter, and magnitude) contained in PAGER-CAT has been used to develop an Atlas of ShakeMaps of historical earthquakes (Allen et al. 2008) that can subsequently be used to estimate the population exposed to various levels of ground shaking (Wald et al. 2008). These measures will ultimately yield improved earthquake loss models employing the uniform hazard mapping methods of ShakeMap. Currently PAGER-CAT does not consistently contain indicators of landslide and liquefaction occurrence prior to 1973. In future PAGER-CAT releases we plan to better document the incidence of these secondary hazards. This information is contained in some existing global catalogs but is far from complete and often difficult to parse. Landslide and liquefaction hazards can be important factors contributing to earthquake losses (e.g., Marano et al. unpublished). Consequently, the absence of secondary hazard indicators in PAGER-CAT, particularly for events prior to 1973, could be misleading to some users concerned with ground-shaking-related losses. We have applied our best judgment in the selection of PAGER-CAT's preferred source parameters and earthquake effects. We acknowledge that the creation of a composite catalog always requires subjective decisions, but we believe PAGER-CAT represents a significant step forward in bringing together the best available estimates of earthquake source parameters and reports of earthquake effects. All information considered in PAGER-CAT is stored as provided in its native catalog so that other users can modify PAGER preferred parameters based on their specific needs or opinions.
As with all catalogs, the values of some parameters listed in PAGER-CAT are highly uncertain, particularly the casualty numbers, which must be regarded as estimates rather than firm numbers for many earthquakes. Consequently, we encourage contributions from the seismology and earthquake engineering communities to further improve this resource via the Wikipedia page and personal communications, for the benefit of the whole community.
NASA Astrophysics Data System (ADS)
Sadeghi, H.
2015-12-01
Bridges are major elements of infrastructure in all societies. Their safety and continued serviceability guarantee transportation and emergency access in urban and rural areas. However, these important structures are subject to earthquake-induced damage in both structure and foundations. The basic approaches to the proper support of foundations are (a) distribution of imposed loads to the foundation in a way that it can resist those loads without excessive settlement or failure; (b) modification of the foundation ground with various available methods; and (c) a combination of (a) and (b). Engineers face the task of designing foundations that meet all safety and serviceability criteria, but when there are numerous environmental and financial constraints, the use of some traditional methods becomes inevitable. This paper explains the application of timber piles to improve ground resistance to liquefaction and to secure the abutments of short- to medium-length bridges in an earthquake- and liquefaction-prone area on Bohol Island, Philippines. The limitations of the common ground improvement methods (i.e., injection, dynamic compaction), owing to either environmental or financial concerns, along with the abundance of timber in the area, led the engineers to use a network of timber piles behind the backwalls of the bridge abutments. The suggested timber pile network is simulated by numerical methods and its safety is examined. The results show that the compaction caused by driving the piles and the bearing capacity provided by the timbers reduce the settlement and lateral movements due to service and earthquake-induced loads.
Seismic Analysis of Intake Towers
1982-10-01
Experiment Station (WES) under the sponsorship of the Directorate of Civil Works of the Office, Chief of Engineers, U.S. Army. The work was funded under ... the structural capacity of the intake tower are contained in Engineer Technical Letter (ETL) 1110-2-265 ... Civil Systems Incorporated, "Dynamic ... Berkeley, Calif. ... 1975. "Earthquake Resistant Design of Intake-Outlet Towers," Journal of the Structural Division, American Society of Civil Engineers
Processing of strong-motion accelerograms: Needs, options and consequences
Boore, D.M.; Bommer, J.J.
2005-01-01
Recordings from strong-motion accelerographs are of fundamental importance in earthquake engineering, forming the basis for all characterizations of ground shaking employed for seismic design. The recordings, particularly those from analog instruments, invariably contain noise that can mask and distort the ground-motion signal at both high and low frequencies. For any application of recorded accelerograms in engineering seismology or earthquake engineering, it is important to identify the presence of this noise in the digitized time-history and its influence on the parameters that are to be derived from the records. If the parameters of interest are affected by noise then appropriate processing needs to be applied to the records, although it must be accepted from the outset that it is generally not possible to recover the actual ground motion over a wide range of frequencies. There are many schemes available for processing strong-motion data and it is important to be aware of the merits and pitfalls associated with each option. Equally important is to appreciate the effects of the procedures on the records in order to avoid errors in the interpretation and use of the results. Options for processing strong-motion accelerograms are presented, discussed and evaluated from the perspective of engineering application. © 2004 Elsevier Ltd. All rights reserved.
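A typical processing choice is zero-phase bandpass filtering to suppress long-period noise before integrating to velocity and displacement. A sketch on a synthetic record follows; the corner frequencies and the noise model are illustrative only, and real processing choices are record-specific.

```python
import numpy as np
from scipy import signal

# Synthetic accelerogram: a 1 Hz pulse-like "signal" plus a long-period
# noise component of the kind that corrupts doubly integrated displacements.
dt = 0.01
t = np.arange(0.0, 40.0, dt)
accel = np.exp(-0.5 * ((t - 10.0) / 1.5) ** 2) * np.sin(2 * np.pi * 1.0 * t)
noisy = accel + 0.05 * np.sin(2 * np.pi * 0.02 * t)

# Zero-phase 4th-order Butterworth bandpass, 0.1-25 Hz passband.
sos = signal.butter(4, [0.1, 25.0], btype="bandpass", fs=1.0 / dt, output="sos")
filtered = signal.sosfiltfilt(sos, noisy)

# Crude double integration to displacement shows the effect of the filter:
# the long-period noise dominates the unfiltered displacement trace.
drift_before = np.abs(np.cumsum(np.cumsum(noisy)) * dt * dt).max()
drift_after = np.abs(np.cumsum(np.cumsum(filtered)) * dt * dt).max()
```

The low-frequency corner is the consequential choice: set it too high and real long-period ground motion is discarded along with the noise, which is the trade-off the paper discusses.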
A high resolution 3D velocity model beneath the Tokyo Metropolitan area by MeSO-net
NASA Astrophysics Data System (ADS)
Nakagawa, S.; Sakai, S.; Honda, R.; Kimura, H.; Hirata, N.
2015-12-01
Beneath the Tokyo metropolitan area, the Philippine Sea Plate (PSP) subducts and causes devastating mega-thrust earthquakes, such as the 1703 Genroku earthquake (M8.0) and the 1923 Kanto earthquake (M7.9). An M7 or greater (M7+) earthquake in this area at present has a high potential to produce serious loss of life and property with even greater global economic repercussions. The Central Disaster Management Council of Japan estimates that an M7+ earthquake will cause 23,000 fatalities and 95 trillion yen (about 1 trillion US$) economic loss. In 2012 we launched the Special Project for Reducing Vulnerability for Urban Mega Earthquake Disasters in collaboration with scientists, engineers, and social scientists at institutions nationwide. We analyze data from the dense seismic array called the Metropolitan Seismic Observation network (MeSO-net), which has 296 seismic stations with a spacing of 5 km (Sakai and Hirata, 2009; Kasahara et al., 2009). We applied the double-difference tomography method (Zhang and Thurber, 2003) and estimated the velocity structure and the upper boundary of the PSP (Nakagawa et al., 2010). The 2011 Tohoku-oki earthquake (M9.0) has activated seismicity in the Kanto region as well, providing better coverage of ray paths for tomographic analysis. We obtain much higher resolution velocity models from the whole dataset observed by MeSO-net between 2008 and 2015. A detailed image of the tomograms shows that the PSP contacts the Pacific plate at a depth of 50 km beneath northern Tokyo Bay. A variation of velocity along the oceanic crust suggests a dehydration reaction producing seismicity in the slab, which may be related to the M7+ earthquake. Acknowledgement: This study was supported by the Special Project for Reducing Vulnerability for Urban Mega Earthquake Disasters of MEXT, Japan and the Earthquake Research Institute cooperative research program.
Matrix Perturbation Techniques in Structural Dynamics
NASA Technical Reports Server (NTRS)
Caughey, T. K.
1973-01-01
Matrix perturbation techniques are developed which can be used in the dynamical analysis of structures where the range of numerical values in the matrices is extreme or where the nature of the damping matrix requires that complex-valued eigenvalues and eigenvectors be used. The techniques can be advantageously used in a variety of fields such as earthquake engineering, ocean engineering, aerospace engineering, and other fields concerned with the dynamical analysis of large complex structures or systems of second order differential equations. A number of simple examples are included to illustrate the techniques.
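The first-order eigenvalue perturbation result underlying such techniques, λ ≈ λ₀ + vᵀ·ΔA·v for a symmetric matrix with unit eigenvector v, can be checked numerically (the matrices below are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)

# Unperturbed symmetric "stiffness-like" matrix with well-separated eigenvalues.
A = np.diag([1.0, 4.0, 9.0])

# Small symmetric perturbation (e.g., a modeling change or added mass/stiffness).
dA = 1e-3 * rng.standard_normal((3, 3))
dA = 0.5 * (dA + dA.T)

lam, V = np.linalg.eigh(A)
# First-order estimate: shift each eigenvalue by the Rayleigh quotient of dA.
lam_approx = lam + np.array([V[:, i] @ dA @ V[:, i] for i in range(3)])
lam_exact = np.linalg.eigvalsh(A + dA)
max_err = np.abs(lam_approx - lam_exact).max()
```

The error is second order in the perturbation size, so for a 1e-3 perturbation it is on the order of 1e-6; this is what makes the approach attractive for large structural models where full re-analysis is costly.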
Horton, J. Wright; Chapman, Martin C.; Green, Russell A.
2015-01-01
This book grew out of a topical session on “Central Virginia Earthquakes of 2011: Geology, Geophysics, and Significance for Seismic Hazards in Eastern North America” at the 2012 Geological Society of America (GSA) Annual Meeting in Charlotte, North Carolina (USA). It also benefitted from related sessions at other meetings. The goal of this volume, The 2011 Mineral, Virginia, Earthquake, and Its Significance for Seismic Hazards in Eastern North America, is to bring together as much information as possible on lessons learned from this rare event. Chapters encompass a wide range of geoscience, engineering, and related studies of this earthquake and its effects from the epicentral area in central Virginia to Washington, D.C., and beyond. The intended audience is a broad spectrum of geoscientists, engineers, and decision makers interested in understanding earthquakes and seismic hazards in eastern North America and other intraplate settings. Chapters by Berti et al. (21), Chapman (2), Costain (8), Davenport et al. (15), Green et al. (9), Heller and Carter (10), Horton et al. (14), Hughes et al. (19), Powars et al. (23), Pratt et al. (16), Roeloffs et al. (7), Shah et al. (17), Stephenson et al. (3), Walsh et al. (18), and Wells et al. (12) are expansions of presentations at the 2012 GSA meeting. The volume also contains chapters from recent studies that were not presented at the GSA meeting, including those by Bobyarchick (22), Burton et al. (20), Dreiling and Mooney (5), Li et al. (11), McNamara et al. (4), Pollitz and Mooney (6), and Shahidi et al. (13).
Following an overview and synthesis by the volume editors (1), chapters are arranged under the topical headings “Seismology and Regional Effects,” “Earthquake Damage, Geotechnical, and Engineering Investigations,” “Aftershocks, Geophysical Imaging, and Modeling,” “Geologic Investigations—Epicentral Area,” and “Geologic Investigations—Central Virginia Seismic Zone and Nearby Faults.” We thank the authors for their contributions and the many scientists and engineers who contributed time and expertise in reviewing manuscripts to substantially improve the quality of the volume. These reviewers include Gail Atkinson, Christopher Bailey, Richard Berquist, Kimberly Blisniuk, Paul Bodin, Aaron Bradshaw, Clive Collins, Ariel Conn, Randy Cox, Haitham Dawood, James Dewey, John Ebel, David Fenster, Alexander Gates, Kathleen Haller, Gregory Hancock, Robert Hatcher, William Henika, Paul Hsieh, Steven Jaumé, Jeffrey Kimball, Charles Langston, Jongwon Lee, Andrea Llenos, John McBride, Scott Olson, Michael Oskin, Brent Owens, Gilles Peltzer, Mark Quigley, Dhananjay Ravat, David Saftner, Arthur Snoke, Jamison Steidl, Kevin Stewart, Alice Stieve, Danielle Sumy, Ertugrul Taciroglu, Roy Van Arsdale, Mason Walters, Chiyuen Wang, Yang Wang, Richard Whittecar, Lorraine Wolf, Clint Wood, Liam Wotherspoon, and some anonymous reviewers.
NASA Astrophysics Data System (ADS)
Dandoulaki, M.; Kourou, A.; Panoutsopoulou, M.
2009-04-01
It is widely accepted that earthquake education is the way to earthquake protection. Nonetheless, experience demonstrates that knowing what to do does not necessarily result in better behaviour during a real earthquake. A research project titled "Seismopolis" - "Pilot Integrated System for Public Familiarization with Earthquakes and Information on Earthquake Protection" - aimed at improving people's behaviour through an appropriate amalgamation of knowledge transfer and virtually experiencing an earthquake situation. Seismopolis combines well-established education means such as books and leaflets with new technologies like earthquake simulation and virtual reality. It comprises a series of 5 main spaces that the visitor passes through one by one. Space 1. Reception and introductory information. Visitors are given fundamental information on earthquakes and earthquake protection, as well as on the appropriate behaviour in case of an earthquake. Space 2. Earthquake simulation room. Visitors experience an earthquake in a room: a typical kitchen is set on a shake table (3 m x 6 m planar triaxial shake table) and is shaken in both horizontal and vertical directions by playing back records of real or virtual earthquakes. Space 3. Virtual reality room. Visitors may virtually move around in the building or in the city after an earthquake disaster and take action as in a real-life situation, wearing stereoscopic glasses and using navigation tools. Space 4. Information and resources library. Visitors are offered the opportunity to learn more about earthquake protection. A series of means are available for this, some developed especially for Seismopolis (3 books, 2 CDs, a website and an interactive table game). Space 5. De-briefing area. Visitors may undergo a pedagogical and psychological evaluation at the end of their visit and be offered support if needed.
For the evaluation of the "Seismopolis" Centre, a pilot application of the complete complex took place with the participation of different groups (schoolchildren, university students, adults, elderly persons, emigrants and persons with special needs). This test period recorded positive impressions and reactions from the visitors and indicated the pedagogical and psychological appropriateness of the system. Seismopolis is the outcome of collaboration among public, academic and private partners and of a range of disciplines, namely seismologists, geologists, structural engineers, geographers, sociologists and psychologists. It is currently hosted by the Municipality of Rendis in Athens. More information on Seismopolis can be found at www.seismopolis.org.
The HayWired earthquake scenario—Engineering implications
Detweiler, Shane T.; Wein, Anne M.
2018-04-18
The HayWired Earthquake Scenario—Engineering Implications is the second volume of U.S. Geological Survey (USGS) Scientific Investigations Report 2017–5013, which describes the HayWired scenario, developed by USGS and its partners. The scenario is a hypothetical yet scientifically realistic earthquake sequence that is being used to better understand hazards for the San Francisco Bay region during and after a magnitude-7 earthquake (mainshock) on the Hayward Fault and its aftershocks. Analyses in this volume suggest that (1) 800 deaths and 16,000 nonfatal injuries result from shaking alone, plus property and direct business interruption losses of more than $82 billion from shaking, liquefaction, and landslides; (2) the building code is designed to protect lives, but even if all buildings in the region complied with current building codes, 0.4 percent could collapse, 5 percent could be unsafe to occupy, and 19 percent could have restricted use; (3) people expect, prefer, and would be willing to pay for greater resilience of buildings; (4) more than 22,000 people could require extrication from stalled elevators, and more than 2,400 people could require rescue from collapsed buildings; (5) the average east-bay resident could lose water service for 6 weeks, some for as long as 6 months; (6) older steel-frame high-rise office buildings and new reinforced-concrete residential buildings in downtown San Francisco and Oakland could be unusable for as long as 10 months; (7) about 450 large fires could result in a loss of residential and commercial building floor area equivalent to more than 52,000 single-family homes and cause property (building and content) losses approaching $30 billion; and (8) combining earthquake early warning (ShakeAlert) with “drop, cover, and hold on” actions could prevent as many as 1,500 nonfatal injuries out of 18,000 total estimated nonfatal injuries from shaking and liquefaction hazards.
The TRIPOD e-learning Platform for the Training of Earthquake Safety Assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coppari, S.; Di Pasquale, G.; Goretti, A.
2008-07-08
The paper summarizes the results of the in-progress EU project TRIPOD (Training Civil Engineers on Post-Earthquake Safety Assessment of Damaged Buildings), funded under the Leonardo Da Vinci program. The main theme of the project is the development of a methodology and a learning platform for the training of technicians involved in post-earthquake building safety inspections. In the event of a catastrophic earthquake, emergency building inspections constitute a major undertaking with severe social impact. Given the inevitable chaotic conditions and the urgent need for a great number of specialized individuals to carry out inspections, past experience indicates that inspection teams are often formed in an ad hoc manner, under stressful conditions, at varying levels of technical expertise and experience, sometimes impairing the reliability and consistency of the inspection results. Furthermore, each country has its own building damage and safety assessment methodology, developed according to its experience, laws, building technology and seismicity. This holds also for the partners participating in the project (Greece, Italy, Turkey, Cyprus), which all come from seismically sensitive Mediterranean countries. The project aims at alleviating the above shortcomings by designing and developing a training methodology and e-platform, forming a complete training program targeted at inspection engineers, specialized personnel and civil protection agencies. The e-learning platform will provide flexible and friendly authoring mechanisms, self-teaching and assessment capabilities, course and trainee management, etc. Courses will also be made available as stand-alone multimedia applications on CD and in the form of a complete pocket handbook. Moreover, the project will offer the possibility of upgrading different experiences and practices: a first step towards the harmonization of methodologies and tools of different countries sharing similar problems.
Finally, through wide dissemination activities, the project aims to ensure deployment and integration into existing earthquake mitigation policies and vocational training schemes.
The TRIPOD e-learning Platform for the Training of Earthquake Safety Assessment
NASA Astrophysics Data System (ADS)
Coppari, S.; Di Pasquale, G.; Goretti, A.; Papa, F.; Papa, S.; Paoli, G.; Pizza, A. G.; Severino, M.
2008-07-01
The paper summarizes the results of the in-progress EU project TRIPOD (Training Civil Engineers on Post-Earthquake Safety Assessment of Damaged Buildings), funded under the Leonardo Da Vinci program. The main theme of the project is the development of a methodology and a learning platform for the training of technicians involved in post-earthquake building safety inspections. In the event of a catastrophic earthquake, emergency building inspections constitute a major undertaking with severe social impact. Given the inevitable chaotic conditions and the urgent need for a great number of specialized individuals to carry out inspections, past experience indicates that inspection teams are often formed in an ad hoc manner, under stressful conditions, at varying levels of technical expertise and experience, sometimes impairing the reliability and consistency of the inspection results. Furthermore, each country has its own building damage and safety assessment methodology, developed according to its experience, laws, building technology and seismicity. This holds also for the partners participating in the project (Greece, Italy, Turkey, Cyprus), which all come from seismically sensitive Mediterranean countries. The project aims at alleviating the above shortcomings by designing and developing a training methodology and e-platform, forming a complete training program targeted at inspection engineers, specialized personnel and civil protection agencies. The e-learning platform will provide flexible and friendly authoring mechanisms, self-teaching and assessment capabilities, course and trainee management, etc. Courses will also be made available as stand-alone multimedia applications on CD and in the form of a complete pocket handbook. Moreover, the project will offer the possibility of upgrading different experiences and practices: a first step towards the harmonization of methodologies and tools of different countries sharing similar problems.
Finally, through wide dissemination activities, the project aims to ensure deployment and integration into existing earthquake mitigation policies and vocational training schemes.
An Earthquake Shake Map Routine with Low Cost Accelerometers: Preliminary Results
NASA Astrophysics Data System (ADS)
Alcik, H. A.; Tanircan, G.; Kaya, Y.
2015-12-01
Vast amounts of high-quality strong-motion data are indispensable inputs for analyses in geotechnical and earthquake engineering; however, the high cost of installing strong-motion systems is the biggest obstacle to their worldwide dissemination. In recent years, MEMS-based (micro-electro-mechanical systems) accelerometers have been used in seismological research as well as in earthquake engineering projects, largely because of the precision obtained in downsized instruments. In this research our primary goal is to enable the use of these low-cost instruments in the creation of shake maps immediately after a strong earthquake. The second goal is to develop software that automatically processes the real-time data coming from the rapid response network and creates the shake map. For those purposes, four MEMS sensors have been set up to deliver real-time data. Data transmission is done through 3G modems. A subroutine was coded in assembly language and embedded into the operating system of each instrument to create MiniSEED files in 1-second packages instead of 512-byte packages. The Matlab-based software calculates the strong-motion (SM) parameters every second and compares them with user-defined thresholds. A voting system embedded in the software captures the event if the total vote exceeds the threshold. The user interface of the software enables users to monitor the calculated SM parameters either in a table or in a graph (Figure 1). A small-scale and affordable rapid response network was created using four MEMS sensors, and the functionality of the software has been tested and validated using shake-table tests. The entire system was tested together with a reference sensor under real strong ground motion recordings as well as a series of sine waves with varying amplitude and frequency.
The successful realization of this software allowed us to set up a test network in Tekirdağ Province, the closest coastal point to the moderate-size earthquake activity in the Marmara Sea, Turkey.
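The per-second threshold-and-voting logic described in the abstract can be sketched roughly as follows. The trigger level, vote quota, weights, and function names here are all invented for illustration; the actual Matlab implementation and its parameter values are not given in the abstract.

```python
import numpy as np

# Illustrative sketch of a per-second trigger-and-voting scheme.
# Threshold values and weights below are assumptions, not the project's.
PGA_TRIGGER = 0.05    # per-station trigger level in g (assumed)
VOTES_NEEDED = 3      # total votes required to declare an event (assumed)

def station_vote(accel_window, weight=1):
    """Cast `weight` votes if the 1-s window's peak acceleration reaches
    the trigger level, else zero votes."""
    return weight if np.max(np.abs(accel_window)) >= PGA_TRIGGER else 0

def event_detected(windows, weights):
    """Declare an event when votes summed over stations reach the quota."""
    total = sum(station_vote(w, wt) for w, wt in zip(windows, weights))
    return total >= VOTES_NEEDED

t = np.linspace(0.0, 1.0, 100)              # one second of samples
strong = 0.10 * np.sin(2 * np.pi * 5 * t)   # shaking above the trigger
quiet = 0.001 * np.sin(2 * np.pi * 5 * t)   # ambient noise, below it

print(event_detected([strong, strong, strong, quiet], [1, 1, 1, 1]))  # True
print(event_detected([strong, quiet, quiet, quiet], [1, 1, 1, 1]))    # False
```

Requiring a quorum of stations rather than a single trigger is what makes such a scheme robust against local noise spikes at any one low-cost sensor.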
Implementing Nepal's national building code—A case study in patience and persistence
Arendt, Lucy; Hortacsu, Ayse; Jaiswal, Kishor; Bevington, John; Shrestha, Surya; Lanning, Forrest; Mentor-William, Garmalia; Naeem, Ghazala; Thibert, Kate
2017-01-01
The April 2015 Gorkha, Nepal, earthquake revealed the relative effectiveness of the Nepal Standard, or national building code (NBC), and irregular compliance with it in different parts of Nepal. Much of the damage to more than half a million of Nepal's residential structures may be attributed to the prevalence of owner-built or owner-supervised construction and the lack of owner and builder responsiveness to seismic risk and training in the appropriate means of complying with the NBC. To explain these circumstances, we review the protracted implementation of the NBC and the role played by one organization, the National Society for Earthquake Technology-Nepal (NSET), in the NBC's implementation. We also share observations on building code compliance made by individuals in Nepal participating in workshops led by the Earthquake Engineering Research Institute's 2014 class of Housner Fellows.
RRSM: The European Rapid Raw Strong-Motion Database
NASA Astrophysics Data System (ADS)
Cauzzi, C.; Clinton, J. F.; Sleeman, R.; Domingo Ballesta, J.; Kaestli, P.; Galanis, O.
2014-12-01
We introduce the European Rapid Raw Strong-Motion database (RRSM), a Europe-wide system that provides parameterised strong motion information, as well as access to waveform data, within minutes of the occurrence of strong earthquakes. The RRSM significantly differs from traditional earthquake strong motion dissemination in Europe, which has focused on providing reviewed, processed strong motion parameters, typically with significant delays. As the RRSM provides rapid open access to raw waveform data and metadata and does not rely on external manual waveform processing, RRSM information is tailored to seismologists and strong-motion data analysts, earthquake and geotechnical engineers, international earthquake response agencies and the educated general public. Access to the RRSM database is via a portal at http://www.orfeus-eu.org/rrsm/ that allows users to query earthquake information, peak ground motion parameters and amplitudes of spectral response; and to select and download earthquake waveforms. All information is available within minutes of any earthquake with magnitude ≥ 3.5 occurring in the Euro-Mediterranean region. Waveform processing and database population are performed using the waveform processing module scwfparam, which is integrated in SeisComP3 (SC3; http://www.seiscomp3.org/). Earthquake information is provided by the EMSC (http://www.emsc-csem.org/) and all the seismic waveform data is accessed at the European Integrated waveform Data Archive (EIDA) at ORFEUS (http://www.orfeus-eu.org/index.html), where all on-scale data is used in the fully automated processing. As the EIDA community is continually growing, the already significant number of strong motion stations is also increasing and the importance of this product is expected to also increase. Real-time RRSM processing started in June 2014, while past events have been processed in order to provide a complete database back to 2005.
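As an illustration of the kind of parameterisation such a processing module performs on each raw accelerogram (this is the generic textbook computation, not the scwfparam implementation), a single pseudo-spectral-acceleration ordinate can be obtained by integrating a damped single-degree-of-freedom oscillator over the record with Newmark's average-acceleration method:

```python
import numpy as np

def sdof_psa(acc, dt, period, damping=0.05):
    """Peak pseudo-spectral acceleration of a damped SDOF oscillator,
    u'' + 2*z*w*u' + w^2*u = -a_g, integrated with Newmark's
    average-acceleration scheme (unconditionally stable)."""
    w = 2.0 * np.pi / period
    c, k = 2.0 * damping * w, w * w
    u = v = 0.0
    a = -acc[0] - c * v - k * u
    kh = k + 2.0 * c / dt + 4.0 / dt**2      # effective stiffness
    umax = 0.0
    for ag in acc[1:]:
        ph = -ag + (4.0 / dt**2 + 2.0 * c / dt) * u + (4.0 / dt + c) * v + a
        un = ph / kh                         # displacement at next step
        vn = 2.0 * (un - u) / dt - v
        a = 4.0 * (un - u) / dt**2 - 4.0 * v / dt - a
        u, v = un, vn
        umax = max(umax, abs(u))
    return w * w * umax                      # PSA = w^2 * |u|_max

# Synthetic 1-Hz sine "ground motion": a very stiff (0.02-s) oscillator
# just tracks the ground (PSA ~ PGA), while a 1-s oscillator resonates.
dt = 0.002
t = np.arange(0.0, 4.0, dt)
acc = 0.3 * np.sin(2.0 * np.pi * 1.0 * t)
print(sdof_psa(acc, dt, 0.02))   # close to the 0.3 peak ground acceleration
print(sdof_psa(acc, dt, 1.00))   # several times larger, due to resonance
```

Repeating this over a grid of periods gives the "amplitudes of spectral response" the database serves alongside peak ground motion values.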
NASA Astrophysics Data System (ADS)
Tanaka, Y.; Hirayama, Y.; Kuroda, S.; Yoshida, M.
2015-12-01
People without severe disaster experience inevitably forget even an extraordinary one like 3.11 as time passes. Therefore, to build a resilient society, an ingenious attempt to keep people's memory of disaster from fading away is necessary. Since 2011, we have been carrying out earthquake disaster drills for residents of high-rise apartments, for schoolchildren, for citizens of coastal areas, etc. Using a portable earthquake simulator (1), the drill consists of three parts. The first is a short lecture explaining the characteristic quakes Japanese people can expect in the future; the second is a reliving experience of major earthquakes that have hit Japan since 1995; and the third is a short lecture on preparations that can be made at home and/or in an office. For the quake experience, although the motion is two-dimensional, real earthquake observation records are used to control the simulator so that people can relive different kinds of earthquake, including the long-period motion of skyscrapers. Feedback on the drill is always positive, because participants understand that reliving the quake experience with proper lectures is one of the best methods to communicate past disasters to their families and to pass them on to the next generation. There are several kinds of disaster archive serving as inheritance, such as pictures, movies, documents, interviews, and so on. In addition to them, here we propose to construct 'the archive of the quake experience', which compiles observed data ready to be relived with the simulator. We would like to show some movies of our quake drill in the presentation. Reference: (1) Kuroda, S. et al. (2012), "Development of portable earthquake simulator for enlightenment of disaster preparedness", 15th World Conference on Earthquake Engineering 2012, Vol. 12, 9412-9420.
Yehle, Lynn A.
1974-01-01
A program to study the engineering geology of most of the larger Alaska coastal communities and to evaluate their earthquake and other geologic hazards was started following the 1964 Alaska earthquake; this report about Sitka and vicinity is a product of that program. Field-study methods were of a reconnaissance nature, and thus the interpretations in the report are subject to revision as further information becomes available. This report can provide broad geologic guidelines for planners and engineers during preparation of land-use plans. The use of this information should lead to minimizing future loss of life and property due to geologic hazards, especially during very large earthquakes. The landscape of Sitka and the surrounding area is characterized by numerous islands and a narrow strip of gently rolling ground adjacent to rugged mountains; steep valleys and some fiords cut sharply into the mountains. A few valley floors are wide and flat and grade into moderate-sized deltas. Glaciers throughout southeastern Alaska and elsewhere became vastly enlarged during the Pleistocene Epoch. The Sitka area presumably was covered by ice several times; glaciers deeply eroded some valleys and removed fractured bedrock along some faults. The last major deglaciation occurred sometime before 10,000 years ago. Crustal rebound believed to be related to glacial melting caused land emergence at Sitka of at least 35 feet (10.7 m) relative to present sea level. Bedrock at Sitka and vicinity is composed mostly of bedded, hard, dense graywacke and some argillite. Beds strike predominantly northwest and are vertical or steeply dipping. Locally, bedded rocks are cut by dikes of fine-grained igneous rock. Most bedrock is of Jurassic and Cretaceous age. Eight types of surficial deposits of Quaternary age were recognized.
Below altitudes of 35 feet (10.7 m), the dominant deposits are those of modern and elevated shores and deltas; at higher altitudes, widespread muskeg overlies a mantle of volcanic ash which commonly overlies glacial drift. Alluvial deposits are minor. Man-emplaced embankment fill, chiefly sandy gravel, covers many muskeg and former offshore areas; quarried blocks of graywacke are placed to form breakwaters and to edge large areas of embankment fill and modified ground. The geologic structure of the area is known only in general outlines. Most bedded Mesozoic rocks probably are part of broad northwest-trending complexes of anticlines and synclines. Intrusion of large bodies of plutonic igneous rocks occurred in Tertiary and Cretaceous time. Extensive faulting is suggested by the numerous linear to gently curving patterns of some fiords, lakes, and valleys, and by a group of Holocene volcanoes and cinder cones. Two major northwest-striking fault zones are most prominent: (1) the apparently inactive Chichagof-Sitka fault, about 2.5 miles (4.0 km) northeast of Sitka, and (2) part of the active 800-mile- (1,200-km) long Fairweather-Queen Charlotte Islands fault system, lying about 30 miles (48 km) southwest of the city. Many earthquakes have been reported as felt at Sitka since 1832, when good records were first maintained; several shocks were very strong, but none of them caused severe damage. The closest major earthquake (magnitude about 7.3) causing some damage to the city occurred July 30, 1972, and had an epicenter about 30 miles (48 km) to the southwest. Movement along the Fairweather-Queen Charlotte Islands fault system apparently caused most of the earthquakes felt at Sitka. The probability of destructive earthquakes at Sitka is unknown.
The tectonics of the region and the seismic record suggest that sometime in the future an earthquake of a magnitude of about 8 and related to the Fairweather-Queen Charlotte Islands fault system probably will occur in or near the area. Effects from some nearby major earthquakes could cause substantial damage at Sitka. Eight possible effects are as follows: 1. Sudden dis
Economic Impacts of Infrastructure Damages on Industrial Sector
NASA Astrophysics Data System (ADS)
Kajitani, Yoshio
This paper proposes a basic model for evaluating economic impacts on industrial sectors under conditions in which multiple infrastructures are simultaneously damaged during earthquake disasters. In particular, focusing on the available economic data developed at the smallest spatial scale in Japan (small-area statistics), an economic loss estimation model based on the small-area statistics and its applicability are investigated. Specifically, a loss estimation framework utilizing survey results on firms' activities under electricity, water and gas disruptions, together with route choice models from transportation engineering, is applied to the case of the 2004 Mid-Niigata Earthquake.
Open System for Earthquake Engineering Simulation - Home Page
NASA Astrophysics Data System (ADS)
Klose, C. D.
2006-12-01
This presentation emphasizes the dualism of natural resources exploitation and economic growth versus geomechanical pollution and risks of human-triggered earthquakes. Large-scale geoengineering activities, e.g., mining, reservoir impoundment, oil/gas production, water exploitation or fluid injection, alter pre-existing lithostatic stress states in the earth's crust and are anticipated to trigger earthquakes. Such processes of in-situ stress alteration are termed geomechanical pollution. Moreover, since the 19th century more than 200 earthquakes have been documented worldwide with a seismic moment magnitude of 4.5
Hazus® estimated annualized earthquake losses for the United States
Jaiswal, Kishor; Bausch, Doug; Rozelle, Jesse; Holub, John; McGowan, Sean
2017-01-01
Large earthquakes can cause social and economic disruption that can be unprecedented to any given community, and the full recovery from these impacts may or may not always be achievable. In the United States (U.S.), the 1994 M6.7 Northridge earthquake in California remains the third costliest disaster in U.S. history; and it was one of the most expensive disasters for the federal government. Internationally, earthquakes in the last decade alone have claimed tens of thousands of lives and caused hundreds of billions of dollars of economic impact throughout the globe (~90 billion U.S. dollars (USD) from 2008 M7.9 Wenchuan China, ~20 billion USD from 2010 M8.8 Maule earthquake in Chile, ~220 billion USD from 2011 M9.0 Tohoku Japan earthquake, ~25 billion USD from 2011 M6.3 Christchurch New Zealand, and ~22 billion USD from 2016 M7.0 Kumamoto Japan). Recent earthquakes show a pattern of steadily increasing damages and losses that are primarily due to three key factors: (1) significant growth in earthquake-prone urban areas, (2) vulnerability of the older building stock, including poorly engineered non-ductile concrete buildings, and (3) an increased interdependency in terms of supply and demand for the businesses that operate among different parts of the world. In the United States, earthquake risk continues to grow with increased exposure of population and development even though the earthquake hazard has remained relatively stable except for the regions of induced seismic activity. Understanding the seismic hazard requires studying earthquake characteristics and locales in which they occur, while understanding the risk requires an assessment of the potential damage from earthquake shaking to the built environment and to the welfare of people—especially in high-risk areas. 
Estimating the varying degree of earthquake risk throughout the United States is critical for informed decision-making on mitigation policies, priorities, strategies, and funding levels in the public and private sectors. For example, potential losses to new buildings may be reduced by proper land-use planning, applying most current seismic design codes and using new technologies and specialized construction techniques. However, decisions to spend money on any of those solutions require benefit and cost comparison against the perceived risk. Previous versions of the FEMA 366 studies are the only nationally accepted criteria and methodology for comparing seismic risk across regions.
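The annualized-loss idea behind such Hazus estimates reduces to expected-value arithmetic: weight each scenario's loss by its annual rate of occurrence and sum. The losses and rates below are invented for illustration, not figures from the FEMA 366 studies.

```python
# Annualized earthquake loss (AEL) as expected-value arithmetic:
# AEL = sum over scenarios of (loss_i * annual rate_i).
# Scenario losses and rates are invented for illustration.
scenarios = [
    (50e9, 1 / 500),   # rare large event: $50B loss, ~1-in-500-yr rate
    (5e9, 1 / 50),     # moderate event: $5B loss, ~1-in-50-yr rate
    (0.2e9, 1 / 5),    # frequent small event: $0.2B loss, ~1-in-5-yr rate
]
ael = sum(loss * rate for loss, rate in scenarios)
print(f"${ael / 1e6:.0f} million per year")  # $240 million per year
```

The example shows why a rare catastrophic event and a frequent small one can contribute comparably to the annualized figure, which is what makes AEL a useful single number for comparing risk across regions.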
NASA Astrophysics Data System (ADS)
Castaldini, D.; Genevois, R.; Panizza, M.; Puccinelli, A.; Berti, M.; Simoni, A.
This paper illustrates research addressing earthquake-induced surface effects by means of a multidisciplinary approach: tectonics, neotectonics, seismology, geology, hydrogeology, geomorphology and soil/rock mechanics have been considered. The research aims to verify, in areas affected by earthquake-triggered landslides, a methodology for the identification of potentially unstable areas. The research was organized into regional- and local-scale studies. In order to better emphasise the complexity of the relationships among all the parameters affecting the stability conditions of rock slopes under static and dynamic conditions, a new integrated approach, Rock Engineering Systems (RES), was applied in the Northern Apennines. In the paper, the different phases of the research are described in detail and an example of the application of the RES method in a sample area is reported. A significant aspect of the study is its attempt to overcome the exclusively qualitative aspects of research into the relationship between earthquakes and induced surface effects, and to advance the idea of beginning a process by which this interaction can be quantified.
Relations between some horizontal‐component ground‐motion intensity measures used in practice
Boore, David; Kishida, Tadahiro
2017-01-01
Various measures using the two horizontal components of recorded ground motions have been used in a number of studies that derive ground‐motion prediction equations and construct maps of shaking intensity. We update relations between a number of these measures, including those in Boore et al. (2006) and Boore (2010), using the large and carefully constructed global database of ground motions from crustal earthquakes in active tectonic regions developed as part of the Pacific Earthquake Engineering Research Center–Next Generation Attenuation‐West2 project. The ratios from the expanded datasets generally agree to within a few percent of the previously published ratios. We also provide some ratios that were not considered before, some of which will be useful in applications such as constructing ShakeMaps. Finally, we compare two important ratios with those from a large central and eastern North American database and from many records from subduction earthquakes in Japan and Taiwan. In general, the ratios from these regions are within several percent of those from crustal earthquakes in active tectonic regions.
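For readers unfamiliar with these horizontal-component measures, two common ones can be sketched as follows with synthetic traces. The as-recorded geometric mean takes the two components as installed, while RotD50 is the median, over all rotation angles, of the peak amplitude of the rotated component; the function names and traces here are for illustration only, and the paper's ratios come from recorded data.

```python
import numpy as np

# Two horizontal-component ground-motion measures (illustrative sketch).
def geometric_mean_peak(h1, h2):
    """Geometric mean of the two as-recorded component peaks."""
    return np.sqrt(np.max(np.abs(h1)) * np.max(np.abs(h2)))

def rotd50_peak(h1, h2):
    """Median over non-redundant rotation angles of the peak amplitude
    of the rotated horizontal component."""
    angles = np.radians(np.arange(0, 180))
    peaks = [np.max(np.abs(h1 * np.cos(a) + h2 * np.sin(a))) for a in angles]
    return np.median(peaks)

# Synthetic two-component record: unequal amplitudes, different frequencies.
t = np.linspace(0.0, 10.0, 2000)
h1 = np.sin(2.0 * np.pi * 1.0 * t)
h2 = 0.7 * np.sin(2.0 * np.pi * 1.3 * t)
ratio = rotd50_peak(h1, h2) / geometric_mean_peak(h1, h2)
print(round(ratio, 2))
```

Because the rotated component can capture shaking that neither as-recorded axis saw at full amplitude, RotD50 typically exceeds the geometric mean here, which is the kind of systematic ratio the paper tabulates.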
Economic consequences of earthquakes: bridging research and practice with HayWired
NASA Astrophysics Data System (ADS)
Wein, A. M.; Kroll, C.
2016-12-01
The U.S. Geological Survey partners with organizations and experts to develop multiple hazard scenarios. The HayWired earthquake scenario refers to a rupture of the Hayward fault in the Bay Area of California and addresses the potential chaos related to interconnectedness at many levels: the fault afterslip and aftershocks, interdependencies of lifelines, wired/wireless technology, communities at risk, and ripple effects throughout today's digital economy. The scenario is intended for diverse audiences. HayWired analyses translate earthquake hazards (surface rupture, ground shaking, liquefaction, landslides) into physical engineering and environmental health impacts, and into societal consequences. Damages to life and property and lifeline service disruptions are direct causes of business interruption. Economic models are used to estimate the economic impacts and resilience in the regional economy. The objective of the economic analysis is to inform policy discourse about economic resilience at all three levels of the economy: macro, meso, and micro. Stakeholders include businesses, economic development, and community leaders. Previous scenario analyses indicate the size of an event: large earthquakes and large winter storms are both "big ones" for California. They motivate actions to reduce the losses from fire following earthquake and water supply outages. They show the effect that resilience can have on reducing economic losses. Evaluators find that stakeholders learned the most about the economic consequences.
Road Damage Following Earthquake
NASA Technical Reports Server (NTRS)
1989-01-01
Ground shaking triggered liquefaction in a subsurface layer of water-saturated sand, producing differential lateral and vertical movement in an overlying carapace of unliquefied sand and silt, which moved from right to left towards the Pajaro River. This mode of ground failure, termed lateral spreading, was a principal cause of liquefaction-related damage in the Oct. 17, 1989, Loma Prieta earthquake. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions, such as earthquakes, or when powders are handled in industrial processes. Mechanics of Granular Materials (MGM) experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grain materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. Credit: S.D. Ellen, U.S. Geological Survey
Yehle, Lynn A.
1977-01-01
A program to study the engineering geology of most larger Alaska coastal communities and to evaluate their earthquake and other geologic hazards was started following the 1964 Alaska earthquake; this report about the Metlakatla area, Annette Island, is a product of that program. Field-study methods were of a reconnaissance nature, and thus the interpretations in the report are tentative. The landscape of the Metlakatla Peninsula, on which the city of Metlakatla is located, is characterized by a muskeg-covered terrane of very low relief. In contrast, most of the rest of Annette Island is composed of mountainous terrane with steep valleys and numerous lakes. During the Pleistocene Epoch the Metlakatla area was presumably covered by ice several times; glaciers smoothed the present Metlakatla Peninsula and deeply eroded valleys on the rest of Annette Island. The last major deglaciation was completed probably before 10,000 years ago. Rebound of the earth's crust, believed to be related to glacial melting, has caused land emergence at Metlakatla of at least 50 ft (15 m) and probably more than 200 ft (61 m) relative to present sea level. Bedrock in the Metlakatla area is composed chiefly of hard metamorphic rocks: greenschist and greenstone with minor hornfels and schist. Strike and dip of beds are generally variable, and minor offsets are common. Bedrock is of late Paleozoic to early Mesozoic age. Six types of surficial geologic materials of Quaternary age were recognized: firm diamicton, emerged shore deposits, modern shore and delta deposits, alluvial deposits, very soft muskeg and other organic deposits, and firm to soft artificial fill. A combination map unit is composed of bedrock or diamicton. Geologic structure in southeastern Alaska is complex because, since at least early Paleozoic time, there have been several cycles of tectonic deformation that affected different parts of the region. 
Southeastern Alaska is transected by numerous faults and possible faults that attest to major movements of the earth's crust. The latest of the major tectonic events in the Metlakatla region occurred in middle Tertiary time; some minor fault activity probably continues today at depth. Along the outer coast of southeastern Alaska and British Columbia, major faulting activity occurs in the form of active, strike-slip movement along the Queen Charlotte fault about 100 mi (160 km) west-southwest of Metlakatla. Some branching subsidiary faults also may be active, at least one of which may be the Sandspit fault. Many major and smaller earthquakes occur along the outer coast. These shocks are related to movements along the Queen Charlotte fault. A few small earthquakes occur in the region between the outer coast and the Coast Mountains, which includes Metlakatla. Only a few earthquakes have been reported as felt at Metlakatla; these shocks and others felt in the region are tabulated. Historically, the closest major earthquake was the magnitude 8.1 Queen Charlotte Islands earthquake of August 22, 1949, which occurred along the Queen Charlotte fault 125 mi (200 km) southwest of Metlakatla. No damage was reported at Metlakatla. The probability of destructive earthquakes affecting Metlakatla is unknown. A consideration of the tectonics and earthquake history of the region, however, suggests that sometime in the future an earthquake with a magnitude of about 8 will occur along that segment of the Queen Charlotte fault nearest to Metlakatla. Smaller earthquakes with magnitudes of 6 or more might occur elsewhere in the Metlakatla region or south-southeastward near Dixon Entrance or Hecate Strait. Several geologic effects that have characterized large earthquakes elsewhere may be expected to accompany some of the possible major earthquakes that might affect the Metlakatla area in the future. 
Evaluation of effects indicates that fault displacement and tectonic uplift or subsidence are probably unlikely, and ground shaking in general probably would be strongest
Gas and Dust Phenomena of Mega-earthquakes and the Cause
NASA Astrophysics Data System (ADS)
Yue, Z.
2013-12-01
A mega-earthquake suddenly releases a large to extremely large amount of kinetic energy within a few tens to two hundred seconds, over distances of tens to hundreds of kilometers in the Earth's crust and on the ground surface. It also generates seismic waves that can be recorded globally, and co-seismic ground damage such as surface ruptures and landslides. However, such vast, dramatic and devastating kinetic actions in the Earth's crustal rocks and ground soils cannot be known or predicted a few weeks, days, hours, or minutes before they happen. Although seismologists can develop and use seismometers to report the locations and magnitudes of earthquakes within minutes of their occurrence, they cannot yet predict earthquakes. Therefore, damaging earthquakes have caused, and will continue to cause, huge disasters, fatalities and injuries. This problem may indicate that it is necessary to re-examine the cause of mega-earthquakes beyond the conventional explanation of elastic rebound on active faults. In the last ten years, many mega-earthquakes occurred in China and around the Pacific Ocean, causing many casualties and devastating damage to the environment. The author will give a brief review of the impacts of the mega-earthquakes of recent years. He will then present many gas- and dust-related phenomena associated with the sudden occurrences of these mega-earthquakes. They include the 2001 Kunlunshan Earthquake M8.1, the 2008 Wenchuan Earthquake M8.0 and the 2010 Yushu Earthquake M7.1 in China, and the 2010 Haiti Earthquake M7.0, the 2010 Mexicali Earthquake M7.2, the 2010 Chile Earthquake M8.8, the 2011 Christchurch Earthquake M6.3 and the 2011 Japan Earthquake M9.0 around the Pacific Ocean. He will discuss the cause of these gas- and dust-related phenomena. 
He will use these phenomena and their common cause to argue that the earthquakes were caused by the rapid migration and expansion of highly compressed, dense natural (methane) gas that suddenly escaped from deep crustal traps along deep fault zones. References: Yue, Z.Q., 2009. The source of energy power directly causing the May 12 Wenchuan Earthquake: huge extremely pressurized natural gases trapped in deep Longmen Shan faults. News Journal of China Society of Rock Mechanics and Engineering, 86 (2009(2)): 45-50. Yue, Z.Q., 2010. Features and mechanism of coseismic surface ruptures by Wenchuan Earthquake. In: Rock Stress and Earthquake, edited by Furen Xie, Taylor & Francis Group, London, ISBN 978-0-415-60165-8, 761-768. Yue, Z.Q., 2013a. Natural gas eruption mechanism for earthquake landslides: illustrated with comparison between Donghekou and Papandayan rockslide-debris flows. In: Earthquake-Induced Landslides, K. Ugai et al. (eds.), Springer-Verlag, Berlin, Chapter 51: 485-494. Yue, Z.Q., 2013b. On incorrectness in elastic rebound theory for cause of earthquakes. Paper No. S20-003, Session S20, Proceedings of the 13th International Conference on Fracture, June 16-21, Beijing. Yue, Z.Q., 2013c. On nature of earthquakes with cause of compressed methane gas expansion and migration in crustal rocks. In: Proceedings of the Fifth Biot Conference on Poromechanics in Memory of Karl von Terzaghi (1883-1963), July 10-12, Vienna, edited by C. Hellmich et al., ASCE, 507-516.
On civil engineering disasters and their mitigation
NASA Astrophysics Data System (ADS)
Xie, Lili; Qu, Zhe
2018-01-01
Civil engineering works such as buildings and infrastructure are the carriers of human civilization. They are, however, also the origins of various types of disasters, which are referred to in this paper as civil engineering disasters. This paper presents the concept of civil engineering disasters, their characteristics, classification, causes, and mitigation technologies. Civil engineering disasters are caused primarily by civil engineering defects, which are usually attributed to improper selection of construction site, hazard assessment, design and construction, occupancy, and maintenance. From this viewpoint, many so-called natural disasters such as earthquakes, strong winds, floods, landslides, and debris flows are substantially due to civil engineering defects rather than the actual natural hazards. Civil engineering disasters occur frequently and globally and are the most closely related to human beings among all disasters. This paper emphasizes that such disasters can be mitigated mainly through civil engineering measures, and outlines the related objectives and scientific and technological challenges.
Seismic risk management of non-engineered buildings
NASA Astrophysics Data System (ADS)
Winar, Setya
Earthquakes have long been feared as one of nature's most terrifying and devastating events. Although seismic codes clearly exist in countries with high seismic risk to save lives and reduce human suffering, earthquakes still continue to cause tragic events with high death tolls, particularly due to the collapse of widespread non-engineered buildings without seismic resistance in developing countries such as Indonesia. The implementation of seismic codes in non-engineered construction is the key to ensuring earthquake safety. In practice, such implementation is not simple, because it involves cross-disciplinary and cross-sectoral linkages at different levels of understanding, commitment, and skill. This fact suggests that a widely agreed framework can help to harmonise the various perspectives. Hence, this research is aimed at developing an integrated framework for guiding and monitoring seismic risk reduction of non-engineered buildings in Indonesia via a risk management method. Primarily, the proposed framework has drawn heavily on the wider literature, on three existing frameworks from around the world, and on the contributions of the various stakeholders who participated in the study. A postal questionnaire survey, selected interviews, and a workshop event constituted the primary data collection methods. To ensure a robust framework, two further workshop events, conducted in Yogyakarta City and Bengkulu City in Indonesia, tested the framework for practicality and validity and identified any required improvements. 
The data collected were analysed with the assistance of the SPSS and NVivo software programmes. This research found that the proposed framework comprises 63 pairs of characteristic indicators, complemented by (a) three important factors of effective seismic risk management of non-engineered buildings, (b) three guiding principles for sustainable dissemination to grass-roots communities, and (c) a map of agents of change. Among the 63 pairs, 19 are technical interventions and 44 are non-technical interventions. These findings contribute to the wider knowledge in the domain of the seismic risk management of non-engineered buildings, in order to: (a) provide a basis for effective political advocacy, (b) reflect the multidimensional and inter-disciplinary nature of seismic risk reduction, (c) assist a wide range of users in determining roles, responsibilities, and accountabilities, and (d) provide the basis for setting goals and targets.
Borcherdt, Roger D.
2014-01-01
Proposals are developed to update Tables 11.4-1 and 11.4-2 of Minimum Design Loads for Buildings and Other Structures, published as American Society of Civil Engineers Structural Engineering Institute standard 7-10 (ASCE/SEI 7-10). The updates are mean next generation attenuation (NGA) site coefficients inferred directly from the four NGA ground motion prediction equations used to derive the maximum considered earthquake response maps adopted in ASCE/SEI 7-10. Proposals include the recommendation to use straight-line interpolation to infer site coefficients at intermediate values of vS30 (average shear-wave velocity to 30-m depth). The NGA coefficients are shown to agree well with adopted site coefficients at low levels of input motion (0.1 g) and with those observed from the Loma Prieta earthquake. For higher levels of input motion, the majority of the adopted values are within the 95% epistemic-uncertainty limits implied by the NGA estimates, the exceptions being the mid-period site coefficient, Fv, for site class D and the short-period coefficient, Fa, for site class C, both of which are slightly less than the corresponding 95% limit. The NGA database shows that the median value of 913 m/s for site class B is more typical than 760 m/s as a value to characterize firm to hard rock sites as the uniform ground condition for future maximum considered earthquake response ground motion estimates. Future updates of NGA ground motion prediction equations can be incorporated easily into future adjustments of adopted site coefficients using the procedures presented herein.
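The recommended straight-line interpolation of site coefficients at intermediate vS30 values can be sketched as below. The vS30 nodes follow the familiar NEHRP site-class boundaries, but the Fa numbers attached to them are illustrative placeholders, not the actual ASCE/SEI 7-10 table values.

```python
import numpy as np

# vS30 (m/s) at NEHRP site-class boundaries, and the short-period site
# coefficient Fa assigned at each node. The Fa values are hypothetical
# stand-ins for the tabulated coefficients discussed in the abstract.
vs30_nodes = np.array([180.0, 360.0, 760.0, 1500.0])  # E/D, D/C, C/B, B/A
fa_nodes   = np.array([2.5,   1.6,   1.2,   0.8])     # illustrative only

def fa_interpolated(vs30):
    """Straight-line interpolation of Fa at an intermediate vS30,
    per the interpolation recommendation described above."""
    return float(np.interp(vs30, vs30_nodes, fa_nodes))

print(fa_interpolated(500.0))  # a site between the D/C and C/B nodes
```

Interpolating on vS30 directly, rather than snapping to a discrete site class, avoids step changes in design coefficients at class boundaries.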
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1996-05-01
The identification of seismic sources is often based on a combination of geologic and tectonic considerations and patterns of observed seismicity; hence, a historical earthquake catalogue is important. A historical catalogue of earthquakes of approximate magnitude (M) 2.5 and greater for the time period 1850 through 1992 was compiled for the INEL region. The primary data source used was the Decade of North American Geology (DNAG) catalogue for the time period from about 1800 through 1985 (Engdahl and Rinehart, 1988). A large number of felt earthquakes, especially prior to the 1970's, which were below the threshold of completeness established in the DNAG catalogue (Engdahl and Rinehart, 1991), were taken from the state catalogues compiled by Stover and colleagues at the National Earthquake Information Center (NEIC) and combined with the DNAG catalogue for the INEL region. The state catalogues were those of Idaho, Montana, Nevada, Utah, and Wyoming. NEIC's Preliminary Determination of Epicenters (PDE) and the state catalogues compiled by the Oregon Department of Geology and Mineral Industries (DOGAMI) and the University of Nevada at Reno (UNR) were also used to supplement the pre-1986 time period. A few events reanalyzed by Jim Zollweg (Boise State University, written communication, 1994) were also modified in the catalogue. In the case of duplicate events, the DNAG entry was preferred over the Stover et al. entry for the period 1850 through 1985. A few events from Berg and Baker (1963) were also added to the catalogue. This information was and will be used in determining the seismic risk of buildings and facilities located at the Idaho National Engineering Laboratory.
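The duplicate-resolution rule described above (prefer the DNAG entry when an event appears in both DNAG and a state catalogue) can be sketched as a simple merge. The matching key here, date plus rounded coordinates, is a stand-in for illustration; real catalogue merging uses time and distance windows, and the sample events are invented.

```python
# Merge two earthquake catalogues, letting DNAG entries override state
# (Stover et al.) entries when the same event appears in both.
def merge_catalogues(dnag, state):
    def key(ev):
        # Hypothetical duplicate key: same date and ~0.1-degree location
        return (ev["date"], round(ev["lat"], 1), round(ev["lon"], 1))
    merged = {key(ev): ev for ev in state}       # state entries first...
    merged.update({key(ev): ev for ev in dnag})  # ...DNAG overrides duplicates
    return sorted(merged.values(), key=lambda ev: ev["date"])

# Invented sample events (the 1959 pair is a deliberate duplicate)
dnag = [{"date": "1959-08-18", "lat": 44.8, "lon": -111.1, "mag": 7.3, "src": "DNAG"}]
state = [
    {"date": "1959-08-18", "lat": 44.83, "lon": -111.14, "mag": 7.1, "src": "MT"},
    {"date": "1905-11-11", "lat": 42.9, "lon": -114.5, "mag": 5.5, "src": "ID"},
]
cat = merge_catalogues(dnag, state)
print([ev["src"] for ev in cat])
```

The dict-update order encodes the preference: whichever catalogue is written last wins on key collisions.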
Bozorgnia, Yousef; Abrahamson, Norman A.; Al Atik, Linda; Ancheta, Timothy D.; Atkinson, Gail M.; Baker, Jack W.; Baltay, Annemarie S.; Boore, David M.; Campbell, Kenneth W.; Chiou, Brian S.J.; Darragh, Robert B.; Day, Steve; Donahue, Jennifer; Graves, Robert W.; Gregor, Nick; Hanks, Thomas C.; Idriss, I. M.; Kamai, Ronnie; Kishida, Tadahiro; Kottke, Albert; Mahin, Stephen A.; Rezaeian, Sanaz; Rowshandel, Badie; Seyhan, Emel; Shahi, Shrey; Shantz, Tom; Silva, Walter; Spudich, Paul A.; Stewart, Jonathan P.; Watson-Lamprey, Jennie; Wooddell, Kathryn; Youngs, Robert
2014-01-01
The NGA-West2 project is a large multidisciplinary, multi-year research program on the Next Generation Attenuation (NGA) models for shallow crustal earthquakes in active tectonic regions. The research project has been coordinated by the Pacific Earthquake Engineering Research Center (PEER), with extensive technical interactions among many individuals and organizations. NGA-West2 addresses several key issues in ground-motion seismic hazard, including updating the NGA database for a magnitude range of 3.0–7.9; updating NGA ground-motion prediction equations (GMPEs) for the “average” horizontal component; scaling response spectra for damping values other than 5%; quantifying the effects of directivity and directionality for horizontal ground motion; resolving discrepancies between the NGA and the National Earthquake Hazards Reduction Program (NEHRP) site amplification factors; analysis of epistemic uncertainty for NGA GMPEs; and developing GMPEs for vertical ground motion. This paper presents an overview of the NGA-West2 research program and its subprojects.
Multicriteria decision model for retrofitting existing buildings
NASA Astrophysics Data System (ADS)
Bostenaru Dan, B.
2003-04-01
In this paper a model to decide which buildings in an urban area should be retrofitted is presented. The model is positioned among existing ones by choosing the decision rule, criterion weighting, and decision support system types most suitable for the spatial problem of reducing earthquake risk in urban areas, considering existing spatial multiattribute and multiobjective decision methods and especially collaborative issues. Because of the participative character of the group decision problem "retrofitting existing buildings", the decision-making model is based on interactivity. Buildings have been modeled following the criteria of spatial decision support systems. This includes identifying the corresponding spatial elements of buildings according to the information needs of actors from different spheres, such as architects, construction engineers, and economists. The decision model aims to facilitate collaboration between these actors. The way of setting priorities interactively is shown by detailing the two phases, judgemental and computational: in this case, site analysis, collection and evaluation of the unmodified data, and converting survey data to information with computational methods using additional expert support. Buildings have been divided into spatial elements which are characteristic for the survey, present typical damage in case of an earthquake, and are decisive for better seismic behaviour when retrofitted. The paper describes the architectural and engineering characteristics, as well as the structural damage, for constructions of different building ages using the example of building types in Bucharest, Romania, in comprehensive and interdependent charts, based on field observation, reports from the 1977 earthquake, and detailed studies made by the author together with a local engineer for the EERI Web Housing Encyclopedia. On this basis, criteria for setting priorities flow into the expert information contained in the system.
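A multicriteria priority setting of the kind described above can be reduced, in its simplest form, to a weighted-sum score per building. The criteria names, weights, and scores below are invented for illustration; the actual model derives weights interactively from the participating actors.

```python
# Minimal weighted-sum sketch of multicriteria retrofit prioritization.
# Criteria, weights, and scores are hypothetical placeholders.
def retrofit_priority(buildings, weights):
    def score(b):
        # Weighted sum over normalized criterion scores in [0, 1]
        return sum(weights[c] * b["scores"][c] for c in weights)
    return sorted(buildings, key=score, reverse=True)

weights = {"structural_damage": 0.5, "occupancy": 0.3, "retrofit_cost": 0.2}
buildings = [
    {"name": "interwar block",
     "scores": {"structural_damage": 0.9, "occupancy": 0.8, "retrofit_cost": 0.4}},
    {"name": "1980s frame",
     "scores": {"structural_damage": 0.3, "occupancy": 0.6, "retrofit_cost": 0.7}},
]
ranking = retrofit_priority(buildings, weights)
print([b["name"] for b in ranking])
```

In an interactive group setting, stakeholders would renegotiate the weights and re-rank, which is exactly where the collaborative decision support system earns its keep.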
A risk-mitigation approach to the management of induced seismicity.
Bommer, Julian J; Crowley, Helen; Pinho, Rui
2015-01-01
Earthquakes may be induced by a wide range of anthropogenic activities such as mining, fluid injection and extraction, and hydraulic fracturing. In recent years, the increased occurrence of induced seismicity and the impact of some of these earthquakes on the built environment have heightened both public concern and regulatory scrutiny, motivating the need for a framework for the management of induced seismicity. Efforts to develop systems to enable control of seismicity have not yet resulted in solutions that can be applied with confidence in most cases. The more rational approach proposed herein is based on applying the same risk quantification and mitigation measures that are applied to the hazard from natural seismicity. This framework allows informed decision-making regarding the conduct of anthropogenic activities that may cause earthquakes. The consequent risk, if related to non-structural damage (when re-location is not an option), can be addressed by appropriate financial compensation. If the risk poses a threat to life and limb, then it may be reduced through the application of strengthening measures in the built environment, the cost of which can be balanced against the economic benefits of the activity in question, rather than attempting to ensure that some threshold on earthquake magnitude or ground-shaking amplitude is not exceeded. However, because of the specific characteristics of induced earthquakes, which may occur in regions with little or no natural seismicity, the procedures used in standard earthquake engineering need adaptation and modification for application to induced seismicity.
The ShakeOut Earthquake Scenario - A Story That Southern Californians Are Writing
Perry, Suzanne; Cox, Dale; Jones, Lucile; Bernknopf, Richard; Goltz, James; Hudnut, Kenneth; Mileti, Dennis; Ponti, Daniel; Porter, Keith; Reichle, Michael; Seligson, Hope; Shoaf, Kimberley; Treiman, Jerry; Wein, Anne
2008-01-01
The question is not if but when southern California will be hit by a major earthquake - one so damaging that it will permanently change lives and livelihoods in the region. How severe the changes will be depends on the actions that individuals, schools, businesses, organizations, communities, and governments take to get ready. To help prepare for this event, scientists of the U.S. Geological Survey (USGS) have changed the way that earthquake scenarios are done, uniting a multidisciplinary team that spans an unprecedented number of specialties. The team includes the California Geological Survey, Southern California Earthquake Center, and nearly 200 other partners in government, academia, emergency response, and industry, working to understand the long-term impacts of an enormous earthquake on the complicated social and economic interactions that sustain southern California society. This project, the ShakeOut Scenario, has applied the best current scientific understanding to identify what can be done now to avoid an earthquake catastrophe. More information on the science behind this project will be available in The ShakeOut Scenario (USGS Open-File Report 2008-1150; http://pubs.usgs.gov/of/2008/1150/). The 'what if?' earthquake modeled in the ShakeOut Scenario is a magnitude 7.8 on the southern San Andreas Fault. Geologists selected the details of this hypothetical earthquake by considering the amount of stored strain on that part of the fault with the greatest risk of imminent rupture. From this, seismologists and computer scientists modeled the ground shaking that would occur in this earthquake. Engineers and other professionals used the shaking to produce a realistic picture of this earthquake's damage to buildings, roads, pipelines, and other infrastructure. From these damages, social scientists projected casualties, emergency response, and the impact of the scenario earthquake on southern California's economy and society. 
The earthquake, its damages, and resulting losses are one realistic outcome, deliberately not a worst-case scenario, rather one worth preparing for and mitigating against. Decades of improving the life-safety requirements in building codes have greatly reduced the risk of death in earthquakes, yet southern California's economic and social systems are still vulnerable to large-scale disruptions. Because of this, the ShakeOut Scenario earthquake would dramatically alter the nature of the southern California community. Fortunately, steps can be taken now that can change that outcome and repay any costs many times over. The ShakeOut Scenario is the first public product of the USGS Multi-Hazards Demonstration Project, created to show how hazards science can increase a community's resiliency to natural disasters through improved planning, mitigation, and response.
2011-02-16
ISS026-E-027391 (16 Feb. 2011) --- Russian cosmonaut Dmitry Kondratyev, Expedition 26 flight engineer, wearing a Russian Orlan-MK spacesuit, participates in a session of extravehicular activity (EVA) focused on the installation of two scientific experiments outside the Zvezda Service Module of the International Space Station. During the four-hour, 51-minute spacewalk, Kondratyev and Russian cosmonaut Oleg Skripochka (out of frame), flight engineer, installed a pair of earthquake and lightning sensing experiments and retrieved a pair of spacecraft material evaluation panels.
NASA Astrophysics Data System (ADS)
Vessia, Giovanna; Parise, Mario
2013-04-01
Landslide susceptibility and hazard maps are commonly developed by means of GIS (Geographic Information Systems) tools. Many products, such as DTMs (Digital Terrain Models) and geological, morphological and lithological layers (often downloadable for free and integrated within a GIS), are nowadays available on the web and ready to be used for urban planning purposes. These multiple sources of public information enable local authorities to use such products for predicting hazards within urban territories with limited investment in technological infrastructure. On the other hand, the expertise required for conducting pertinent hazard analyses is high, and rarely available at the level of the local authorities. In this respect, seismically induced landslide hazard maps at regional scale can be drawn with GIS tools according to the permanent-displacement approach derived from Newmark's sliding block method (Newmark, 1965). Some simplified assumptions are made for the occurrence of a seismic mass movement: (1) the Mohr-Coulomb criterion is used for the plastic displacement of the rigid block; (2) only downward movements are accounted for; (3) a translational sliding mechanism is assumed. Under such conditions, several expressions have been proposed for predicting permanent displacements of slopes during seismic events (Ambraseys and Menu, 1988; Luzi and Pergalani, 2000; Romeo, 2000; Jibson, 2007, among others). These formulations have been provided for different ranges of seismic magnitude and for indexes describing the seismic action, such as peak ground acceleration, peak ground velocity, Arias intensity, and damage potential. 
With respect to the resistant properties of the rock units, the critical acceleration is the relevant strength variable in every expression; it is a function of local slope, groundwater level, unit weight, shear resistance of the surficial sediments, and the assumed depth of the sliding surface. Thus, it is of paramount relevance to correctly understand and describe the dynamic behavior of the lithologies affected by the earthquake. Accordingly, we highlight some critical points in the application of the permanent-displacement formulations by considering the case study of the Santa Susana Mountains (California, USA), shaken by the Northridge earthquake in 1994. During this earthquake a large number of recordings was collected, and soon afterwards a careful inventory of the mass movements triggered by the shaking was produced, together with analysis of the related failure mechanisms. Hence, these data allow a back analysis to verify the reliability of some numerical expressions, such as those proposed by Ambraseys and Menu (1988), Romeo (2000), and Jibson (2007), with respect to the possible dynamic behavior of the lithologies affected by landslides. In this sector of California, the following lithologies crop out and were involved in shallow landslides: (1) Quaternary deposits; (2) Saugus Formation; (3) Towsley Formation; (4) Pico Formation; (5) Topanga Formation; (6) Modelo Formation; (7) Simi Conglomerate; (8) Santa Susana Formation; (9) Llajas and Chatsworth Formations. The surveys carried out after the Northridge earthquake (Harp and Jibson, 1995) and the analysis of landslide distribution (Parise and Jibson, 2000) pointed out that the strongest formations, with slopes higher than 50°, mainly suffered toppling or fall failures; thus, our hazard maps based on permanent displacements did not take into account that range of slopes. Further, areas with slopes lower than 10° were not affected by relevant mass movements. 
Thus, a limited range of slopes (between 10° and 45°) was considered in the analyses, with the depth of the sliding surface varying between 1 and 3 m, and using the resistance parameters of the involved lithologies obtained from in situ and laboratory tests performed by local practitioners. Seismically induced landslide hazard maps have been drawn using the aforementioned three expressions. The preliminary results show Quaternary deposits (including alluvium, slope wash, and terrace deposits) as the lithologies most affected by permanent displacement. Moreover, the Towsley and Modelo formations, which are stiffer than the previous rock units and consist mostly of shales, siltstones and subordinate sandstones, show high hazard values where the slopes increase. The relevant role of local slope in the extent of permanent displacement is evident where lithologies are characterized by both cohesive and frictional resistance components. Finally, a comparison among the maps produced by using the three expressions for permanent displacements is discussed. References: Ambraseys N.N. and Menu J.M. (1988) Earthquake-induced ground displacements. Earthquake Engineering and Structural Dynamics, 16: 985-1006. Harp E.L. and Jibson R.W. (1995) Inventory of landslides triggered by the 1994 Northridge, California earthquake. US Geol. Surv. Open-File Rep. 95-213, 17 pp. Jibson R. (2007) Regression models for estimating coseismic landslide displacement. Engineering Geology, 91: 209-218. Luzi L. and Pergalani F. (2000) A correlation between slope failures and accelerometric parameters: the 26 September 1997 earthquake (Umbria-Marche, Italy). Soil Dynamics and Earthquake Engineering, 20: 301-313. Newmark N.M. (1965) Effects of earthquakes on dams and embankments. Géotechnique, 15(2): 139-160. Parise M. and Jibson R.W. 
(2000) A seismic landslide susceptibility rating of geologic units based on analysis of characteristics of landslides triggered by the 17 January, 1994 Northridge, California earthquake. Engineering Geology, 58: 251-270. Romeo R. (2000) Seismically induced landslide displacements: a predictive model. Engineering Geology, 58: 337-351.
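The critical-acceleration concept used above can be made concrete with a short sketch. Assuming an infinite-slope geometry for the critical acceleration (after Newmark, 1965) and the critical-acceleration-ratio regression model from Jibson (2007), a minimal Python illustration might look as follows; the factor of safety and peak ground acceleration are hypothetical values, not data from the study:

```python
import math

def critical_acceleration(fs, slope_deg):
    """Newmark (1965) critical acceleration for a shallow sliding mass
    on an infinite slope: a_c = (FS - 1) * g * sin(alpha), returned
    here as a fraction of g."""
    return (fs - 1.0) * math.sin(math.radians(slope_deg))

def jibson2007_displacement(ac, amax):
    """Jibson (2007) critical-acceleration-ratio regression, as commonly
    quoted: log10(D_cm) = 0.215 + log10[(1 - ac/amax)^2.341 * (ac/amax)^-1.438].
    Returns the estimated Newmark displacement in cm."""
    ratio = ac / amax
    if ratio >= 1.0:   # shaking never exceeds sliding resistance
        return 0.0
    log_d = 0.215 + math.log10((1.0 - ratio) ** 2.341 * ratio ** -1.438)
    return 10.0 ** log_d

# Hypothetical example: static factor of safety 1.3 on a 30-degree slope,
# shaken at a PGA of 0.5 g.
ac = critical_acceleration(1.3, 30.0)       # 0.15 g
d = jibson2007_displacement(ac, 0.5)        # a few centimeters
print(round(ac, 3), round(d, 1))
```

The predicted displacement is then compared against a hazard threshold to classify each map cell, which is the operation behind the hazard maps described above.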
10 CFR 50.54 - Conditions of licenses.
Code of Federal Regulations, 2012 CFR
2012-01-01
...)(1) Each nuclear power plant or fuel reprocessing plant licensee subject to the quality assurance... irradiated fuel. (ff) For licensees of nuclear power plants that have implemented the earthquake engineering... of rated thermal power only if the Commission finds that the state of onsite emergency preparedness...
Report on progress at the Center for Engineering Strong Motion Data (CESMD)
Haddadi, H.; Shakal, A.; Huang, M.; Parrish, J.; Stephens, C.; Savage, William U.; Leith, William S.
2012-01-01
The CESMD now provides strong-motion records from lower magnitude (
NASA Astrophysics Data System (ADS)
2011-12-01
Jacobo Bielak, university professor of civil and environmental engineering at Carnegie Mellon University, in Pittsburgh, Pa., has been recognized as a distinguished member of the American Society of Civil Engineers, the highest recognition the organization confers. Bielak was noted as “an internationally-known researcher in the area of structural responses to earthquakes, developing sophisticated numerical simulations to pinpoint earthquake effects.” Alan Strahler, professor of geography and environment at Boston University, Boston, Mass., received a 2011 William T. Pecora Award for his achievements in Earth remote sensing. The award, presented by NASA and the U.S. Department of the Interior on 15 November, recognized Strahler for “his contributions to remote-sensing science, leadership and education, which have improved the fundamental understanding of the remote-sensing process and its applications for observing land surface properties.” The Pecora award is named for the former director of the U.S. Geological Survey and undersecretary of the Interior department, who was influential in the establishment of the Landsat satellite program.
Teamwork tools and activities within the hazard component of the Global Earthquake Model
NASA Astrophysics Data System (ADS)
Pagani, M.; Weatherill, G.; Monelli, D.; Danciu, L.
2013-05-01
The Global Earthquake Model (GEM) is a public-private partnership aimed at supporting and fostering a global community of scientists and engineers working in the fields of seismic hazard and risk assessment. In the hazard sector, in particular, GEM recognizes the importance of local ownership and leadership in the creation of seismic hazard models. For this reason, over the last few years, GEM has been promoting different activities in the context of seismic hazard analysis ranging, for example, from regional projects targeted at the creation of updated seismic hazard studies to the development of a new open-source seismic hazard and risk calculation software called OpenQuake-engine (http://globalquakemodel.org). In this communication we'll provide a tour of the various activities completed, such as the new ISC-GEM Global Instrumental Catalogue, and of currently on-going initiatives like the creation of a suite of tools for the creation of PSHA input models. Discussion, comments and criticism by the colleagues in the audience will be highly appreciated.
Bawden, Gerald W.; Bond, Sandra; Podoski, J. H.; Kreylos, O.; Kellogg, L. H.
2012-01-01
We used ground-based tripod LiDAR (T-LiDAR) to assess the stability of two engineered structures: a bridge spanning the San Andreas fault following the M6.0 Parkfield earthquake in central California, and a newly built coastal breakwater at Kaumālapa`u Harbor, Lana'i, Hawaii. In the 10 weeks following the earthquake, we found that the surface under the bridge shifted 7.1 cm, with an additional 2.6 cm of motion in the subsequent 13 weeks, which deflected the bridge's northern I-beam support 4.3 cm and 2.1 cm, respectively; the bridge's integrity remained intact. T-LiDAR imagery was collected after completion of the armored breakwater, built with 817 35-ton interlocking concrete armor units (Core-Locs®), in the summers of 2007, 2008, and 2010. We found a wide range of motion of individual Core-Locs, from a few centimeters to >110 cm along the ocean side of the breakwater, with lesser movement along the harbor side.
Dewey, James W.; Hopper, Margaret G.; Wald, David J.; Quitoriano, Vincent; Adams, Elizabeth R.
2002-01-01
We present isoseismal maps, macroseismic intensities, and community summaries of damage for the MW=6.8 Nisqually, Washington, earthquake of 28 February 2001. For many communities, two types of macroseismic intensity are assigned, the traditional U.S. Geological Survey Modified Mercalli Intensities (USGS MMI) and a type of intensity newly introduced with this paper, the USGS Reviewed Community Internet Intensity (RCII). For most communities, the RCII is a reviewed version of the Community Internet Intensity (CII) of Wald and others (1999). For some communities, RCII is assigned from such non-CII sources as press reports, engineering reports, and field reconnaissance observations. We summarize differences between procedures used to assign RCII and USGS MMI, and we show that the two types of intensity are nonetheless very similar for the Nisqually earthquake. We do not see evidence for systematic differences between RCII and USGS MMI that would approach one intensity unit, at any level of shaking, but we document a tendency for the RCII to be slightly lower than MMI in regions of low intensity and slightly higher than MMI in regions of high intensity. The highest RCII calculated for the Nisqually earthquake is 7.6, calculated for zip code 98134, which includes the "south of downtown" (Sodo) area of Seattle and Harbor Island. By comparison, we assigned a traditional USGS MMI 8 to the Sodo area of Seattle. In all, RCII of 6.5 and higher were assigned to 58 zip-code regions. At the lowest intensities, the Nisqually earthquake was felt over an area of approximately 350,000 square km (approximately 135,000 square miles) in Washington, Oregon, Idaho, Montana, and southern British Columbia, Canada. On the basis of macroseismic effects, we infer that shaking in the southern Puget Sound region was somewhat less for the 2001 Nisqually earthquake than for the Puget Sound earthquake of April 13, 1949, which had nearly the same hypocenter and magnitude. 
Allowing for differences in hypocenter, shaking in the 2001 earthquake was very similar to that produced by the Puget Sound earthquake of April 25, 1965. First-person accounts of the effects of the 2001 earthquake on individual households are given for some communities.
NASA Astrophysics Data System (ADS)
Simila, G.; McNally, K.; Quintero, R.; Segura, J.
2006-12-01
The Seismic Strong Motion Array Project (SSMAP) for the Nicoya Peninsula in northwestern Costa Rica is composed of 10-13 sites including Geotech A900/A800 accelerographs (three-component), Ref-Teks (three-component velocity), and Kinemetrics Episensors. The main objectives of the array are to: 1) record and locate strong subduction zone mainshocks [and foreshocks, "early aftershocks", and preshocks] in the Nicoya Peninsula, at the entrance of the Nicoya Gulf, and in the Papagayo Gulf regions of Costa Rica, and 2) record and locate any moderate to strong upper-plate earthquakes triggered by a large subduction zone earthquake in the above regions. Our digital accelerograph array has been deployed as part of our ongoing research on large earthquakes in conjunction with the Earthquake and Volcano Observatory (OVSICORI) at the Universidad Nacional in Costa Rica. The country-wide seismographic network has been operating continuously since the 1980s, with the first earthquake bulletin published more than 20 years ago, in 1984. The recording of seismicity and strong-motion data for large earthquakes along the Middle America Trench (MAT) has been a major research priority over these years, and this network now spans nearly half of a "repeat cycle" (50 years) for large (Ms 7.5-7.7) earthquakes beneath the Nicoya Peninsula, the last such event occurring in 1950. Our long-time co-collaborators include the seismology group at OVSICORI, with coordination for this project by Dr. Ronnie Quintero and Mr. Juan Segura. Numerous international investigators are also studying this region with GPS and seismic stations (US, Japan, Germany, Switzerland, etc.). In addition, various strong-motion instruments are operated by local engineers for building purposes, concentrated mainly in the population centers of the Central Valley. 
The major goal of our project is to contribute unique scientific information pertaining to a large subduction zone earthquake and its related seismic activity when the next large earthquake occurs in Nicoya. A centralized database will be created within the main seismic network files at OVSICORI, with local personnel working in teams responsible for collecting data within 3 days of a large mainshock.
NASA Astrophysics Data System (ADS)
Ree, J. H.; Kim, S.; Yoon, H. S.; Choi, B. K.; Park, P. H.
2017-12-01
The GPS-determined pre-, co-, and post-seismic crustal deformations of the Korean peninsula associated with the 2011 Tohoku-Oki earthquake (Baek et al., 2012, Terra Nova; Kim et al., 2015, KSCE Journal of Civil Engineering) are all stretching (extensional; the horizontal stretching rate is larger than the horizontal shortening rate). However, focal mechanism solutions of earthquakes indicate that South Korea is in a compressional regime dominated by strike-slip and reverse-slip faulting. We reevaluated the velocity field of the GPS data to assess any effect of the Tohoku-Oki earthquake on Korean crustal deformation and seismicity. To calculate the velocity gradient tensors at the GPS sites, we used a gridding method based on least-square collocation (LSC). The LSC method overcomes shortcomings of segmentation methods, including the triangulation method; for example, in segmentation methods an undesirable, abrupt change in the components of the velocity field occurs at segment boundaries. The LSC method is also known to be more useful for evaluating deformation patterns in intraplate areas with relatively small displacements. Velocity vectors in South Korea, pointing on average toward 113° before the Tohoku-Oki earthquake, instantly rotated toward the epicenter (82° on average) during the earthquake, and then gradually returned to their original orientation about two years afterwards. Our calculation of velocity gradient tensors after the Tohoku-Oki earthquake shows that the stretching and rotation fields are quite heterogeneous, and that both stretching and shortening areas exist in South Korea. In particular, after post-seismic relaxation ceased (i.e., from two years after the earthquake), regions with thicker and thinner crust tend to be shortening and stretching, respectively. Furthermore, the strain rate is larger in the regions with thinner crust. 
Although there is currently no meaningful correlation between seismicity and the crustal straining pattern of South Korea, seismicity tends to be localized along boundaries between areas of opposite vorticity, particularly in the velocity field for the year following the Tohoku-Oki earthquake.
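The stretching and vorticity fields discussed above derive from the symmetric and antisymmetric parts of the velocity gradient tensor. A minimal sketch of that decomposition, using a hypothetical 2-D gradient in place of the LSC-derived field:

```python
import numpy as np

# Hypothetical 2-D velocity gradient tensor L[i, j] = dv_i/dx_j
# (units: 1/yr); in the study it would come from least-square
# collocation of GPS velocities, which is not reproduced here.
L = np.array([[ 2.0e-8,  5.0e-8],
              [-1.0e-8, -3.0e-8]])

strain_rate = 0.5 * (L + L.T)      # symmetric part: stretching/shortening
spin        = 0.5 * (L - L.T)      # antisymmetric part: rigid rotation
vorticity   = L[1, 0] - L[0, 1]    # scalar curl in 2-D

# Principal strain rates: the positive eigenvalue gives the stretching
# axis, the negative one the shortening axis.
principal = np.linalg.eigvalsh(strain_rate)
print(principal, vorticity)
```

Mapping the sign pattern of `principal` and `vorticity` over a grid is what distinguishes the stretching, shortening, and opposite-vorticity domains described in the abstract.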
NASA Astrophysics Data System (ADS)
Li, X.; Gao, M.
2017-12-01
The magnitude of an earthquake is one of its basic parameters and is a measure of its scale. It plays a significant role in seismology and earthquake engineering research, particularly in the calculations of the seismic rate and b value in earthquake prediction and seismic hazard analysis. However, several magnitude types currently used in seismology research, such as local magnitude (ML), surface-wave magnitude (MS), and body-wave magnitude (MB), share a common limitation: the magnitude saturation phenomenon. Fortunately, the problem of magnitude saturation was solved by a formula for calculating the moment magnitude (MW) from the seismic moment, which describes the seismic source strength, and the moment magnitude is now very commonly used in seismology research. In China, however, the earthquake scale is primarily based on local and surface-wave magnitudes. In the present work, we studied the empirical relationships between moment magnitude (MW) and local magnitude (ML), as well as surface-wave magnitude (MS), in the Chinese mainland. The China Earthquake Networks Center (CENC) ML catalog, China Seismograph Network (CSN) MS catalog, ANSS Comprehensive Earthquake Catalog (ComCat), and Global Centroid Moment Tensor (GCMT) catalog are adopted to regress the relationships using the orthogonal regression method. The obtained relationships are as follows: MW=0.64+0.87MS; MW=1.16+0.75ML. Therefore, in China, if the moment magnitude of an earthquake is not reported by any agency in the world, we can use the equations above for converting ML to MW and MS to MW. These relationships are very important, because they will allow the China earthquake catalogs to be used more effectively for seismic hazard analysis, earthquake prediction, and other seismology research. We also computed the relationships of logMo-ML and logMo-MS (where Mo is the seismic moment) by linear regression using the Global Centroid Moment Tensor catalog. 
The obtained relationships are as follows: logMo=18.21+1.05ML; logMo=17.04+1.32MS. These formulas can be used by seismologists to convert the ML or MS of Chinese mainland events into seismic moments.
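The reported regressions are simple linear conversions and can be wrapped as small helper functions; a sketch using the coefficients quoted above:

```python
def mw_from_ms(ms):
    """MW from surface-wave magnitude, per the orthogonal regression
    reported above: MW = 0.64 + 0.87 * MS."""
    return 0.64 + 0.87 * ms

def mw_from_ml(ml):
    """MW from local magnitude: MW = 1.16 + 0.75 * ML."""
    return 1.16 + 0.75 * ml

def log_mo_from_ml(ml):
    """log10 of seismic moment: log10(Mo) = 18.21 + 1.05 * ML."""
    return 18.21 + 1.05 * ml

def log_mo_from_ms(ms):
    """log10 of seismic moment: log10(Mo) = 17.04 + 1.32 * MS."""
    return 17.04 + 1.32 * ms

print(mw_from_ms(6.0))   # 0.64 + 0.87 * 6.0 = 5.86
```

Applying `mw_from_ml`/`mw_from_ms` to a whole catalog column is the intended use case: homogenizing ML and MS entries to MW before computing b values or hazard rates.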
NASA Astrophysics Data System (ADS)
Haddad, David Elias
Earth's topographic surface forms an interface across which the geodynamic and geomorphic engines interact. This interaction is best observed along crustal margins where topography is created by active faulting and sculpted by geomorphic processes. Crustal deformation manifests as earthquakes at centennial to millennial timescales. Given that nearly half of Earth's human population lives along active fault zones, a quantitative understanding of the mechanics of earthquakes and faulting is necessary to build accurate earthquake forecasts. My research relies on the quantitative documentation of the geomorphic expression of large earthquakes and the physical processes that control their spatiotemporal distributions. The first part of my research uses high-resolution topographic lidar data to quantitatively document the geomorphic expression of historic and prehistoric large earthquakes. Lidar data allow for enhanced visualization and reconstruction of structures and stratigraphy exposed by paleoseismic trenches. Lidar surveys of fault scarps formed by the 1992 Landers earthquake document the centimeter-scale erosional landforms developed by repeated winter storm-driven erosion. The second part of my research employs a quasi-static numerical earthquake simulator to explore the effects of fault roughness, friction, and structural complexities on earthquake-generated deformation. My experiments show that fault roughness plays a critical role in determining fault-to-fault rupture jumping probabilities. These results corroborate the accepted 3-5 km rupture jumping distance for smooth faults. However, my simulations show that the rupture jumping threshold distance is highly variable for rough faults due to heterogeneous elastic strain energies. Furthermore, fault roughness controls spatiotemporal variations in slip rates such that rough faults exhibit lower slip rates relative to their smooth counterparts. 
The central implication of these results lies in guiding the interpretation of paleoseismically derived slip rates that are used to form earthquake forecasts. The final part of my research evaluates a set of Earth science-themed lesson plans that I designed for elementary-level learning-disabled students. My findings show that a combination of concept delivery techniques is most effective for learning-disabled students and should incorporate interactive slide presentations, tactile manipulatives, teacher-assisted concept sketches, and student-led teaching to help learning-disabled students grasp Earth science concepts.
NASA Astrophysics Data System (ADS)
Kamil, P. I.; Pratama, A. J.; Hidayatulloh, A.
2016-05-01
Social media has been part of our daily life for years, and now it has become a treasure trove of data for social scientists to mine. Using our own data mining engine we downloaded 1500 Instagram posts related to the Nepal earthquake in April 2015, a disaster which caused tremendous losses counted in human lives and infrastructure. We predicted that social media would be a place where people respond and express themselves emotionally in light of a disaster of such massive scale, a "megadeath" event. We ended up with data on 1017 posts tracked with the hashtag #prayfornepal, consisting of each post's date, time, geolocation, image, post ID, username and user ID, caption, and hashtags. We categorized the posts into 7 categories and found that most of the photos (30.29%) are related to Nepal but not directly related to the disaster, which reflects death imprint, one of the psychosocial responses after a megadeath event. Other analyses were done to compare each photo category, including geolocation, hashtag network, and caption network, visualized with ArcGIS, NodeXL, Gephi, and our own word cloud engine, to examine other digital reactions to the Nepal earthquake on Instagram. This study can give an overview of how a community reacts to a disaster in the digital world, and how to utilize this for disaster response and awareness.
Extraction of crustal deformations and oceanic fluctuations from ocean bottom pressures
NASA Astrophysics Data System (ADS)
Ariyoshi, Keisuke; Nagano, Akira; Hasegawa, Takuya; Matsumoto, Hiroyuki; Kido, Motoyuki; Igarashi, Toshihiro; Uchida, Naoki; Iinuma, Takeshi; Yamashita, Yusuke
2017-04-01
It is well known that megathrust earthquakes such as the 2004 Sumatra-Andaman Earthquake (Mw 9.1) and the 2011 off the Pacific Coast of Tohoku Earthquake (Mw 9.0) devastated coastal areas in western Indonesia and north-eastern Japan, respectively. To mitigate the disaster of forthcoming megathrust earthquakes along the Nankai Trough, the Japanese government has established seafloor networks of cable-linked observatories around Japan: DONET (Dense Oceanfloor Network system for Earthquakes and Tsunamis along the Nankai Trough) and S-net (Seafloor Observation Network for Earthquakes and Tsunamis along the Japan Trench). The advantage of a cable-linked network is that it monitors the propagation of tsunami and seismic waves, as well as seismic activity, in real time. DONET includes pressure gauges as well as seismometers, which are expected to detect crustal deformation driven by the peeling-off of interplate coupling on the subducting plate. Our simulation results show that leveling changes differ in sense among the DONET points, even within the same science node. On the other hand, oceanic fluctuations, such as those from melting ice masses under global warming, occur at such large scales that they change ocean bottom pressure coherently across all DONET points, especially within the same node. This difference suggests the possibility of extracting the crustal deformation component from ocean bottom pressure data by differencing against stacked data. However, this operation cannot be applied to local-scale fluctuations related to ocean mesoscale eddies and current fluctuations, which affect ocean bottom pressure through water-density changes in the water column (from the sea surface to the bottom). Therefore, we need an integrated analysis combining seismology, physical oceanography, and tsunami engineering to decompose the records into crustal deformation, oceanic fluctuations, and instrumental drift, which will yield data precise enough to reveal geophysical phenomena. 
In this study, we propose a new interpretation of seismic plate coupling around the Tonankai region along the Nankai Trough, and discuss how to detect it effectively using the DONET data. In the future, we must extract the crustal deformation component by separating out other components, such as instrumental drift and oceanic changes, in an integrated study combining seismology, geodesy, physical oceanography, and mechanical engineering.
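The stacking-and-differencing idea can be illustrated with synthetic data: because oceanic fluctuations are coherent across the stations of a node while tectonic leveling changes differ from point to point, subtracting the node-average (common-mode) pressure isolates the station-specific signal. The station count, trends, and noise level below are invented for illustration, not DONET values:

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(0.0, 30.0, 1.0)                 # time in days

# Synthetic bottom-pressure records (hPa) for 4 stations of one node:
# a shared oceanic fluctuation plus a station-specific tectonic trend.
ocean = 2.0 * np.sin(2 * np.pi * t / 14.0)    # coherent across the node
trends = np.array([0.00, 0.02, -0.01, 0.05])  # hPa/day, differs by site
records = ocean + trends[:, None] * t + 0.05 * rng.standard_normal((4, t.size))

# "Differencing against the stack": remove the node-average common mode.
common_mode = records.mean(axis=0)
residual = records - common_mode

# The residual of station 3 now shows its trend relative to the node
# mean (0.05 - 0.015 = 0.035 hPa/day), with the oceanic signal removed.
rel_trend = np.polyfit(t, residual[3], 1)[0]
print(round(rel_trend, 3))
```

As the abstract notes, this only suppresses signals that are coherent across the node; mesoscale eddies that differ between stations survive the subtraction and require the integrated oceanographic analysis proposed there.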
Borcherdt, Rodger D.; Johnston, Malcolm J.S.; Dietel, Christopher; Glassmoyer, Gary; Myren, Doug; Stephens, Christopher
2004-01-01
An integrated array of 11 General Earthquake Observation System (GEOS) stations installed near Parkfield, CA provided on-scale, broad-band, wide-dynamic-range measurements of acceleration and volumetric strain for the Parkfield earthquake (M 6.0) of September 28, 2004. Three-component measurements of acceleration were obtained at each of the stations. Collocated measurements of acceleration and volumetric strain were obtained at four of the stations. Measurements of velocity at most sites were on scale only for the initial P-wave arrival. When considered in the context of the extensive set of strong-motion recordings obtained on more than 40 analog stations by the California Strong-Motion Instrumentation Program (Shakal et al., 2004; http://www.quake.ca.gov/cisn-edc) and those on the dense array of Spudich et al. (1988), these recordings provide an unprecedented document of the nature of the near-source strong motion generated by a M 6.0 earthquake. The data set reported herein provides the most extensive set of near-field, broad-band, wide-dynamic-range measurements of acceleration and volumetric strain for an earthquake as large as M 6 of which the authors are aware. As a result, considerable interest has been expressed in these data. This report is intended to describe the data and facilitate their use to resolve a number of scientific and engineering questions concerning earthquake rupture processes and the resultant near-field motions and strains. The report provides a description of the array, its scientific objectives, and the strong-motion recordings obtained of the main shock. It includes copies of the uncorrected and corrected data, along with the inferred velocities, displacements, and pseudo-velocity response spectra. 
Digital versions of these recordings are accessible with information available through the internet at several locations: the National Strong-Motion Program web site (http://agram.wr.usgs.gov/), the COSMOS Virtual Data Center Web site (http://www.cosmos-eq.org), and the CISN Engineering and Berkeley data centers (http://www.quake.ca.gov/cisn-edc). They are also accessible together with recordings on the GEOS Strong-motion Array near Parkfield, CA since its installation in 1987 through the USGS GEOS web site ( http://nsmp.wr.usgs.gov/GEOS).
USGS Training in Afghanistan: Modern Earthquake Hazards Assessments
NASA Astrophysics Data System (ADS)
Medlin, J. D.; Garthwaite, M.; Holzer, T.; McGarr, A.; Bohannon, R.; Bergen, K.; Vincent, T.
2007-05-01
Afghanistan is located in a tectonically active region where ongoing deformation has generated rugged mountainous terrain, and where large earthquakes occur frequently. These earthquakes can present a significant hazard, not only from strong ground shaking but also from liquefaction and extensive landsliding. The magnitude 6.1 earthquake of March 25, 2002 highlighted the vulnerability of Afghanistan to such hazards, and resulted in over 1000 fatalities. The USGS has provided the first of a series of Earth science training courses to the Afghan Geological Survey (AGS). This course was concerned with modern earthquake hazard assessments and is an integral part of a larger USGS effort to provide a comprehensive seismic-hazard assessment for Afghanistan. Funding for these courses is provided by the US Agency for International Development Afghanistan Reconstruction Program. The particular focus of this training course, held December 2-6, 2006 in Kabul, was on providing a background in the seismological and geological methods relevant to preparing for future earthquakes. Topics included identifying active faults, modern tectonic theory, geotechnical measurements of near-surface materials, and strong-motion seismology. With this background, participants may now be expected to educate other members of the community and be actively involved in earthquake hazard assessments themselves. The December 2006 training course was taught by four lecturers, with all lectures and slides presented in English and translated into Dari. Copies of the lectures were provided to the students in both hardcopy and digital formats. Class participants included many of the section leaders from within the AGS, who have backgrounds in geology, geophysics, and engineering. Two additional training sessions are planned for 2007, the first entitled "Modern Concepts in Geology and Mineral Resource Assessments," and the second entitled "Applied Geophysics for Mineral Resource Assessments."
The 7.9 Denali Fault Earthquake: Damage to Structures and Lifelines
NASA Astrophysics Data System (ADS)
Cox, T.; Hreinsdöttir, S.; Larsen, C.; Estes, S.
2002-12-01
In the early afternoon of Sunday, November 3rd, the residents of many interior Alaska towns were shaken up by a magnitude 7.9 earthquake. The shaking lasted an average of three minutes, and when it stopped, nearly 300 km of the Denali Fault had ruptured. In the hours that followed, the Alaska Earthquake Information Center (AEIC) fielded reports of structural damage from Cantwell to Tok, and of other earthquake effects as far away as Louisiana. Upon investigation, the most severe effects were found in the village of Mentasta, where basic utilities were interrupted and the school and several houses suffered major damage. Almost 3000 reports submitted to a community internet intensity map show a maximum Mercalli intensity of VIII along the eastern end of the rupture area. The Richardson and Parks Highways, two main north-south thoroughfares in Alaska, both buckled and split as a result of the fault rupture; traffic was stopped for a few hours while repairs were made. The maximum offsets on the Denali Fault were observed between the Richardson Highway and the Tok Cutoff, a section of the Glenn Highway that connects Tok and Glennallen. Designed to withstand a magnitude 8.5 earthquake at the Denali Fault crossing, the 800-mile-long Trans-Alaska Pipeline suffered relatively minor damage. According to Alyeska Pipeline Service Company press releases, the pipeline was shut down shortly after the earthquake occurred. Repairs to pipeline supports and engineering evaluations began immediately thereafter, and oil began flowing through the pipeline on Thursday, November 7th. Through it all, the AEIC has collected and archived many photographs, emails, and eyewitness accounts of those who experienced the destruction firsthand. We will detail the effects that the M7.9 Denali Fault earthquake had from near and far.
The SCEC Broadband Platform: Open-Source Software for Strong Ground Motion Simulation and Validation
NASA Astrophysics Data System (ADS)
Silva, F.; Goulet, C. A.; Maechling, P. J.; Callaghan, S.; Jordan, T. H.
2016-12-01
The Southern California Earthquake Center (SCEC) Broadband Platform (BBP) is a carefully integrated collection of open-source scientific software programs that can simulate broadband (0-100 Hz) ground motions for earthquakes at regional scales. The BBP can run earthquake rupture and wave propagation modeling software to simulate ground motions for well-observed historical earthquakes and to quantify how well the simulated broadband seismograms match the observed seismograms. The BBP can also run simulations for hypothetical earthquakes. In this case, users input an earthquake location and magnitude description, a list of station locations, and a 1D velocity model for the region of interest, and the BBP software then calculates ground motions for the specified stations. The BBP scientific software modules implement kinematic rupture generation, low- and high-frequency seismogram synthesis using wave propagation through 1D layered velocity structures, several ground motion intensity measure calculations, and various ground motion goodness-of-fit tools. These modules are integrated into a software system that provides user-defined, repeatable, calculation of ground-motion seismograms, using multiple alternative ground motion simulation methods, and software utilities to generate tables, plots, and maps. The BBP has been developed over the last five years in a collaborative project involving geoscientists, earthquake engineers, graduate students, and SCEC scientific software developers. The SCEC BBP software released in 2016 can be compiled and run on recent Linux and Mac OS X systems with GNU compilers. It includes five simulation methods, seven simulation regions covering California, Japan, and Eastern North America, and the ability to compare simulation results against empirical ground motion models (aka GMPEs). The latest version includes updated ground motion simulation methods, a suite of new validation metrics and a simplified command line user interface.
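One common way to quantify how well simulated seismograms match observations, in the spirit of the goodness-of-fit tools mentioned above, is the mean natural-log residual of response spectral values across stations. The spectral accelerations below are hypothetical stand-ins for BBP outputs, not values from the platform:

```python
import numpy as np

# Hypothetical spectral accelerations (g) at a set of periods for
# several stations; in the BBP these would be computed from the
# observed and simulated seismograms.
periods = np.array([0.1, 0.5, 1.0, 3.0])
obs = np.array([[0.80, 0.45, 0.20, 0.05],
                [0.60, 0.40, 0.18, 0.06],
                [0.90, 0.50, 0.25, 0.04]])
sim = np.array([[0.70, 0.50, 0.22, 0.04],
                [0.65, 0.35, 0.15, 0.07],
                [0.85, 0.55, 0.30, 0.05]])

# Goodness of fit as the per-period mean log residual: 0 means no
# average bias; the spread (sigma) measures station-to-station scatter.
residuals = np.log(obs / sim)
bias = residuals.mean(axis=0)
sigma = residuals.std(axis=0, ddof=1)
for T, b, s in zip(periods, bias, sigma):
    print(f"T={T:.1f}s  bias={b:+.3f}  sigma={s:.3f}")
```

A positive bias at some period means the simulation underpredicts observed spectral acceleration there on average; validation exercises look for bias near zero across the period band of interest.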
Seismic waves increase permeability.
Elkhoury, Jean E; Brodsky, Emily E; Agnew, Duncan C
2006-06-29
Earthquakes have been observed to affect hydrological systems in a variety of ways: water well levels can change dramatically, streams can become fuller, and spring discharges can increase at the time of earthquakes. Distant earthquakes may even increase the permeability in faults. Most of these hydrological observations can be explained by some form of permeability increase. Here we use the response of water well levels to solid Earth tides to measure permeability over a 20-year period. At the time of each of seven earthquakes in Southern California, we observe transient changes of up to 24 degrees in the phase of the water level response to the dilatational volumetric strain of the semidiurnal tidal components of wells at the Piñon Flat Observatory in Southern California. After the earthquakes, the phase gradually returns to the background value at a rate of less than 0.1 degrees per day. We use a model of axisymmetric flow driven by an imposed head oscillation through a single, laterally extensive, confined, homogeneous and isotropic aquifer to relate the phase response to aquifer properties. We interpret the changes in phase response as due to changes in permeability. At the time of the earthquakes, the permeability at the site increases by a factor as high as three. The permeability increase depends roughly linearly on the amplitude of seismic-wave peak ground velocity in the range of 0.21-2.1 cm/s. Such permeability increases are of interest to hydrologists and oil reservoir engineers as they affect fluid flow and might determine long-term evolution of hydrological and oil-bearing systems. They may also be interesting to seismologists, as the resulting pore pressure changes can affect earthquakes by changing normal stresses on faults.
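The phase of the water-level response to tidal strain, the key observable in the study above, can be estimated by projecting both time series onto the tidal frequency and comparing the complex amplitudes. A minimal sketch with a synthetic 15-degree lag (not data from the study):

```python
import numpy as np

# Hourly samples over 30 days; the M2 tidal period is ~12.4206 h.
dt = 1.0                                   # hours
t = np.arange(0.0, 30 * 24, dt)
f_m2 = 1.0 / 12.4206                       # cycles per hour

strain = np.cos(2 * np.pi * f_m2 * t)                           # tidal strain
water = 0.6 * np.cos(2 * np.pi * f_m2 * t - np.radians(15.0))   # lagged level

# Project each series onto the M2 frequency (single-frequency DFT)
# and take the phase of the ratio of complex amplitudes.
carrier = np.exp(-2j * np.pi * f_m2 * t)
phase_lag = np.degrees(np.angle(np.sum(water * carrier) /
                                np.sum(strain * carrier)))
print(round(phase_lag, 2))   # close to -15: water level lags the strain
```

Tracking this phase estimate in sliding windows over 20 years of records is, in outline, how the transient coseismic phase shifts and their slow recovery would show up.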
Google Earth mapping of damage from the Niigata-Ken-Chuetsu M6.6 earthquake of 16 July 2007
Kayen, Robert E.; Steele, WM. Clint; Collins, Brian; Walker, Kevin
2008-01-01
We describe the use of Google Earth during and after a large damaging earthquake that struck the central Japan coast on 16 July 2007 to collect and organize damage information and guide reconnaissance activities. The software enabled greater real-time collaboration among scientists and engineers. After the field investigation, the Google Earth map was used as a final reporting product directly linked to the more traditional research report document. Finally, we analyze the use of the software within the context of a post-disaster reconnaissance investigation and link it to student use of Google Earth in field situations.
San Mateo County Geographic Information Systems (GIS) project
Brabb, E.E.
1986-01-01
Earthquakes and ground failures in the United States cause billions of dollars of damage each year, but techniques for predicting and reducing these hazardous geologic processes remain elusive. Geologists, geophysicists, hydrologists, engineers, cartographers, and computer specialists from the U.S. Geological Survey in Menlo Park, California, are working together on a project involving GIS techniques to determine how to predict the consequences of earthquakes and landslides, using San Mateo County as a subject area. Together with members of the Planning and Emergency Services Departments of San Mateo County and the Association of Bay Area Governments, they are also determining how to reduce the losses caused by these hazards.
Stochastic ground motion simulation
Rezaeian, Sanaz; Xiaodan, Sun; Beer, Michael; Kougioumtzoglou, Ioannis A.; Patelli, Edoardo; Siu-Kui Au, Ivan
2014-01-01
Strong earthquake ground motion records are fundamental in engineering applications. Ground motion time series are used in response-history dynamic analysis of structural or geotechnical systems. In such analysis, the validity of predicted responses depends on the validity of the input excitations. Ground motion records are also used to develop ground motion prediction equations (GMPEs) for intensity measures such as spectral accelerations that are used in response-spectrum dynamic analysis. Despite the thousands of available strong ground motion records, there remains a shortage of records for large-magnitude earthquakes at short distances or in specific regions, as well as records that sample specific combinations of source, path, and site characteristics.
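One common family of simulation methods for filling such gaps represents ground motion as modulated, filtered white noise. A minimal sketch of that generic idea follows; the corner frequency, damping, and envelope shape below are arbitrary illustrative choices, not parameters from the paper.

```python
import numpy as np

def stochastic_motion(duration=20.0, dt=0.01, f0=2.5, zeta=0.6, t_peak=4.0, seed=0):
    """Modulated, filtered white noise: a minimal stochastic ground-motion sketch."""
    rng = np.random.default_rng(seed)
    n = int(round(duration / dt))
    t = np.arange(n) * dt
    noise = rng.standard_normal(n)
    # The impulse response of a damped oscillator acts as the frequency filter.
    wn = 2 * np.pi * f0
    h = np.exp(-zeta * wn * t) * np.sin(wn * np.sqrt(1 - zeta**2) * t)
    filtered = np.convolve(noise, h)[:n] * dt
    # A gamma-type envelope gives the build-up and decay of shaking intensity.
    env = (t / t_peak) * np.exp(1.0 - t / t_peak)
    motion = env * filtered
    return t, motion / np.max(np.abs(motion))    # normalized to unit peak

t, acc = stochastic_motion()
```

Research-grade models additionally fit the filter and envelope parameters to recorded motions conditioned on magnitude, distance, and site terms, which is what makes the simulations usable where records are scarce.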
A Study on the Relationship between Disaster and Spectral Intensity
NASA Astrophysics Data System (ADS)
Yeh, Yeong-Tein; Kao, Ching-Yun
2010-05-01
Nowadays, the structural environment has become so complicated that an index that can assess earthquake damage better than the originally defined intensity scale and PGA is needed. Housner [1] suggested that spectral intensity (SI) can serve as a risk index for an earthquake. Following Housner, earthquake engineers have continued to explore different period ranges for SI and its applications [2-5]. The study of Matsumura [4] shows that SI is a better measure of earthquake intensity over a wide range of frequencies, with a better correlation with damage than peak ground acceleration (adequate for structures with shorter natural periods) or peak ground velocity (adequate for structures with longer natural periods). Recently, Jean [6] investigated the earthquake intensity attenuation law and site effects of strong ground motion using earthquake records from the Taiwan area. Their results show that SI is a better earthquake damage index than PGA. This study enhances the SI concept proposed by Jean [6]. The spectral intensity is separated into three period bands: short period (acceleration-controlled), medium period (velocity-controlled), and long period (displacement-controlled). The average spectral intensity in the short, medium, and long period bands can serve as an earthquake damage index for low-rise buildings, buildings of medium height, and high-rise buildings, respectively. Since the average of a data set is meaningful only when the data have a small variance, the start and end points of the three period bands are calculated by a statistical method so that the data in each band have minimum variance. Finally, the relationship between disaster and spectral intensity for the 1999 Taiwan Chi-Chi earthquake is investigated in this study. [1] Housner, G. W. (1952). "Spectrum intensity of strong-motion earthquakes," in Proc. Symp. Earthquake and Blast Effects on Structures, EERI, U.C.L.A. [2] Hidalgo, P. and R. W. Clough (1974).
"Earthquake simulator study of a reinforced concrete frame," Report UCB/EERC-74/13, EERC, University of California, Berkeley. [3] Kappos, A. J. (1991). "Analytical prediction of the collapse earthquake for R.C. buildings: suggested methodology," Earthq. Eng. Struct. Dyn., 20, 2, pp. 167-176. [4] Matsumura, K. (1992). "On the intensity measure of strong motions related to structural failures," in Proceedings of the 10th WCEE, 1, pp. 375-380. [5] Martinez-Rueda, J. E. (1998). "Scaling procedure for natural accelerograms based on a system of spectrum intensity scales," Earthq. Spectra, 14, 1. [6] Jean, W. Y., Y. W. Chang, K. L. Wen, and C. H. Loh (2006). "Early estimation of seismic hazard for strong earthquakes in Taiwan," Natural Hazards, vol. 37, pp. 39-53.
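Housner's spectral intensity, the quantity this study partitions into period bands, is the integral of the velocity response spectrum over a period range (classically about 0.1-2.5 s). A minimal sketch using Newmark average-acceleration integration of a single-degree-of-freedom oscillator; the damping value, period bounds, and the test record are illustrative, not the study's.

```python
import numpy as np

def sdof_peak_velocity(ag, dt, period, zeta=0.20):
    """Peak relative velocity of a damped SDOF oscillator under base acceleration ag."""
    w = 2 * np.pi / period
    c, k = 2 * zeta * w, w * w
    u = v = 0.0
    a = -ag[0] - c * v - k * u
    vmax = 0.0
    keff = k + 2 * c / dt + 4 / dt**2            # Newmark, gamma=1/2, beta=1/4
    for p in -ag[1:]:
        u1 = (p + (4 / dt**2) * u + (4 / dt) * v + a + c * ((2 / dt) * u + v)) / keff
        a1 = (4 / dt**2) * (u1 - u) - (4 / dt) * v - a
        v += dt / 2 * (a + a1)
        u, a = u1, a1
        vmax = max(vmax, abs(v))
    return vmax

def spectral_intensity(ag, dt, t_lo=0.1, t_hi=2.5, zeta=0.20, n=25):
    """Housner-type SI: integral of the velocity spectrum over [t_lo, t_hi] seconds."""
    periods = np.linspace(t_lo, t_hi, n)
    sv = np.array([sdof_peak_velocity(ag, dt, T, zeta) for T in periods])
    return float(np.sum(0.5 * (sv[1:] + sv[:-1]) * np.diff(periods)))  # trapezoid rule
```

Splitting the `[t_lo, t_hi]` range into the study's short-, medium-, and long-period bands then amounts to calling `spectral_intensity` with different bounds.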
Kalantar Motamedi, Mohammad Hosein; Sagafinia, Masoud; Ebrahimi, Ali; Shams, Ehsan; Kalantar Motamedi, Mostafa
2012-01-01
Objectives: This article sought to review and compare data on major earthquakes of the past decade and their aftermath in order to compare magnitude, death toll, type of injuries, management procedures, extent of destruction, and effectiveness of relief efforts. Materials and Methods: A retrospective study of various aspects of the management and aftermath of 5 major earthquakes of the past decade (2000–2010) was undertaken, covering earthquakes in Bam, Iran; Sichuan, China; Port-au-Prince, Haiti; Kashmir, Pakistan; and Ica, Peru. A computerized literature search of published articles (indexed in PubMed) was done. The issues assessed included: 1) local magnitude, 2) type of building structure, 3) time of the earthquake (day/time/season), 4) time to rescue, 5) triage, transfer, and treatment, 6) distribution of casualties (dead/injured), 7) degree of city damage, 8) degree of damage to health facilities, 9) field hospital availability, 10) international aid, 11) air transfer, 12) telecommunication systems availability, 13) PTSD prevalence, 14) most common injury, and 15) most common disease outbreak. Results: The Bam earthquake had the lowest magnitude (6.6) and the Sichuan earthquake the greatest (8.0). Mortality in Haiti was 212,000, making it the deadliest earthquake of the past decade. Collapse of heavy clay roofing structures was a major cause of death in Iran and Pakistan. Earthquakes occurring at night and on nonworking days carried a high death toll. The time to rescue and treat was longest in Haiti (possibly contributing to the dead-to-injured ratio). However, the worst dead-to-injured ratios were in Bam (51%) and Pakistan (47%); the best was in China (15%). Iran and Pakistan suffered the highest percentage of damage to health facilities (90%). Field hospital availability, international aid, and air transfer were important issues. Telecommunication systems were best in China and worst in Pakistan. PTSD prevalence was highest in Iran. Respiratory infection was the most common infection following all 5 earthquakes. Conclusions: Earthquake damage, death tolls, managerial protocols, etc. vary among countries and are influenced by many factors, including the hour the earthquake hits and the day of the week. Additionally, social, structural, and geographic factors, as well as the medical, governmental, and NGO responders, are influential. Engineered residential construction remains important for reducing mortality in developing countries. It is essential that hospitals, fire departments, police stations, and water, telephone, and electrical facilities be made earthquake proof. PMID:24829886
10 CFR 50.54 - Conditions of licenses.
Code of Federal Regulations, 2014 CFR
2014-01-01
... chapter. (a)(1) Each nuclear power plant or fuel reprocessing plant licensee subject to the quality... irradiated fuel. (ff) For licensees of nuclear power plants that have implemented the earthquake engineering... of rated thermal power only if the Commission finds that the state of onsite emergency preparedness...
A novel HVSR approach to structural health monitoring for structural vulnerability assessment
NASA Astrophysics Data System (ADS)
Pentaris, Fragkiskos P.; Papadopoulos, Ilias
2014-05-01
This work proposes a novel approach to vulnerability assessment in structural health monitoring (SHM) through the Horizontal-to-Vertical Spectral Ratio (HVSR) method. Acceleration recordings of concrete buildings of different ages [1] are analyzed using the conventional method for estimating fundamental frequency in SHM (the Fast Fourier Transform, FFT). The resulting frequency spectra are verified theoretically (mass and stiffness matrix models) and by practical techniques applied to real structural data for estimating structural resonance frequencies [2-4]. The same recordings are analyzed by the HVSR method, and the differences and similarities of the two methods (FFT and HVSR) under earthquake excitation are studied. Both methods can reveal the resonance frequencies and amplitudes of the buildings under study in great detail and efficiently in terms of ease of deployment, computation, cost, and time. Furthermore, HVSR recordings of strong seismic motion are compared with HVSR recordings of ambient noise for the case-study buildings. The similarity of the HVSR spectra between earthquake and ambient-noise recordings means that a simple HVSR noise recording in a building can provide the same information as an earthquake HVSR recording, which is much rarer. This study presents a novel index that computes the increase of HVSR between floors and correlates the rate of increase with the structural vulnerability of the specific building. The main idea is that this HVSR rate index is strongly related to the differential acceleration between floors, a determinant measurement for SHM assessment in concrete buildings. Experimental data verify the proposed HVSR rise index, showing a higher HVSR rise in older buildings (with visible cracks in beams, damage, and stress in their structure) than in younger buildings without visible damage.
Acknowledgments This work was supported in part by the ARCHIMEDES III Program of the Ministry of Education of Greece and the European Union in the framework of the project entitled «Interdisciplinary Multi-Scale Research of Earthquake Physics and Seismotectonics at the front of the Hellenic Arc (IMPACT-ARC)». References [1] F. P. Pentaris, J. Stonham, and J. P. Makris, "A review of the state-of-the-art of wireless SHM systems and an experimental set-up towards an improved design," presented at the EUROCON, 2013 IEEE, Zagreb, 2013. [2] R. Ditommaso, M. Mucciarelli, S. Parolai, and M. Picozzi, "Monitoring the structural dynamic response of a masonry tower: Comparing classical and time-frequency analyses," Bulletin of Earthquake Engineering, vol. 10, pp. 1221-1235, 2012. [3] Sungkono, D. D. Warnana, Triwulan, and W. Utama, "Evaluation of Buildings Strength from Microtremor Analyses," International Journal of Civil & Environmental Engineering IJCEE-IJENS, vol. 11, 2011. [4] L.-L. Hong and W.-L. Hwang, "Empirical formula for fundamental vibration periods of reinforced concrete buildings in Taiwan," Earthquake Engineering & Structural Dynamics, vol. 29, pp. 327-337, 2000.
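At its core, the HVSR computation is the ratio of horizontal to vertical amplitude spectra. A minimal sketch on synthetic three-component data; real SHM processing would add windowing, smoothing, and instrument correction, and the 4 Hz resonance below is an invented example.

```python
import numpy as np

def hvsr(h1, h2, v, dt):
    """Horizontal-to-vertical spectral ratio from three-component records."""
    H = np.sqrt((np.abs(np.fft.rfft(h1))**2 + np.abs(np.fft.rfft(h2))**2) / 2.0)
    V = np.abs(np.fft.rfft(v))
    freqs = np.fft.rfftfreq(len(v), dt)
    return freqs[1:], H[1:] / (V[1:] + 1e-12)    # skip the DC bin

# Synthetic check: the horizontals carry a 4 Hz resonance, the vertical is
# broadband noise, so the HVSR curve should peak near 4 Hz.
dt = 0.01
t = np.arange(4000) * dt
rng = np.random.default_rng(0)
v = rng.standard_normal(t.size)
h1 = rng.standard_normal(t.size) + 10 * np.sin(2 * np.pi * 4.0 * t)
h2 = rng.standard_normal(t.size) + 10 * np.cos(2 * np.pi * 4.0 * t)
freqs, ratio = hvsr(h1, h2, v, dt)
peak_freq = freqs[np.argmax(ratio)]
```

Computing this ratio floor by floor and tracking how its peak amplitude grows with height is the kind of operation the proposed between-floor HVSR index builds on.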
LIDAR Investigation Of The 2004 Niigata Ken Chuetsu, Japan, Earthquake
NASA Astrophysics Data System (ADS)
Kayen, R.; Pack, R. T.; Sugimoto, S.; Tanaka, H.
2005-12-01
The 23 October 2004 Niigata Ken Chuetsu, Japan, Mw 6.6 earthquake was the most significant earthquake to affect Japan since the 1995 Kobe earthquake. Forty people were killed, almost 3,000 injured, and numerous landslides destroyed entire upland villages. Landslides and permanent ground deformation caused extensive damage to roads, rail lines and other lifelines, resulting in major economic disruption. The cities and towns most significantly affected by the earthquake were Nagaoka, Ojiya, and the mountainous rural areas of Yamakoshi village and Kawaguchi town. Our EERI team traveled with a tripod-mounted LIDAR (Light Detection and Ranging) unit, a scanning laser that creates ultra-high-resolution 3-D digital terrain models of earthquake-damaged surfaces: the ground, structures, and lifelines. This new technology allows for rapid and remote sensing of damaged terrain. Ground-based LIDAR has an accuracy range of 0.5-2.5 cm, and can illuminate targets up to 400 m away from the sensor. During a single tripod-mounted LIDAR scan of 10 minutes, several million survey points are collected and processed into an ultra-high resolution terrain model of the damaged ground or structure. There are several benefits in acquiring these LIDAR data in the initial reconnaissance effort after the earthquake. First, we record the detailed failure morphologies of damaged ground and structures in order to make measurements that are either impractical or impossible by conventional survey means. The digital terrain models allow us to enlarge, enhance and rotate data in order to visualize damage in orientations and scales not previously possible. This ability to visualize damage allows us to better understand failure modes. Finally, LIDAR allows us to archive 3-D terrain models so that the engineering community can evaluate analytical and numerical models of deformation potential against detailed field measurements.
Here, we discuss the findings of this 2004 Niigata Chuetsu Earthquake (M6.6) reconnaissance presented with LIDAR examples for damage-visualization.
NASA Astrophysics Data System (ADS)
Rodgers, A. J.; Pitarka, A.; Petersson, N. A.; Sjogreen, B.; McCallen, D.; Miah, M.
2016-12-01
Simulation of earthquake ground motions is becoming more widely used due to improvements of numerical methods, development of ever more efficient computer programs (codes), and growth in and access to High-Performance Computing (HPC). We report on how SW4 can be used for accurate and efficient simulations of earthquake strong motions. SW4 is an anelastic finite difference code based on a fourth-order summation-by-parts displacement formulation. It is parallelized and can run on one or many processors. SW4 has many desirable features for seismic strong motion simulation: incorporation of surface topography; automatic mesh generation; mesh refinement; attenuation and supergrid boundary conditions. It also has several ways to introduce 3D models and sources (including the Standard Rupture Format for extended sources). We are using SW4 to simulate strong ground motions for several applications. We are performing parametric studies of near-fault motions from moderate earthquakes to investigate basin-edge-generated waves, and from large earthquakes to provide motions for engineers to study building response. We show that 3D propagation near basin edges can generate significant amplifications relative to 1D analysis. SW4 is also being used to model earthquakes in the San Francisco Bay Area. This includes modeling moderate (M3.5-5) events to evaluate the United States Geological Survey's 3D model of regional structure, as well as strong motions from the 2014 South Napa earthquake and possible large scenario events. Recently SW4 was built on a Commodity Technology Systems-1 (CTS-1) at LLNL, new systems for capacity computing at the DOE National Labs. We find SW4 scales well and runs faster on these systems than on the previous generation of LINUX clusters.
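SW4 itself is a far more capable fourth-order 3D code; purely as a toy illustration of the explicit finite-difference idea it is built on, here is a second-order scheme for the 1D scalar wave equation. All parameters are arbitrary; the scheme is stable while the CFL number c·dt/dx stays at or below 1 (here 0.2).

```python
import numpy as np

def wave_1d(nx=400, nt=800, dx=10.0, dt=0.001, c=2000.0):
    """Explicit second-order finite differences for u_tt = c^2 u_xx."""
    r2 = (c * dt / dx) ** 2        # squared CFL number (0.04 here: stable)
    u_prev = np.zeros(nx)
    u = np.zeros(nx)
    u[nx // 2] = 1.0               # displacement pulse at the domain center
    u_prev[:] = u                  # zero initial velocity
    for _ in range(nt):
        u_next = np.zeros(nx)      # fixed (zero) ends stand in for boundaries
        u_next[1:-1] = (2 * u[1:-1] - u_prev[1:-1]
                        + r2 * (u[2:] - 2 * u[1:-1] + u[:-2]))
        u_prev, u = u, u_next
    return u

u = wave_1d()
```

Production codes like SW4 replace this stencil with higher-order operators in 3D, add anelastic attenuation, topography-conforming grids, and absorbing (supergrid) boundaries, but the time-stepping structure is the same.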
The HayWired Earthquake Scenario
Detweiler, Shane T.; Wein, Anne M.
2017-04-24
Foreword: The 1906 Great San Francisco earthquake (magnitude 7.8) and the 1989 Loma Prieta earthquake (magnitude 6.9) each motivated residents of the San Francisco Bay region to build countermeasures to earthquakes into the fabric of the region. Since Loma Prieta, bay-region communities, governments, and utilities have invested tens of billions of dollars in seismic upgrades and retrofits and replacements of older buildings and infrastructure. Innovation and state-of-the-art engineering, informed by science, including novel seismic-hazard assessments, have been applied to the challenge of increasing seismic resilience throughout the bay region. However, as long as people live and work in seismically vulnerable buildings or rely on seismically vulnerable transportation and utilities, more work remains to be done. With that in mind, the U.S. Geological Survey (USGS) and its partners developed the HayWired scenario as a tool to enable further actions that can change the outcome when the next major earthquake strikes. By illuminating the likely impacts to the present-day built environment, well-constructed scenarios can and have spurred officials and citizens to take steps that change the outcomes the scenario describes, whether used to guide more realistic response and recovery exercises or to launch mitigation measures that will reduce future risk. The HayWired scenario is the latest in a series of like-minded efforts to bring a special focus onto the impacts that could occur when the Hayward Fault again ruptures through the east side of the San Francisco Bay region as it last did in 1868. Cities in the east bay along the Richmond, Oakland, and Fremont corridor would be hit hardest by earthquake ground shaking, surface fault rupture, aftershocks, and fault afterslip, but the impacts would reach throughout the bay region and far beyond.
The HayWired scenario name reflects our increased reliance on the Internet and telecommunications and also alludes to the interconnectedness of infrastructure, society, and our economy. How would this earthquake scenario, striking close to Silicon Valley, impact our interconnected world in ways and at a scale we have not experienced in any previous domestic earthquake? The area of present-day Contra Costa, Alameda, and Santa Clara Counties contended with a magnitude-6.8 earthquake in 1868 on the Hayward Fault. Although sparsely populated then, about 30 people were killed and extensive property damage resulted. The question of what an earthquake like that would do today has been examined before and is now revisited in the HayWired scenario. Scientists have documented a series of prehistoric earthquakes on the Hayward Fault and are confident that the threat of a future earthquake, like that modeled in the HayWired scenario, is real and could happen at any time. The team assembled to build this scenario has brought innovative new approaches to examining the natural hazards, impacts, and consequences of such an event. Such an earthquake would also be accompanied by widespread liquefaction and landslides, which are treated in greater detail than ever before. The team also considers how the now-prototype ShakeAlert earthquake early warning system could provide useful public alerts and automatic actions. Scientific Investigations Report 2017–5013 and accompanying data releases are the products of an effort led by the USGS, but this body of work was created through the combined efforts of a large team including partners who have come together to form the HayWired Coalition (see chapter A). Use of the HayWired scenario has already begun. More than a full year of intensive partner engagement, beginning in April 2017, is being directed toward producing the most in-depth look ever at the impacts and consequences of a large earthquake on the Hayward Fault.
With the HayWired scenario, our hope is to encourage and support the active ongoing engagement of the entire community of the San Francisco Bay region by providing the scientific, engineering, and economic and social science inputs for use in exercises and planning well into the future. As HayWired volumes are published, they will be made available at https://doi.org/10.3133/sir20175013.
Quantification of social contributions to earthquake mortality
NASA Astrophysics Data System (ADS)
Main, I. G.; NicBhloscaidh, M.; McCloskey, J.; Pelling, M.; Naylor, M.
2013-12-01
Death tolls in earthquakes, which continue to grow rapidly, are the result of complex interactions between physical effects, such as strong shaking, and the resilience of exposed populations and supporting critical infrastructures and institutions. While it is clear that the social context in which the earthquake occurs has a strong effect on the outcome, the influence of this context can only be exposed if we first decouple, as much as we can, the physical causes of mortality from our consideration. (Our modelling assumes that building resilience to shaking is a social factor governed by national wealth, legislation and enforcement and governance leading to reduced levels of corruption.) Here we attempt to remove these causes by statistically modelling published mortality, shaking intensity and population exposure data; unexplained variance from this physical model illuminates the contribution of socio-economic factors to increasing earthquake mortality. We find that this variance partitions countries in terms of basic socio-economic measures and allows the definition of a national vulnerability index identifying both anomalously resilient and anomalously vulnerable countries. In many cases resilience is well correlated with GDP; people in the richest countries are unsurprisingly safe from even the worst shaking. However some low-GDP countries rival even the richest in resilience, showing that relatively low cost interventions can have a positive impact on earthquake resilience and that social learning between these countries might facilitate resilience building in the absence of expensive engineering interventions.
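The decoupling idea can be sketched with synthetic data: fit a purely physical regression for log-mortality, then read the residual as the socio-economic contribution. Everything below (coefficients, sample size, noise level) is invented for illustration, not the study's data or model.

```python
import numpy as np

# Synthetic data: log-deaths scale with a physical predictor (log of population
# exposed to strong shaking) plus a hidden socio-economic term that the
# physical regression cannot explain.
rng = np.random.default_rng(1)
log_exposure = rng.uniform(3.0, 7.0, size=40)   # one value per "country"
social = rng.normal(0.0, 0.5, size=40)          # hidden social contribution
log_deaths = 0.9 * log_exposure - 2.0 + social

# Fit the physical model only; the residual is a crude vulnerability index
# (positive = more deaths than physical exposure alone explains).
A = np.column_stack([log_exposure, np.ones_like(log_exposure)])
coef, *_ = np.linalg.lstsq(A, log_deaths, rcond=None)
vulnerability_index = log_deaths - A @ coef
```

Because the hidden social term was excluded from the regressors, it reappears almost entirely in the residuals, which is exactly the partitioning the abstract describes.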
Seismic Intensity, PGA and PGV for the South Napa Earthquake, August 24, 2014
NASA Astrophysics Data System (ADS)
Chen, S.; Pickering, A.; Mooney, W. D.; Crewdson, E.
2014-12-01
Numerous studies have investigated the statistical relationship between Modified Mercalli Intensity (MMI) and peak ground acceleration (PGA) and peak ground velocity (PGV). The Mw 6.0 South Napa (California) earthquake of August 24, 2014 provides valuable data to examine these relationships for both urban and rural environments within northern California. The finite fault model (D. Dreger, 2014) indicates that the fault rupture propagated predominantly NNW and up-dip from an 11-km-deep hypocenter. The epicentral location was 8 km SSW of downtown Napa. Recorded PGA north of the epicenter is as high as 600 cm/s² and PGV locally reaches 80 cm/s. Field studies by two teams of investigators were conducted on August 24-26 and September 8-11, 2014 to assign MMI values at 108 sites. The highest MMI values are strongly localized along the NNW-SSE rupture trend north of the epicenter. Parts of the city of Napa and some communities several km farther north on Dry Creek Road were found to have experienced shaking intensities of MMI VII to VIII. The observed effects include houses moved off their foundations, chimney collapse or damage, cracked foundations and/or interior walls, broken windows, and the lateral displacement of heavy furniture. Communities to the east and west of this zone of high seismic intensity reported significantly lower values of MMI, typically IV and V even at distances as close as 10 km from the mapped surface rupture. In comparison with previous estimates by Wald et al. (1999) and Dangkua and Cramer (2011), we find that MMI III-VIII uniformly correspond to significantly larger (>150%) PGA and PGV values, as reported by the Center for Engineering Strong Motion Data (2014).
We attribute this observation to two primary factors: (1) improved earthquake engineering in the post-Loma Prieta earthquake era that has led to building construction, both new and retrofitted, that is more resistant to earthquake strong ground motions; and (2) a frequency band relevant to these MMI estimates that contains less energy than that leading to the PGA or PGV values. The latter would primarily be a source effect.
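For reference, the Wald et al. (1999) California regressions cited above relate MMI to PGA (cm/s²) and PGV (cm/s) roughly as follows; the coefficients are quoted from that paper as I recall them and apply only within the stated MMI ranges, so treat them as indicative.

```python
import numpy as np

def mmi_from_pga(pga_cm_s2):
    """MMI from PGA (cm/s^2); Wald et al. (1999), roughly valid for MMI V-VIII."""
    return 3.66 * np.log10(pga_cm_s2) - 1.66

def mmi_from_pgv(pgv_cm_s):
    """MMI from PGV (cm/s); Wald et al. (1999), roughly valid for MMI V-IX."""
    return 3.47 * np.log10(pgv_cm_s) + 2.35
```

Feeding in the peak Napa values reported above (PGA 600 cm/s², PGV 80 cm/s) predicts MMI near 8.5-9, while observed intensities topped out at VII-VIII, consistent with the abstract's finding that observed MMI corresponded to larger-than-expected PGA and PGV.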
Loss Estimations due to Earthquakes and Secondary Technological Hazards
NASA Astrophysics Data System (ADS)
Frolova, N.; Larionov, V.; Bonnin, J.
2009-04-01
Expected loss and damage assessment due to natural and technological disasters is of primary importance for emergency management just after a disaster, as well as for the development and implementation of preventive-measure plans. The paper addresses procedures and simulation models for loss estimation due to strong earthquakes and secondary technological accidents. Mathematical models for shaking intensity distribution, damage to buildings and structures, debris volume, and the number of fatalities and injuries due to earthquakes and technological accidents at fire- and chemical-hazardous facilities are considered; these are used in geographical information systems designed for these purposes. Criteria for the occurrence of technological accidents are developed on the basis of engineering analysis of the consequences of past events. The paper provides results of scenario-earthquake consequence estimation and individual seismic risk assessment that take into account secondary technological hazards at regional and urban levels. Individual risk is understood as the probability of death (or injury) due to a possible hazardous event within one year in a given territory. It is determined through the mathematical expectation of social losses, taking into account the number of inhabitants in the considered settlement and the probability of natural and/or technological disaster.
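The individual-risk definition in the abstract (annual mathematical expectation of deaths divided by the settlement population) reduces to a one-line computation once scenario probabilities and loss models are in hand. All numbers below are hypothetical inputs, not values from the paper.

```python
# Hypothetical scenario inputs: (annual occurrence probability,
# expected fatalities including secondary technological hazards).
scenarios = [
    (0.0100, 20.0),     # moderate earthquake
    (0.0020, 400.0),    # strong earthquake with fires at hazardous facilities
    (0.0005, 3000.0),   # extreme event including a chemical release
]
population = 150_000

# Individual risk = annual mathematical expectation of deaths / exposed population.
expected_annual_deaths = sum(p * d for p, d in scenarios)
individual_risk = expected_annual_deaths / population
```

In a GIS implementation the same expectation is evaluated per grid cell, with the fatality term coming from the shaking-intensity, building-damage, and secondary-accident models the abstract lists.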
Housing Damage Following Earthquake
NASA Technical Reports Server (NTRS)
1989-01-01
An automobile lies crushed under the third story of this apartment building in the Marina District after the Oct. 17, 1989, Loma Prieta earthquake. The ground levels are no longer visible because of structural failure and sinking due to liquefaction. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions such as earthquakes or when powders are handled in industrial processes. Mechanics of Granular Materials (MGM) experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grain materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. Credit: J.K. Nakata, U.S. Geological Survey.
NASA Astrophysics Data System (ADS)
Spear, B.; Haritashya, U. K.; Kargel, J. S.
2017-12-01
The Gorkha region of Nepal has been a hotbed of landslide activity since the magnitude-7.8 earthquake of April 25, 2015. Although previous studies have mapped and analyzed the landslides directly related to the earthquake, this research maps and analyzes the landslides that occurred during or after the monsoon seasons of 2015 and 2016. Specifically, our objectives included monitoring post-earthquake landslide evolution and reactivation. We also observed landslides on the steep side slopes of various small rivers that threatened to block the rivers' flow. Consequently, we used Landsat, Sentinel, and ASTER imagery and images available in Google Earth Engine to locate, map, and analyze these landslides. Our preliminary results indicate 5,270 landslides; however, 957 of these occurred significantly after the earthquake. Of the 957 landslides, 508 occurred during the monsoon season of 2015 and 48 during the 2016 monsoon season. In addition to locating and mapping these landslides, we identified 22 landslides blocking rivers and 24 that were reactivated. Our results and landslide density maps clearly identify zones that are prone to landslides. For example, the steepest areas, such as the Helambu and Langtang regions, have a very high concentration of landslides since the earthquake. Furthermore, the landslides with the largest areas were often near each other in very steep regions. This research can be used to determine which areas of the Gorkha region are safe to use and which are at high risk.
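Landslide mapping from optical imagery often keys on the sharp vegetation loss of a fresh scar. A toy sketch of that idea on plain arrays; the actual study used Landsat/Sentinel/ASTER scenes and Google Earth Engine, and the band values and the 0.3 threshold here are invented.

```python
import numpy as np

def ndvi(nir, red):
    """Normalized Difference Vegetation Index."""
    return (nir - red) / (nir + red + 1e-9)

def landslide_candidates(pre_nir, pre_red, post_nir, post_red, drop=0.3):
    """Flag pixels whose NDVI fell sharply between pre- and post-event images."""
    return (ndvi(pre_nir, pre_red) - ndvi(post_nir, post_red)) > drop

# Tiny synthetic scene: a vegetated hillside (high NIR reflectance) with a
# fresh 2x2-pixel scar of bare soil after the event.
pre_nir = np.full((4, 4), 0.5); pre_red = np.full((4, 4), 0.1)
post_nir = pre_nir.copy();      post_red = pre_red.copy()
post_nir[1:3, 1:3] = 0.25;      post_red[1:3, 1:3] = 0.20   # scar: NDVI collapses

mask = landslide_candidates(pre_nir, pre_red, post_nir, post_red)
```

Real workflows add cloud masking, slope filtering from a DEM, and minimum-area rules before counting features, which is how inventories like the 5,270-landslide catalog above are assembled.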
NASA Astrophysics Data System (ADS)
Pearson, Chris; Manandhar, Niraj; Denys, Paul
2017-09-01
Along with the damage to buildings and infrastructure, the April 25, 2015 Mw 7.8 Gorkha earthquake caused significant deformation over a large area of eastern Nepal, with displacements of over 2 m recorded in the vicinity of Kathmandu. Nepal currently uses a classical datum developed in 1984 by the Royal (UK) Engineers in collaboration with the Nepal Survey Department. It has served Nepal well; however, the recent earthquakes have provided an impetus for developing a semi-dynamic datum that will be based on ITRF2014 and have the capacity to correct for tectonic deformation. In the scenario we present here, the datum would be based on ITRF2014 with a reference epoch set some time after the end of the current sequence of earthquakes. The deformation model contains a grid of the secular velocity field combined with models of the Gorkha earthquake and the May 12 Mw 7.3 aftershock. We have developed a preliminary velocity field by collating GPS-derived crustal velocities from four previous studies for Nepal and adjacent parts of China and India and aligning them to the ITRF. Patches for the co-seismic part of the deformation for the Gorkha earthquake and the May 12, 2015 Mw 7.3 aftershock are based on published dislocation models. High-order control would be a CORS network based around the existing Nepal GPS Array. Coordinates for existing lower-order control would be determined by readjusting existing survey measurements, and these would be combined with a series of new control stations spread throughout Nepal.
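A semi-dynamic datum applies exactly the corrections described: a secular velocity plus coseismic "patch" offsets. A minimal forward-propagation sketch; all coordinates, velocities, and jump vectors below are hypothetical, and a real implementation would interpolate the velocity from a grid and the jumps from dislocation models.

```python
import numpy as np

def propagate_to_epoch(coord_obs, t_obs, t_ref, velocity, coseismic_jumps=()):
    """Map a position observed at t_obs forward to the datum reference epoch
    t_ref (t_ref >= t_obs) using a secular velocity plus coseismic offsets."""
    coord = np.asarray(coord_obs, float) + np.asarray(velocity, float) * (t_ref - t_obs)
    for t_eq, jump in coseismic_jumps:
        if t_obs < t_eq <= t_ref:               # only jumps inside the interval
            coord = coord + np.asarray(jump, float)
    return coord

# Survey made in 2014.0, datum epoch 2018.0, with mainshock and aftershock
# offsets applied in between (all values invented):
coord_ref = propagate_to_epoch(
    [0.0, 0.0], t_obs=2014.0, t_ref=2018.0,
    velocity=[0.040, 0.020],                     # m/yr east, north
    coseismic_jumps=[(2015.31, [1.50, 0.80]),    # mainshock patch
                     (2015.36, [0.20, 0.10])])   # aftershock patch
```

Running the same transformation in reverse lets observations taken at any epoch be reduced to consistent datum coordinates, which is the practical point of the semi-dynamic approach.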
Jibson, Randall W.; Jibson, Matthew W.
2003-01-01
Landslides typically cause a large proportion of earthquake damage, and the ability to predict slope performance during earthquakes is important for many types of seismic-hazard analysis and for the design of engineered slopes. Newmark's method, which models a landslide as a rigid-plastic block sliding on an inclined plane, provides a useful means of predicting approximate landslide displacements. The method estimates the displacement of a potential landslide block as it is subjected to earthquake shaking from a specific strong-motion record (earthquake acceleration-time history). A modification of Newmark's method, decoupled analysis, allows modeling of landslides that are not assumed to be rigid blocks. This open-file report is available on CD-ROM and contains Java programs intended to facilitate both rigorous and simplified Newmark sliding-block analysis and a simplified model of decoupled analysis. For rigorous analysis, 2,160 strong-motion records from 29 earthquakes are included, along with a search interface for selecting records based on a wide variety of record properties. Utilities allow users to add their own records to the program and use them for conducting Newmark analyses. Also included is a document containing detailed information about how to use Newmark's method to model dynamic slope performance. The program will run on any platform that supports the Java Runtime Environment (JRE) version 1.3, including Windows, Mac OS X, Linux, and Solaris. A minimum of 64 MB of available RAM is needed, and the fully installed program requires 400 MB of disk space.
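The rigorous analysis the report describes is Newmark's rigid-block double integration: whenever the ground acceleration exceeds the critical (yield) acceleration, the block slides, and its relative velocity is integrated to a permanent displacement. A minimal Python sketch of that scheme (the report's own programs are in Java; this is an independent illustration, not their code, and it handles downslope sliding only):

```python
def newmark_displacement(acc, dt, a_c):
    """Rigid-block Newmark sliding: the block accelerates relative to
    the ground while acc exceeds the critical acceleration a_c, keeps
    sliding until its relative velocity returns to zero, and the
    velocity is integrated into permanent downslope displacement.
    acc, a_c in m/s^2; dt in s; returns displacement in m."""
    v = 0.0  # relative velocity of block w.r.t. ground
    d = 0.0  # accumulated permanent displacement
    for a in acc:
        if v > 0.0 or a > a_c:
            v += (a - a_c) * dt   # excess acceleration drives sliding
            if v < 0.0:           # block re-sticks; no backward slip
                v = 0.0
        d += v * dt
    return d
```

Feeding in a digitized strong-motion record (with a_c from a limit-equilibrium analysis of the slope) gives the Newmark displacement used as a damage index.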
Toda, Shinji; Stein, Ross
2015-01-01
The strongest San Francisco Bay area earthquake since the 1989 Mw 7.0 Loma Prieta shock struck near Napa on 24 August 2014. Field mapping (Dawson et al., 2014; Earthquake Engineering Research Institute [EERI], 2014; Brocher et al., 2015) and seismic and geodetic source inversions (Barnhart et al., 2015; Dreger et al., 2015; Wei et al., 2015) indicate that a 15-km-long northwest-trending section of the West Napa Valley fault ruptured in the earthquake. Remarkably, it was the first indisputable surface rupture in the Bay area since 1906. The Napa event, along with other smaller earthquakes such as the 1980 Mw 5.8 Livermore and 1984 Mw 6.2 Morgan Hill events on the Calaveras and Hayward faults over the past 3–4 decades, may indicate that the Bay area region is emerging from the stress shadow of the 1906 Mw 7.8 San Francisco earthquake (Harris and Simpson, 1998; Pollitz et al., 2004). Since 1979, there has been a 140% increase in the rate of Mw≥4.1 shocks (Fig. 1) in the broader Bay area, with most concentrated in a corridor extending north from the 1989 Loma Prieta aftershock zone through the Calaveras, Greenville, Green Valley, Napa, and Rodgers Creek faults east of the San Francisco Bay (Fig. 1a). This corridor roughly coincides with the 1906 stress shadow that is being eroded away by more than a century of stress reaccumulation. The Napa event, as well as the surrounding faults on which we calculate the resulting hazard increases, all lie within this zone.
PEER Business and Industry Partnership (BIP)
Use of expert judgment elicitation to estimate seismic vulnerability of selected building types
Jaiswal, K.S.; Aspinall, W.; Perkins, D.; Wald, D.; Porter, K.A.
2012-01-01
Pooling engineering input on earthquake building vulnerability through an expert judgment elicitation process requires careful deliberation. This article provides an overview of expert judgment procedures including the Delphi approach and the Cooke performance-based method to estimate the seismic vulnerability of a building category.
DOT National Transportation Integrated Search
1997-05-01
This document presents a series of five design examples illustrating the principles and methods of geotechnical earthquake engineering and seismic design for highway facilities. These principles and methods are described in Volume I - Design Principl...
Editorial: Global Science and Technology in Undergraduate Science and Engineering Education.
ERIC Educational Resources Information Center
Paldy, Lester G., Ed.
1984-01-01
Offers reasons why students should be exposed to and understand the implications of the global character of science and technology. Examples of scientific/technical issues and problems which are global in their scope are long-term atmospheric warming trends, weather forecasting, desertification, earthquake prediction, acid rain, and nuclear…
Effect of water content on stability of landslides triggered by earthquakes
NASA Astrophysics Data System (ADS)
Beyabanaki, S.; Bagtzoglou, A. C.; Anagnostou, E. N.
2013-12-01
Earthquake-triggered landslides are among the most important natural hazards and often result in serious structural damage and loss of life. They have been widely studied; however, less attention has been focused on soil water content. Although the effect of water content has been widely studied for rainfall-triggered landslides [1], much less attention has been given to it in the stability analysis of earthquake-triggered landslides. We developed a combined hydrology and stability model to investigate the effect of soil water content on earthquake-triggered landslides. Bishop's method is used for the slope stability analysis and Richards' equation is employed to model infiltration. Bishop's method is one of the most widely used methods for analyzing the stability of slopes [2]. An earthquake acceleration coefficient (EAC) is also included in the model to analyze the effect of the earthquake on slope stability, and the model is able to automatically determine the geometry of the potential landslide. In this study, slopes with different initial water contents are simulated. First, the simulation is performed for the earthquake-only case with different EACs and water contents. As shown in Fig. 1, initial water content has a significant effect on the factor of safety (FS): greater initial water contents lead to lower FS. This impact is more significant when the EAC is small, and when the initial water content is high, landslides can occur even with small earthquake accelerations. We also investigated the effect of water content on the geometry of landslides by simulating landslides triggered by earthquakes alone and by combined rainfall and earthquakes for different initial water contents. The results show that water content has a more significant effect on the geometry of landslides triggered by rainfall than on those triggered by an earthquake.
Finally, the effect of water content on landslides triggered by earthquakes during rainfall is investigated. After different durations of rainfall, an earthquake is applied to the model, and the elapsed time at which the FS drops below one is obtained by trial and error. The results for different initial water contents and earthquake acceleration coefficients show that landslides can occur after shorter rainfall durations when the water content is greater; if the water content is high enough, the landslide occurs even without rainfall. References [1] Ray RL, Jacobs JM, de Alba P. Impact of unsaturated zone soil moisture and groundwater table on slope instability. J. Geotech. Geoenviron. Eng., 2010, 136(10):1448-1458. [2] Das B. Principles of Foundation Engineering. Stanford, Cengage Learning, 2011. Fig. 1. Effect of initial water content on FS for different EACs
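For intuition about why greater water content lowers the FS and why a larger EAC compounds the effect, a pseudo-static limit-equilibrium factor of safety for a single block on an inclined plane can serve as a toy stand-in for the paper's Bishop-based analysis. All parameter values below are assumptions chosen for illustration, not values from the study; higher water content enters through a larger pore-water force U.

```python
import math

def pseudo_static_fs(W, beta_deg, c, phi_deg, A, k=0.0, U=0.0):
    """Pseudo-static factor of safety for a block on an inclined plane.
    W: block weight (kN); beta_deg: slope angle; c: cohesion (kPa);
    phi_deg: friction angle; A: slip-surface area (m^2);
    k: earthquake acceleration coefficient (EAC); U: pore-water force (kN).
    The horizontal inertial force kW adds to driving and reduces the
    normal force; U reduces the frictional resistance."""
    b = math.radians(beta_deg)
    phi = math.radians(phi_deg)
    resisting = c * A + (W * math.cos(b) - k * W * math.sin(b) - U) * math.tan(phi)
    driving = W * math.sin(b) + k * W * math.cos(b)
    return resisting / driving
```

Raising either k or U in this expression lowers the FS, mirroring the trends in Fig. 1.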
Awareness and understanding of earthquake hazards at school
NASA Astrophysics Data System (ADS)
Saraò, Angela; Peruzza, Laura; Barnaba, Carla; Bragato, Pier Luigi
2014-05-01
Schools have a fundamental role in broadening the understanding of natural hazards and risks and in building awareness in the community. Recent earthquakes in Italy and worldwide have clearly demonstrated that poor perception of seismic hazards diminishes the effectiveness of mitigation countermeasures. For years the Seismology department of OGS has been involved in education projects and public activities to raise awareness about earthquakes. Working together with teachers, we aim at developing age-appropriate curricula to improve students' knowledge about earthquakes, seismic safety, and seismic risk reduction. Some examples of education activities performed during recent years are presented here. We describe our experience with primary and intermediate schools, where, through hands-on activities, we explain the earthquake phenomenon and its effects to children, and we also illustrate teaching interventions for high school students. In past years we lectured classes, led laboratory and field activities, and organized summer stages for selected students. In the current year we are leading a project aimed at training high school students in seismic safety through a multidisciplinary approach that involves seismologists, engineers, and experts in safety procedures. To combine the objective of disseminating earthquake culture, also through knowledge of past seismicity, with that of a safety culture, we use innovative educational techniques and multimedia resources. Students and teachers, under the guidance of an expert seismologist, organize a combination of hands-on activities for understanding earthquakes in the lab using inexpensive tools and instruments. At selected schools we provided the low-cost seismometers of the QuakeCatcher network (http://qcn.stanford.edu) for recording earthquakes, and we trained teachers to use these instruments in the lab and to analyze the recorded data.
Within the same project we are going to train selected students as communicators so that they can transfer simple educational messages on seismic risk reduction to other students and/or to the whole community. The experiment is taking place in northeast Italy, an area where OGS detects earthquakes for seismological study and seismic alarm purposes. Teachers and students participating in the project are expected to present their experience during a public event at the University of Udine (Italy).
Yehle, Lynn A.
1978-01-01
A program to study the engineering geology of most larger Alaska coastal communities and to evaluate their earthquake and other geologic hazards was started following the 1964 Alaska earthquake; this report about the Petersburg area is a product of that program. Field-study methods were of a reconnaissance nature, and thus interpretations in the report are tentative. The landscape of the northern end of Mitkof Island, on which Petersburg is situated, is characterized by gently sloping, muskeg-covered terrain, with altitudes mostly less than 30 m. In contrast, much of the rest of the island is composed of mountainous terrain with many steep valleys. During the Pleistocene Epoch, the Petersburg area presumably was covered by ice several times; glaciers deeply eroded many valleys on Mitkof Island and adjacent areas. The last major deglaciation probably was largely completed by 12,000 years ago. Delayed rebound of the earth's crust, after the melting of large amounts of ice, permitted extensive inundation of land in the Petersburg area. Subsequently, emergence has elevated marine deposits to a present-day altitude of at least 65 m and probably to 75 m. Bedrock in the Petersburg map area is composed of relatively hard metamorphic rocks, chiefly phyllite and probably some graywacke. Rocks are of Middle(?) Jurassic to Early Cretaceous age. Five types of surficial geologic material of Quaternary age were recognized: (1) mixed deposits consisting of diamicton, silt-clay, and sand or sandy pebble gravel, (2) alluvial deposits, (3) shore and delta deposits, (4) organic deposits, and (5) artificial fill. Geologic structure in southeastern Alaska is complex because several cycles of tectonic deformation since at least early Paleozoic time have affected different parts of the region. The latest of the major tectonic events in southeastern Alaska occurred in Tertiary time, with some minor activity continuing into the Quaternary Period. 
Along the outer coast of southeastern Alaska, active strike-slip movement is occurring along the Chichagof-Baranof and Queen Charlotte faults. A segment of the prominent Coast-Range lineament, part of which may be a fault, lies 18 km northeast of Petersburg. Many earthquakes occur along the outer coast of southeastern Alaska. Most of these shocks are associated with movements along the Chichagof-Baranof, Queen Charlotte, and Transition faults. A few small earthquakes occur in the region between the outer coast and the southern part of the Coast Mountains. Only a few earthquakes have been recorded as felt at Petersburg; these shocks and others possibly felt in the Petersburg region are tabulated. Among the recorded earthquakes, the highest intensity (about V-VI) was from the magnitude 7.1 earthquake of October 24, 1927, which occurred probably along the Chichagof-Baranof fault about 225 km northwest of Petersburg; damage was reported as minor. Other large earthquakes along the Chichagof-Baranof fault that affected or probably affected the Petersburg area in a minor way occurred on August 22, 1949 (magnitude 8.1) and on July 30, 1972 (magnitude 7.25). From a consideration of the tectonics and earthquake history of the region, earthquakes similar to the 1927, 1949, and 1972 shocks are expected to recur on segments of the Chichagof-Baranof or Queen Charlotte faults. The closest of these fault segments is about 170 km southwest of Petersburg. The likelihood of destructive earthquakes being generated along faults closer to Petersburg is unknown. A very generalized discussion of possible geologic effects during a postulated, theoretically reasonable worst-case earthquake of magnitude 8 occurring along the outer coast about 170 km southwest of Petersburg notes that ground shaking probably would be strongest on organic deposits and least on bedrock and on firm, compact diamicton. 
Among other effects that could happen are: (1) liquefaction of some of the few delta and alluvial
1964 Great Alaska Earthquake: a photographic tour of Anchorage, Alaska
Thoms, Evan E.; Haeussler, Peter J.; Anderson, Rebecca D.; McGimsey, Robert G.
2014-01-01
On March 27, 1964, at 5:36 p.m., a magnitude 9.2 earthquake, the largest recorded earthquake in U.S. history, struck southcentral Alaska (fig. 1). The Great Alaska Earthquake (also known as the Good Friday Earthquake) occurred at a pivotal time in the history of earth science, and helped lead to the acceptance of plate tectonic theory (Cox, 1973; Brocher and others, 2014). All large subduction zone earthquakes are understood through insights learned from the 1964 event, and observations and interpretations of the earthquake have influenced the design of infrastructure and seismic monitoring systems now in place. The earthquake caused extensive damage across the State, and triggered local tsunamis that devastated the Alaskan towns of Whittier, Valdez, and Seward. In Anchorage, the main cause of damage was ground shaking, which lasted approximately 4.5 minutes. Many buildings could not withstand this motion and were damaged or collapsed even though their foundations remained intact. More significantly, ground shaking triggered a number of landslides along coastal and drainage valley bluffs underlain by the Bootlegger Cove Formation, a composite of facies containing variably mixed gravel, sand, silt, and clay which were deposited over much of upper Cook Inlet during the Late Pleistocene (Ulery and others, 1983). Cyclic (or strain) softening of the more sensitive clay facies caused overlying blocks of soil to slide sideways along surfaces dipping by only a few degrees. This guide is the document version of an interactive web map that was created as part of the commemoration events for the 50th anniversary of the 1964 Great Alaska Earthquake. It is accessible at the U.S. Geological Survey (USGS) Alaska Science Center website: http://alaska.usgs.gov/announcements/news/1964Earthquake/. 
The website features a map display with suggested tour stops in Anchorage, historical photographs taken shortly after the earthquake, repeat photography of selected sites, scanned documents, and small-scale maps, as well as links to slideshows of additional photographs and Google Street View™ scenes. Buildings in Anchorage that were severely damaged, sites of major landslides, and locations of post-earthquake engineering responses are highlighted. The web map can be used online as a virtual tour or in a physical self-guided tour using a web-enabled Global Positioning System (GPS) device. This publication serves the purpose of committing most of the content of the web map to a single distributable document. As such, some of the content differs from the online version.
NASA Astrophysics Data System (ADS)
Lapusta, N.
2011-12-01
Studying earthquake source processes is a multidisciplinary endeavor involving a number of subjects, from geophysics to engineering. As a solid mechanician interested in understanding earthquakes through physics-based computational modeling and comparison with observations, I need to educate and attract students from diverse areas. My CAREER award has provided the crucial support for the initiation of this effort. Applying for the award made me go through careful initial planning in consultation with my colleagues and administration from two divisions, an important component of the eventual success of my path to tenure. Then, the long-term support directed at my program as a whole - and not a specific year-long task or subject area - allowed for the flexibility required for a start-up of a multidisciplinary undertaking. My research is directed towards formulating realistic fault models that incorporate state-of-the-art experimental studies, field observations, and analytical models. The goal is to compare the model response - in terms of long-term fault behavior that includes both sequences of simulated earthquakes and aseismic phenomena - with observations, to identify appropriate constitutive laws and parameter ranges. CAREER funding has enabled my group to develop a sophisticated 3D modeling approach that we have used to understand patterns of seismic and aseismic fault slip on the Sunda megathrust in Sumatra, investigate the effect of variable hydraulic properties on fault behavior, with application to the Chi-Chi and Tohoku earthquakes, create a model of the Parkfield segment of the San Andreas fault that reproduces both long-term and short-term features of the M6 earthquake sequence there, and design experiments with laboratory earthquakes, among several other studies. 
A critical ingredient in this research program has been the fully integrated educational component that allowed me, on the one hand, to expose students from different backgrounds to the multidisciplinary knowledge required for research in my group and, on the other hand, to communicate the field insights to a broader community. A newly developed course on Dynamic Fracture and Frictional Faulting combines geophysical and engineering knowledge at the forefront of current research activities relevant to earthquake studies and involves students in these activities through team-based course projects. The course attracts students from more than ten disciplines and received a student rating of 4.8/5 this past academic year. In addition, the course on Continuum Mechanics was enriched with geophysical references and examples. My group has also been visiting physics classrooms in a neighboring public school that serves mostly underrepresented minorities. The visits were beneficial not only to the high school students but also to the graduate students and postdocs in my group, who gained experience in presenting their field in a way accessible to the general public. Overall, the NSF CAREER award program through the Geosciences Directorate (NSF official Eva E. Zanzerkia) has significantly facilitated my development as a researcher and educator and should be either maintained or expanded.
OpenQuake, a platform for collaborative seismic hazard and risk assessment
NASA Astrophysics Data System (ADS)
Henshaw, Paul; Burton, Christopher; Butler, Lars; Crowley, Helen; Danciu, Laurentiu; Nastasi, Matteo; Monelli, Damiano; Pagani, Marco; Panzeri, Luigi; Simionato, Michele; Silva, Vitor; Vallarelli, Giuseppe; Weatherill, Graeme; Wyss, Ben
2013-04-01
Sharing of data and risk information, best practices, and approaches across the globe is key to assessing risk more effectively. Through global projects, open-source IT development and collaborations with more than 10 regions, leading experts are collaboratively developing unique global datasets, best practice, tools and models for global seismic hazard and risk assessment, within the context of the Global Earthquake Model (GEM). Guided by the needs and experiences of governments, companies and international organisations, all contributions are being integrated into OpenQuake: a web-based platform that - together with other resources - will become accessible in 2014. With OpenQuake, stakeholders worldwide will be able to calculate, visualize and investigate earthquake hazard and risk, capture new data and share findings for joint learning. The platform is envisaged as a collaborative hub for earthquake risk assessment, used at global and local scales, around which an active network of users has formed. OpenQuake will comprise both online and offline tools, many of which can also be used independently. One of the first steps in OpenQuake development was the creation of open-source software for advanced seismic hazard and risk calculations at any scale, the OpenQuake Engine. Although in continuous development, a command-line version of the software is already being test-driven and used by hundreds worldwide; from non-profits in Central Asia, seismologists in sub-Saharan Africa and companies in South Asia to the European seismic hazard harmonization programme (SHARE). In addition, several technical trainings were organized with scientists from different regions of the world (sub-Saharan Africa, Central Asia, Asia-Pacific) to introduce the engine and other OpenQuake tools to the community, something that will continue to happen over the coming years. 
Other tools being developed that are of direct interest to the hazard community are: • OpenQuake Modeller: fundamental instruments for the creation of seismogenic input models for seismic hazard assessment, a critical input to the OpenQuake Engine. OpenQuake Modeller will consist of a suite of tools (the Hazard Modeller's Toolkit) for characterizing the seismogenic sources of earthquakes and their models of earthquake recurrence. An earthquake catalogue homogenization tool, for the integration, statistical comparison, and user-defined harmonization of multiple earthquake catalogues, is also included in the OpenQuake modeling tools. • A data capture tool for active faults: a tool that allows geologists to draw (new) fault discoveries on a map in an intuitive GIS environment and add details on the fault through the tool. These data, once quality checked, can then be integrated with the global active faults database, which will increase in value with every new fault insertion. Building on many ongoing efforts and the knowledge of scientists worldwide, GEM will for the first time integrate state-of-the-art data, models, results, and open-source tools into a single platform. The platform will continue to increase in value, in particular for use in local contexts, through contributions from and collaborations with scientists and organisations worldwide. This presentation will showcase the OpenQuake Platform, focusing on the IT solutions that have been adopted as well as the added value that the platform will bring to scientists worldwide.
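For readers unfamiliar with the OpenQuake Engine, a calculation is driven by a `job.ini` configuration file passed to the command-line tool. The fragment below is an illustrative minimal sketch of a classical PSHA job, with placeholder file names, site coordinates, and hazard levels; it is not a configuration from the GEM projects described here, and parameter sets vary between engine versions.

```ini
[general]
description = Illustrative classical PSHA run
calculation_mode = classical

[geometry]
; a single site of interest (lon lat); placeholder coordinates
sites = 85.3 27.7

[logic_tree]
source_model_logic_tree_file = source_model_logic_tree.xml
gsim_logic_tree_file = gmpe_logic_tree.xml

[calculation]
investigation_time = 50.0
intensity_measure_types_and_levels = {"PGA": [0.005, 0.05, 0.1, 0.2, 0.4, 0.8]}
truncation_level = 3
maximum_distance = 200.0
```

Such a file would then be run from the command line with something like `oq engine --run job.ini`, producing hazard curves at the configured sites.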
The Australian Computational Earth Systems Simulator
NASA Astrophysics Data System (ADS)
Mora, P.; Muhlhaus, H.; Lister, G.; Dyskin, A.; Place, D.; Appelbe, B.; Nimmervoll, N.; Abramson, D.
2001-12-01
Numerical simulation of the physics and dynamics of the entire earth system offers an outstanding opportunity for advancing earth system science and technology but represents a major challenge due to the range of scales and physical processes involved, as well as the magnitude of the software engineering effort required. However, new simulation and computer technologies are bringing this objective within reach. Under a special competitive national funding scheme to establish new Major National Research Facilities (MNRF), the Australian government together with a consortium of Universities and research institutions have funded construction of the Australian Computational Earth Systems Simulator (ACcESS). The Simulator or computational virtual earth will provide the research infrastructure to the Australian earth systems science community required for simulations of dynamical earth processes at scales ranging from microscopic to global. It will consist of thematic supercomputer infrastructure and an earth systems simulation software system. The Simulator models and software will be constructed over a five year period by a multi-disciplinary team of computational scientists, mathematicians, earth scientists, civil engineers and software engineers. The construction team will integrate numerical simulation models (3D discrete elements/lattice solid model, particle-in-cell large deformation finite-element method, stress reconstruction models, multi-scale continuum models etc) with geophysical, geological and tectonic models, through advanced software engineering and visualization technologies. 
When fully constructed, the Simulator aims to provide the software and hardware infrastructure needed to model solid earth phenomena including global scale dynamics and mineralisation processes, crustal scale processes including plate tectonics, mountain building, interacting fault system dynamics, and micro-scale processes that control the geological, physical and dynamic behaviour of earth systems. ACcESS represents a part of Australia's contribution to the APEC Cooperation for Earthquake Simulation (ACES) international initiative. Together with other national earth systems science initiatives including the Japanese Earth Simulator and US General Earthquake Model projects, ACcESS aims to provide a driver for scientific advancement and technological breakthroughs including: quantum leaps in understanding of earth evolution at global, crustal, regional and microscopic scales; new knowledge of the physics of crustal fault systems required to underpin the grand challenge of earthquake prediction; new understanding and predictive capabilities of geological processes such as tectonics and mineralisation.
Ground failure in the 2001 Mw 8.4 southern Peru earthquake
NASA Astrophysics Data System (ADS)
Rondinel-Oviedo, Efrain Alejandro
On June 23, 2001, a moment magnitude (Mw) 8.4 earthquake shook the southern portion of Peru. This rare large-magnitude event provided a unique opportunity to develop a suite of high-quality case histories and to test and calibrate existing geotechnical earthquake engineering analysis procedures and models against observations from the earthquake. The work presented in this thesis is focused on three topics pertaining to ground failure (i.e., the permanent deformation of the ground resulting from an earthquake) observed during the event: (1) surface ground damage in small basin geometries, (2) seismic compression, and (3) the performance of a concrete faced rockfill dam (CFRD). Surface ground strain damage patterns in small basin geometries have previously been studied at the large (i.e., geological) scale, but not at the scale of civil engineering infrastructure. During seismic events, basin geometries containing soft material confined by stiffer material trap the seismic waves and generate surface waves that travel along the soft material. Numerical modeling shows that surface waves are generated at basin edges and travel along the ground, creating longer duration, higher response (peak ground acceleration, PGA), higher energy (Arias intensity), and higher angular distortion, especially in zones close to the edges. The impedance contrast between the stiff and soft materials and the dip angle play an important role in basin response. Seismic compression (i.e., the shaking-induced densification of unsaturated soil) was observed in many highway embankments in the region of the earthquake. In many instances, this phenomenon was exacerbated by soil-structure interaction with adjacent bridge or culvert structures. 
Numerical modeling conducted as part of this research showed (i) a significantly different response when the structure (culvert) is considered, (ii) that impedance contrast plays a role in the system response, and (iii) that low horizontal stresses are observed when the peak of the shear strain occurs. It is believed that the effect of low confining stresses was responsible for the large settlements observed, which were not directly captured by classical seismic compression models. The third topic of study evaluates the performance of a concrete faced rockfill dam (CFRD) in the earthquake. The analysis considered the effect of time, the PGA on rock, and the change in amplification ratio with PGA. It appears that the natural frequency of the dam increases with time in the transverse direction and slightly decreases in the longitudinal direction. The increase in natural frequency might be associated with a change in dam stiffness (i.e., densification) with time; however, the reason for the slight decrease in the longitudinal direction is not clear and requires further research.
Study of Spectral Attenuation Laws of Seismic Waves for Michoacán state, México
NASA Astrophysics Data System (ADS)
Vazquez Rosas, R.; Aguirre, J.; Mijares Arellano, H.
2009-12-01
Several attenuation relationships have been developed for Mexico, mostly after the earthquake of September 19, 1985, an event that gave great impetus to the development of engineering seismology in Mexico. Since 1985, the number of seismic stations in the country has increased significantly, especially between the coast of Guerrero and Mexico City, largely because of the infamous large amplifications observed in the lake zone of Mexico City relative to hard-ground sites. Some studies have analyzed how seismic waves are attenuated or amplified from the Pacific coast inland. The attenuation relationship used for seismic hazard assessment in Mexico is that of Ordaz (1989), which uses data from the Guerrero acceleration network. Another, more recent study is that of García et al. (2005), which uses more recent data from intraplate earthquakes recorded by the Guerrero acceleration network. It is important to note that, since these relations were derived for only part of the Mexican subduction zone and for certain types of seismic sources, caution should be exercised when using them for earthquake risk studies in other regions of Mexico. In the present work, we study the state of Michoacán, one of the most important seismogenic zones in Mexico. Three kinds of sources exist in the state, producing tectonic earthquakes, volcanic earthquakes, and events due to local faults. For this reason, it is of vital importance to study the propagation of seismic waves within Michoacán state, and in this paper in particular we study their attenuation. We installed a temporary network of 7 accelerograph stations across the state, at the following locations: Faro de Brucerías, Aguililla, Apatzingán, Taretán, Pátzcuaro, Morelia, and Maravatío. The stations form a line perpendicular to the coastline with a total length of 366 km; the distance between neighboring stations varies from 60 to 80 km. 
Among all the seismic events recorded by this temporary network, we selected 8 events that originated along the coastline of Michoacán, with moment magnitudes ranging from Mw 4.3 to 5.1. Using these records, we calculated Q values for frequencies between 0.1 and 10 Hz, the frequency range of interest for earthquake engineering. According to our preliminary results, the estimated attenuation is significantly larger than what the attenuation laws predict for the states of Guerrero and Colima. One limitation of this study is that we used relatively small-magnitude earthquakes, a consequence of the relatively short operation period of the temporary network, which had to be limited to 3 months.
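The single-frequency Q inversion behind such estimates can be sketched as follows. The amplitude model (1/r geometrical spreading plus anelastic decay), the shear-wave velocity, and all the numbers below are illustrative assumptions, not the authors' actual data or processing.

```python
import numpy as np

def estimate_q(freq, r1, r2, a1, a2, v=3.5):
    """Quality factor Q at one frequency from spectral amplitudes a1, a2
    of the same event at hypocentral distances r1, r2 (km), assuming
    1/r geometrical spreading and shear velocity v (km/s):
        A(f, r) = S(f) / r * exp(-pi * f * r / (Q * v))
    """
    # Remove geometrical spreading, then invert the anelastic decay term.
    decay = np.log((a2 * r2) / (a1 * r1))   # equals -pi*f*(r2-r1)/(Q*v)
    return -np.pi * freq * (r2 - r1) / (v * decay)

# Synthetic check: build amplitudes with a known Q and recover it.
f, q_true, v = 1.0, 120.0, 3.5
r1, r2 = 80.0, 160.0
a1 = np.exp(-np.pi * f * r1 / (q_true * v)) / r1
a2 = np.exp(-np.pi * f * r2 / (q_true * v)) / r2
q_est = estimate_q(f, r1, r2, a1, a2, v)
```

In practice the amplitudes would come from smoothed Fourier spectra of the recorded S-wave windows at each station along the line.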
Engineering geological aspect of Gorkha Earthquake 2015, Nepal
NASA Astrophysics Data System (ADS)
Adhikari, Basanta Raj; Andermann, Christoff; Cook, Kristen
2016-04-01
Strong shaking by earthquakes causes massive landsliding with severe effects on infrastructure and human lives. The distribution of landslides and other hazards depends on the combination of earthquake and local site characteristics that influence the dynamic response of hillslopes. The Himalaya is one of the most active mountain belts, with several kilometers of relief, and is very prone to catastrophic mass failure. Strong, shallow earthquakes are common and cause widespread collapse of hillslopes, increasing the background landslide rate by several orders of magnitude. The Himalaya has experienced many small and large earthquakes in the past, e.g., the 1934 Bihar-Nepal earthquake (Ms 8.2), the great 1905 Kangra earthquake (Ms 7.8), and the 2015 Gorkha earthquake (Mw 7.8). The Gorkha earthquake occurred on and around the Main Himalayan Thrust at a hypocentral depth of 15 km (GEER 2015) and was followed by a Mw 7.3 aftershock near Kodari, causing 8700+ deaths and leaving hundreds of thousands homeless. Most of the 3000 aftershocks located by the National Seismological Center (NSC) within the first 45 days following the Gorkha earthquake are concentrated in a narrow, 40 km-wide band at midcrustal to shallow depth along the strike of the southern slope of the high Himalaya (Adhikari et al. 2015), and the ground shaking was substantially lower in the short-period range than would be expected for an earthquake of this magnitude (Moss et al. 2015). The effects of this earthquake are distinctive in the affected areas, showing topographic effects, liquefaction and land subsidence. More than 5000 landslides were triggered by this earthquake (Earthquake without Frontiers, 2015). Most of the landslides are shallow, occurred in weathered bedrock, and appear to have mobilized primarily as raveling failures, rock slides and rock falls. The majority of landslides are limited to a zone that runs east-west, approximately parallel to the Lesser and Higher Himalaya.
There are numerous ground cracks, especially in the epicentral area. Similarly, liquefaction occurred in different parts of the Kathmandu valley. However, the recordings at KATNP and DMG indicate that the ground motions from the earthquake were not strong enough to fully weaken liquefiable materials, and in most cases incipient or "marginal" liquefaction was observed. Here, we present a compilation of the different types of mass wasting that occurred in this region and discuss their locations and hazard potential for local communities. References: Adhikari, L.B., Gautam, U.P., Koirala, B.P., Bhattarai, M., Kandel, T., Gupta, R.M., Timsina, C., Maharjan, N., Maharjan, K., Dhahal, T., Hoste-Colomer, R., Cano, Y., Dandine, M., Guhem, A., Merrer, S., Roudil, P., Bollinger, L., 2015, The aftershock sequence of the 2015 April 25 Gorkha-Nepal earthquake, Geophysical Journal International, v. 203(3), pp. 2119-2124. Earthquake Without Frontiers, 2015, http://ewf.nerc.ac.uk/2015/05/12/nepal-update-on-landslide-hazard-following-12-may-2015-earthquake/ GEER, 2015, Geotechnical Extreme Event Reconnaissance, http://www.geerassociation.org Moss, R.E.S., Thompson, E.M., Kieffer, D.S., Tiwari, B., Hashash, Y.M.A., Acharya, I., Adhikari, B.R., Asimaki, D., Clahan, K.B., Collins, B.D., Dahal, S., Jibson, R.W., Khadka, D., Macdonald, A., Madugo, C.L., Mason, H.B., Pehlivan, M., Rayamajhi, D., and Upreti, S., 2015, Geotechnical Effects of the 2015 Magnitude 7.8 Gorkha, Nepal, Earthquake and Aftershocks, Seismological Research Letters, v. 86(6), pp. 1514-1523. National Seismological Center, 2015, http://www.seismonepal.gov.np/
Deep Borehole Instrumentation Along San Francisco Bay Bridges - 2001
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hutchings, L.; Kasameyer, P.; Long, L.
2001-05-01
This is a progress report on the Bay Bridges downhole network. Between 2 and 8 instruments have been spaced along the Dumbarton, San Mateo, Bay, and San Rafael bridges in San Francisco Bay, California. The instruments will provide multiple-use data that are important to geotechnical, structural engineering, and seismological studies. The holes are between 100 and 1000 ft deep and were drilled by Caltrans. There are twenty-one sensor packages at fifteen sites. The downhole instrument package contains a three-component HS-1 seismometer and three orthogonal Wilcox 731 accelerometers, and is capable of recording from a micro-g for local M = 1.0 earthquakes to 0.5 g strong ground motion from large Bay Area earthquakes. This report lists earthquakes and stations where recordings were obtained during the period February 29, 2000 to November 11, 2000. Preliminary results on noise analysis for uphole and downhole recordings at Yerba Buena Island are also presented.
Celebi, M.; Bazzurro, P.; Chiaraluce, L.; Clemente, P.; Decanini, L.; Desortis, A.; Ellsworth, W.; Gorini, A.; Kalkan, E.; Marcucci, S.; Milana, G.; Mollaioli, F.; Olivieri, M.; Paolucci, R.; Rinaldis, D.; Rovelli, A.; Sabetta, F.; Stephens, C.
2010-01-01
The normal-faulting earthquake of 6 April 2009 in the Abruzzo region of central Italy caused heavy loss of life and substantial damage to centuries-old buildings of significant cultural importance and to modern reinforced-concrete-framed buildings with hollow masonry infill walls. Although structural deficiencies were significant and widespread, study of the characteristics of strong-motion data from the heavily affected area indicated that the short duration of strong shaking may have spared many more damaged buildings from collapsing. It is recognized that, with this caveat of short-duration shaking, the infill walls may have played a very important role in preventing further deterioration or collapse of many buildings. It is concluded that better new or retrofit construction practices that include reinforced-concrete shear walls may prove helpful in reducing risks in such seismic areas of Italy, other Mediterranean countries, and even in the United States, where there are large inventories of deficient structures. © 2010, Earthquake Engineering Research Institute.
Community Seismic Network (CSN)
NASA Astrophysics Data System (ADS)
Clayton, R. W.; Heaton, T. H.; Kohler, M. D.; Chandy, M.; Krause, A.
2010-12-01
In collaboration with computer scientists and earthquake engineers, we are developing a dense network of low-cost accelerometers that send their data via the Internet to a cloud-based center. The goal is to make block-by-block measurements of ground shaking in urban areas, which will provide emergency response information in the case of large earthquakes and an unprecedented high-frequency seismic array for studying structure and the earthquake process with moderate shaking. When deployed in high-rise buildings, the sensors can be used to monitor the state of health of the structure. The sensors are capable of a resolution of approximately 80 micro-g, connect via USB ports to desktop computers, and cost about $100 each. The network will adapt to its environment by using network-wide machine learning to adjust the picking sensitivity. We are also looking into using other motion-sensing devices such as cell phones. For a pilot project, we plan to deploy more than 1000 sensors in the greater Pasadena area. The system is easily adaptable to other seismically vulnerable urban areas.
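The abstract does not state which picking algorithm the network uses; a classic STA/LTA ratio is a common baseline for triggering on such accelerometer data, and the machine-learned adjustment described above would amount to tuning its windows and threshold per station. A minimal sketch on synthetic data (all numbers illustrative):

```python
import numpy as np

def sta_lta(x, n_sta, n_lta):
    """Short-term over long-term average of squared amplitude; values
    well above 1 flag a possible seismic arrival. (Centered windows via
    convolve approximate the classic causal formulation.)"""
    power = x ** 2
    sta = np.convolve(power, np.ones(n_sta) / n_sta, mode="same")
    lta = np.convolve(power, np.ones(n_lta) / n_lta, mode="same")
    return sta / np.maximum(lta, 1e-20)

rng = np.random.default_rng(0)
trace = rng.normal(scale=1.0, size=2000)
trace[1000:1050] += 10.0               # burst standing in for an arrival
ratio = sta_lta(trace, n_sta=20, n_lta=500)
triggered = bool(ratio.max() > 5.0)
```

Adapting the picking sensitivity network-wide would then mean learning the window lengths and trigger threshold from each station's noise environment rather than fixing them as here.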
Geotechnical reconnaissance of the 2002 Denali fault, Alaska, earthquake
Kayen, R.; Thompson, E.; Minasian, D.; Moss, R.E.S.; Collins, B.D.; Sitar, N.; Dreger, D.; Carver, G.
2004-01-01
The 2002 M7.9 Denali fault earthquake resulted in 340 km of ruptures along three separate faults, causing widespread liquefaction in the fluvial deposits of the alpine valleys of the Alaska Range and eastern lowlands of the Tanana River. Areas affected by liquefaction are largely confined to Holocene alluvial deposits, man-made embankments, and backfills. Liquefaction damage, sparse surrounding the fault rupture in the western region, was abundant and severe on the eastern rivers: the Robertson, Slana, Tok, Chisana, Nabesna and Tanana Rivers. Synthetic seismograms from a kinematic source model suggest that the eastern region of the rupture zone had elevated strong-motion levels due to rupture directivity, supporting observations of elevated geotechnical damage. We use augered soil samples and shear-wave velocity profiles made with a portable apparatus for the spectral analysis of surface waves (SASW) to characterize soil properties and stiffness at liquefaction sites and three trans-Alaska pipeline pump station accelerometer locations. © 2004, Earthquake Engineering Research Institute.
Including foreshocks and aftershocks in time-independent probabilistic seismic hazard analyses
Boyd, Oliver S.
2012-01-01
Time-independent probabilistic seismic-hazard analysis treats each source as temporally and spatially independent; hence foreshocks and aftershocks, which are both spatially and temporally dependent on the mainshock, are removed from earthquake catalogs. Yet, intuitively, these earthquakes should be considered part of the seismic hazard, capable of producing damaging ground motions. In this study, I consider the mainshock and its dependents as a time-independent cluster, each cluster being temporally and spatially independent of any other. The cluster has the recurrence time of the mainshock; and, by considering the earthquakes in the cluster as a union of events, dependent events have an opportunity to contribute to seismic ground motions and hazard. Based on the methods of the U.S. Geological Survey for a high-hazard site, the inclusion of dependent events causes ground motions that are exceeded at probability levels of engineering interest to increase by about 10%, but the increase could be as high as 20% if variations in aftershock productivity can be accounted for reliably.
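Treating the cluster as a union of events reduces, under an independence assumption for the exceedances, to a simple complement-of-products calculation; the probabilities below are illustrative, not values from the study.

```python
def cluster_exceedance_prob(p_events):
    """Probability that at least one earthquake in a mainshock-plus-
    dependents cluster exceeds a given ground-motion level, assuming
    the member exceedances are independent once the cluster occurs."""
    prob_none = 1.0
    for p in p_events:
        prob_none *= 1.0 - p
    return 1.0 - prob_none

# Mainshock alone vs. mainshock plus two aftershocks (illustrative).
p_main = 0.10
p_cluster = cluster_exceedance_prob([0.10, 0.02, 0.01])
increase = p_cluster / p_main - 1.0    # fractional increase in exceedance
```

Multiplying this cluster exceedance probability by the mainshock recurrence rate then gives the annual exceedance rate with dependent events included.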
Strong motion observations and recordings from the great Wenchuan Earthquake
Li, X.; Zhou, Z.; Yu, H.; Wen, R.; Lu, D.; Huang, M.; Zhou, Y.; Cu, J.
2008-01-01
The National Strong Motion Observation Network System (NSMONS) of China is briefly introduced in this paper. The NSMONS consists of permanent free-field stations, special observation arrays, mobile observatories and a network management system. During the Wenchuan earthquake, over 1,400 components of acceleration records were obtained from the main shock at 460 permanent free-field stations and three arrays for topographical effect and structural response observation, and over 20,000 components of acceleration records from strong aftershocks that occurred before August 1, 2008 were also obtained by permanent free-field stations of the NSMONS and 59 mobile instruments quickly deployed after the main shock. The strong-motion recordings from the main shock and strong aftershocks are summarized in this paper. Among the ground motion recordings, there are over 560 components with peak ground acceleration (PGA) over 10 Gal, the largest being 957.7 Gal. The largest PGA recorded during an aftershock exceeds 300 Gal. © 2008, Institute of Engineering Mechanics, China Earthquake Administration and Springer-Verlag GmbH.
NASA Astrophysics Data System (ADS)
Shiuly, Amit; Kumar, Vinay; Narayan, Jay
2014-06-01
This paper presents the ground motion amplification scenario, along with the fundamental frequency (F0) of the sedimentary deposit, for the seismic microzonation of Kolkata City, situated on the world's largest delta with very soft soil deposits. A fourth-order accurate SH-wave viscoelastic finite-difference algorithm is used to compute the response of a 1D model at each borehole location. Maps of F0, amplification at F0, and average spectral amplification (ASA) in different frequency bands of earthquake engineering interest are developed for a variety of end-user communities. The obtained ASA of the order of 3-6 at most of the borehole locations in the frequency range 0.25-10.0 Hz reveals that Kolkata City may suffer severe damage even during a moderate earthquake. Further, unexpectedly severe damage, up to the collapse of multi-storey buildings, may occur in localities near the Hooghly River and the Salt Lake area due to double resonance effects during distant large earthquakes.
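The fundamental frequency F0 of such a soil column is often approximated by the quarter-wavelength rule F0 = Vs/(4H); the layered profile below is a hypothetical example, not one of the Kolkata boreholes, and the full viscoelastic finite-difference response would refine this estimate.

```python
def fundamental_frequency(thickness_m, vs_m_per_s):
    """F0 of a layered soil column over bedrock via the quarter-
    wavelength rule F0 = Vs_avg / (4 * H), where Vs_avg is the
    travel-time-averaged shear-wave velocity of the column."""
    total_h = sum(thickness_m)
    travel_time = sum(h / vs for h, vs in zip(thickness_m, vs_m_per_s))
    vs_avg = total_h / travel_time
    return vs_avg / (4.0 * total_h)

# A hypothetical soft-soil borehole profile (not from the paper's data).
f0 = fundamental_frequency([10.0, 20.0, 30.0], [150.0, 250.0, 400.0])
```

Double resonance, as invoked for the Hooghly River localities, occurs when F0 of the deposit coincides with the fundamental frequency of the buildings standing on it.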
The shakeout scenario: Meeting the needs for construction aggregates, asphalt, and concrete
Langer, W.H.
2011-01-01
An Mw 7.8 earthquake as described in the ShakeOut Scenario would cause significant damage to buildings and infrastructure. Over 6 million tons of newly mined aggregate would be used for emergency repairs and for reconstruction in the five years following the event. This aggregate would be applied mostly in the form of concrete for buildings and bridges, asphalt or concrete for pavement, and unbound gravel for applications such as base course under highway pavement and backfilling for foundations and pipelines. There are over 450 aggregate, concrete, and asphalt plants in the affected area, some of which would be heavily damaged. Meeting the increased demand for construction materials would require readily available permitted reserves, functioning production facilities, a supply of cement and asphalt, a source of water, gas, and electricity, and a trained workforce. Prudent advance preparations would facilitate a timely emergency response and reconstruction following such an earthquake. © 2011, Earthquake Engineering Research Institute.
NASA Astrophysics Data System (ADS)
Kamiyama, M.; Orourke, M. J.; Flores-Berrones, R.
1992-09-01
A new type of semi-empirical expression is derived for scaling strong-motion peaks in terms of seismic source, propagation path, and local site conditions. Peak acceleration, peak velocity, and peak displacement are analyzed in a similar fashion because they are interrelated. However, emphasis is placed on the peak velocity, which is a key ground motion parameter for lifeline earthquake engineering studies. With the help of seismic source theories, the semi-empirical model is derived using strong-motion records obtained in Japan. In the derivation, statistical considerations guide the selection of the model itself and the model parameters. Earthquake magnitude M and hypocentral distance r are selected as independent variables, and dummy variables are introduced to identify the amplification factor due to individual local site conditions. The resulting semi-empirical expressions for the peak acceleration, velocity, and displacement are then compared with strong-motion data observed during three earthquakes in the U.S. and Mexico.
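A functional form of the kind described (magnitude and hypocentral distance as independent variables, plus site dummy variables) can be fitted by ordinary least squares; the coefficients and synthetic data below are illustrative assumptions, not the paper's regression model or results.

```python
import numpy as np

# Model: ln(PV) = c0 + c1*M - c2*ln(r) + c3*site, fit by least squares.
rng = np.random.default_rng(1)
n = 200
mag = rng.uniform(5.0, 8.0, n)              # earthquake magnitude M
dist = rng.uniform(20.0, 300.0, n)          # hypocentral distance r (km)
site = rng.integers(0, 2, n).astype(float)  # 0 = rock, 1 = soft-site dummy
true = np.array([1.0, 1.2, 1.1, 0.8])       # c0, c1, c2, site amplification
ln_pv = (true[0] + true[1] * mag - true[2] * np.log(dist)
         + true[3] * site + rng.normal(scale=0.1, size=n))

# Design matrix with the distance term pre-negated so all coefficients
# are recovered with the signs written in the model above.
X = np.column_stack([np.ones(n), mag, -np.log(dist), site])
coef, *_ = np.linalg.lstsq(X, ln_pv, rcond=None)
```

With one dummy column per station class, the fitted dummy coefficients play the role of the site amplification factors described in the abstract.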
NASA Astrophysics Data System (ADS)
Guo, Y.; Wang, H.; Deng, Z.; You, H.
2009-12-01
To investigate the ground failure features and tectonic stress field of the 2008 Wenchuan earthquake, we visited the earthquake-stricken areas twice in 2008, early and late in the year: Hongkou Town in Dujiangyan City, Yingxiu Town in Wenchuan County, Bailu Town in Pengzhou City, Yinghua Town in Shifang City, Hanwang Town in Mianzhu City, and Beichuan City, where a geological survey was made. First, the ground failure features of the Wenchuan earthquake around both the Yingxiu-Beichuan fracture and the Guanxian-Jiangyou fracture were analyzed. They mainly appear as ground cracks, steepened road slopes, ground deformation, road uplift and deformation, and road offset and rupture. In addition, the earthquake caused a great deal of building collapse and damaged or destroyed many bridges: in some buildings the first floor disappeared or was damaged seriously; some buildings remained standing although damaged; a few were damaged only slightly and kept their structure intact. Furthermore, the earthquake triggered landslides, mudflows and rockfalls, which destroyed buildings, blocked rivers, and damaged lifeline engineering. Second, the ground failure phenomena were analyzed in a preliminary way: the seismic intensity was determined from the field investigation, and the damage to construction was summarized. Third, based on the principles of structural geology and using stereographic projection, the stress field was analyzed from the attitudes, structural nature and relations among the fractures, fault striations and joint fissures, as well as the characteristics of the ground deformation. The geodynamics of the 2008 Wenchuan earthquake are explored in a preliminary way. The maximum principal (compressive) stress σ1 trends northeast by east, and the minimum principal (tensile) stress σ3 trends northwest by north.
The main fracture behaves as a right-lateral thrust fault. The general horizontal shortening and vertical uplift of the ground are discussed. Finally, the paper compares the relationship between ground damage and the fractures in the area hit by the 2008 Wenchuan earthquake, proposes methods to avoid and mitigate the loss of property and life caused by earthquakes, and identifies the chief aspects that require more attention in post-disaster reconstruction.
NASA Astrophysics Data System (ADS)
Delgado, José; García-Tortosa, Francisco J.; Garrido, Jesús; Giner, José; Lenti, Luca; López-Casado, Carlos; Martino, Salvatore; Peláez, José A.; Sanz de Galdeano, Carlos; Soler, Juan L.
2015-04-01
Landslides are a common ground effect induced by earthquakes of moderate to large magnitude. Most of them are first-time instabilities induced by the seismic event; the reactivation of pre-existing landslides is less frequent in practice. The landslide of Güevejar (Granada province, S Spain) is a case study of a landslide that was reactivated at least twice by far-field earthquakes: the Mw 8.7, 1755 Lisbon earthquake (estimated epicentral distance of 680 km) and the Mw 6.5, 1884 Andalucía event (estimated epicentral distance of 45 km), but not by near-field events of moderate magnitude (Mw < 6.0 and epicentral distances lower than 25 km). To study the seismic response of this landslide, a study was conducted to elaborate an engineering-geological model. For this purpose, the fieldwork included the elaboration of a detailed geological map (1:1000) of the landslide and surrounding areas, drilling of deep boreholes (80 m deep), down-hole measurement of both P- and S-wave velocities in the boreholes drilled, piezometric control of the water table, MASW and ReMi profiles for determining the underlying structure of the sites tested (soil profile stratigraphy and the corresponding S-wave velocity of each soil level), and undisturbed sampling of the materials affected by the landslide. These samples were then tested in the laboratory according to standard procedures for determination of both static properties (among them soil density, soil classification and shear strength) and dynamic properties (degradation curves for shear modulus and damping ratio with shear strain) of the landslide-involved materials. The model proposed corresponds to a complex landslide that combines a roto-translational mechanism with an earth-flow at its toe, characterized by a deep (> 50 m) sliding surface. The engineering-geological model constitutes the first step in ongoing research devoted to understanding how the landslide could be reactivated during far-field events.
The authors would like to thank the ERDF of the European Union for financial support via the project "Monitorización sísmica de deslizamientos. Criterios de reactivación y alerta temprana" of the "Programa Operativo FEDER de Andalucía 2007-2015". We also thank the Public Works Agency and the Ministry of Public Works and Housing of the Regional Government of Andalusia.
Introducing Students to Structural Dynamics and Earthquake Engineering
ERIC Educational Resources Information Center
Anthoine, Armelle; Marazzi, Francesco; Tirelli, Daniel
2010-01-01
The European Laboratory for Structural Assessment (ELSA) is one of the world's main laboratories for seismic studies. Besides its research activities, it also aims to bring applied science closer to the public. This article describes teaching activities based on a demonstration shaking table which is used to introduce the structural dynamics of…
Defense.gov - Special Report: Haiti Earthquake Relief
Top Stories: Medical Group Provides Assistance. MANDRIN, Haiti, July 14, 2010 - Airmen with the 56th Medical Group provided optometry, dental and general services at the New Horizons medical site. Service members assigned to Joint Task Force New Horizons have made major progress on their engineering and medical projects.
NASA Astrophysics Data System (ADS)
Liang, Fayun; Chen, Haibing; Huang, Maosong
2017-07-01
To provide appropriate uses of nonlinear ground response analysis in engineering practice, a three-dimensional soil column with a distributed mass system and a time-domain numerical analysis were implemented on the OpenSees simulation platform. The mesh of the three-dimensional soil column was chosen to resolve the specified maximum frequency. The layered soil column was divided into multiple sub-soils with different viscous damping matrices according to their shear velocities, since the soil properties differed significantly. A combination of other one-dimensional or three-dimensional nonlinear seismic ground analysis programs was needed to confirm the applicability of the nonlinear seismic ground motion response analysis procedures in soft soil or for strong earthquakes. The accuracy of the three-dimensional soil-column finite element method was verified by dynamic centrifuge model testing under different peak accelerations of the earthquake. As a result, nonlinear seismic ground motion response analysis procedures were improved in this study, and the accuracy and efficiency of the three-dimensional seismic ground response analysis can be adapted to the requirements of engineering practice.
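Assigning each sub-soil its own viscous damping matrix is commonly done with Rayleigh damping anchored at two control frequencies; the sketch below shows that calculation, with illustrative control frequencies and target ratio, and is not necessarily the exact scheme used in the paper.

```python
import math

def rayleigh_coefficients(xi, f1, f2):
    """Coefficients a0, a1 of Rayleigh damping C = a0*M + a1*K giving
    damping ratio xi exactly at control frequencies f1 and f2 (Hz)."""
    w1, w2 = 2 * math.pi * f1, 2 * math.pi * f2
    a0 = 2 * xi * w1 * w2 / (w1 + w2)
    a1 = 2 * xi / (w1 + w2)
    return a0, a1

def damping_ratio(a0, a1, f):
    """Damping ratio implied by (a0, a1) at frequency f (Hz)."""
    w = 2 * math.pi * f
    return a0 / (2 * w) + a1 * w / 2

# One pair per sub-layer; here anchored at 1 Hz and 10 Hz with 5% target.
a0, a1 = rayleigh_coefficients(0.05, 1.0, 10.0)
```

Between the two control frequencies the implied damping dips below the target, which is why the control frequencies are usually placed to bracket the band of engineering interest for each sub-layer.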
Effects of Irregular Bridge Columns and Feasibility of Seismic Regularity
NASA Astrophysics Data System (ADS)
Thomas, Abey E.
2018-05-01
Unequal column height is one of the main irregularities in bridge design, particularly while negotiating steep valleys, and it makes bridges vulnerable to seismic action. The desirable behaviour of bridge columns under seismic loading is that they perform in a regular fashion, i.e. the capacity of each column is utilized evenly. But this type of behaviour is often missing when the column heights are unequal along the length of the bridge, leaving the short columns to carry the maximum lateral load. In the present study, the effects of unequal column height on the global seismic performance of bridges are studied using pushover analysis. Codes such as CalTrans (Engineering service center, earthquake engineering branch, 2013) and EC-8 (EN 1998-2: design of structures for earthquake resistance. Part 2: bridges, European Committee for Standardization, Brussels, 2005) suggest seismic regularity criteria for achieving a regular seismic performance level at all the bridge columns. The feasibility of adopting these seismic regularity criteria, along with those proposed in the literature, is assessed in the present study for bridges designed as per the Indian Standards.
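The tendency of short columns to attract lateral load follows directly from elastic stiffness: for equal sections and a deck that is rigid in its plane, each column's share of the lateral load goes as 1/h^3. A minimal sketch with hypothetical pier heights:

```python
def lateral_load_shares(heights_m):
    """Elastic lateral-load share of each column under an in-plane
    rigid deck: a fixed-fixed column's stiffness is 12*E*I/h^3, so for
    equal sections the shares scale as 1/h^3."""
    k = [1.0 / h ** 3 for h in heights_m]
    total = sum(k)
    return [ki / total for ki in k]

# Hypothetical irregular bridge: one short pier in a steep valley.
shares = lateral_load_shares([6.0, 12.0, 18.0])
```

Here the 6 m pier attracts over 80% of the elastic lateral demand, which is the irregular behaviour that the seismic regularity criteria aim to limit.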
Mechanics of Granular Materials (MGM) Investigators
NASA Technical Reports Server (NTRS)
2000-01-01
Key personnel in the Mechanics of Granular Materials (MGM) experiment at the University of Colorado at Boulder include Tawnya Ferbiak (software engineer), Susan Batiste (research assistant), and Christina Winkler (graduate research assistant). Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions such as earthquakes or when powders are handled in industrial processes. MGM experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grain materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. (Credit: University of Colorado at Boulder).
The shear-lag effect of thin-walled box girder under vertical earthquake excitation
NASA Astrophysics Data System (ADS)
Zhai, Zhipeng; Li, Yaozhuang; Guo, Wei
2017-03-01
The variation method based on the energy variation principle has proved accurate and valid for analyzing the shear lag effect of box girders under static and dynamic load, and dynamic problems are increasingly important in engineering practice. Therefore, a method for calculating the shear lag effect in thin-walled box girders under vertical seismic excitation is proposed in this paper by applying Hamilton's principle. Timoshenko shear deformation is taken into account, and a new definition of the shear lag ratio for box girders is given. Conclusions are then drawn from a numerical example. The results show that even small-amplitude earthquake ground motion can generate high stress and pronounced shear lag, especially near resonance, and that the influence of rotary inertia cannot be ignored when analyzing the shear lag effect. As the span-to-width ratio increases, the shear lag effect diminishes. These conclusions will be useful for engineering practice and enrich the theoretical study of box girders.
Rotational Seismology: AGU Session, Working Group, and Website
Lee, William H.K.; Igel, Heiner; Todorovska, Maria I.; Evans, John R.
2007-01-01
Introduction: Although the effects of rotational motions due to earthquakes have long been observed (e.g., Mallet, 1862), Richter (1958, p. 213) nevertheless stated that: 'Perfectly general motion would also involve rotations about three perpendicular axes, and three more instruments for these. Theory indicates, and observation confirms, that such rotations are negligible.' However, Richter provided no references for this claim. Seismology is based primarily on the observation and modeling of three-component translational ground motions. Nevertheless, theoretical seismologists (e.g., Aki and Richards, 1980, 2002) have argued for decades that the rotational part of ground motions should also be recorded. It is well known that standard seismometers are quite sensitive to rotations and therefore subject to rotation-induced errors. The paucity of observations of rotational motions is mainly the result of a lack, until recently, of affordable rotational sensors of sufficient resolution. Nevertheless, in the past decade, a number of authors have reported direct observations of rotational motions and rotations inferred from rigid-body rotations in short-baseline accelerometer arrays, creating a burgeoning library of rotational data. For example, ring laser gyros in Germany and New Zealand have led to the first significant and consistent observations of rotational motions from distant earthquakes (Igel et al., 2005, 2007). A monograph on Earthquake Source Asymmetry, Structural Media and Rotation Effects was also published recently by Teisseyre et al. (2006). Measurement of rotational motions has implications for: (1) recovering the complete ground-displacement history from seismometer recordings; (2) further constraining earthquake rupture properties; (3) extracting information about subsurface properties; and (4) providing additional ground motion information to earthquake engineers for seismic design.
A special session on Rotational Motions in Seismology was convened by H. Igel, W.H.K. Lee, and M. Todorovska during the 2006 AGU Fall Meeting. The goal of this session was to discuss rotational sensors, observations, modeling, theoretical aspects, and potential applications of rotational ground motions. The session was accompanied by the inauguration of an International Working Group on Rotational Seismology (IWGoRS) which aims to promote investigations of all aspects of rotational motions in seismology and their implications for related fields such as earthquake engineering, geodesy, strong-motion seismology, and tectonics, as well as to share experience, data, software, and results in an open Web-based environment. The primary goal of this article is to make the Earth Science Community aware of the emergence of the field of rotational seismology.
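Rotations inferred from short-baseline accelerometer arrays, mentioned above, rest on finite-difference estimates of the horizontal velocity gradient; a minimal sketch for the vertical-axis rotation rate, verified against a rigid-body rotation field (station geometry and values are illustrative):

```python
def rotation_rate_z(vx_a, vx_b, vy_a, vy_b, dx, dy):
    """Vertical-axis rotation rate from a small-baseline array, by
    finite differences of horizontal velocity:
        omega_z ~= 0.5 * (dvy/dx - dvx/dy)
    vy is differenced between stations separated by dx along x,
    vx between stations separated by dy along y."""
    return 0.5 * ((vy_b - vy_a) / dx - (vx_b - vx_a) / dy)

# Check against a rigid-body rotation field v = omega x r, for which
# vx = -omega*y and vy = +omega*x.
omega_true = 1e-4                      # rad/s
dx = dy = 100.0                        # station spacing (m)
vy_a, vy_b = 0.0, omega_true * dx      # stations at (0, 0) and (dx, 0)
vx_a, vx_b = 0.0, -omega_true * dy     # stations at (0, 0) and (0, dy)
omega_est = rotation_rate_z(vx_a, vx_b, vy_a, vy_b, dx, dy)
```

For real array data the velocities would come from integrated accelerograms, and the finite-difference estimate is only valid while the baseline is short compared with the seismic wavelength.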
A risk-mitigation approach to the management of induced seismicity
NASA Astrophysics Data System (ADS)
Bommer, Julian J.; Crowley, Helen; Pinho, Rui
2015-04-01
Earthquakes may be induced by a wide range of anthropogenic activities such as mining, fluid injection and extraction, and hydraulic fracturing. In recent years, the increased occurrence of induced seismicity and the impact of some of these earthquakes on the built environment have heightened both public concern and regulatory scrutiny, motivating the need for a framework for the management of induced seismicity. Efforts to develop systems to enable control of seismicity have not yet resulted in solutions that can be applied with confidence in most cases. The more rational approach proposed herein is based on applying the same risk quantification and mitigation measures that are applied to the hazard from natural seismicity. This framework allows informed decision-making regarding the conduct of anthropogenic activities that may cause earthquakes. The consequent risk, if related to non-structural damage (when re-location is not an option), can be addressed by appropriate financial compensation. If the risk poses a threat to life and limb, then it may be reduced through the application of strengthening measures in the built environment—the cost of which can be balanced against the economic benefits of the activity in question—rather than attempting to ensure that some threshold on earthquake magnitude or ground-shaking amplitude is not exceeded. However, because of the specific characteristics of induced earthquakes—which may occur in regions with little or no natural seismicity—the procedures used in standard earthquake engineering need adaptation and modification for application to induced seismicity.
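The risk quantification advocated here combines a hazard curve with a fragility curve; a minimal discretized sketch follows, in which all rates and fragilities are illustrative numbers and the bin-wise combination is an approximation to the usual risk integral.

```python
def annual_failure_rate(hazard_rates, fragility):
    """Combine a discretized hazard curve (annual rates of exceeding
    successive ground-motion levels) with a fragility curve (failure
    probability given shaking at that level). The rate of shaking
    falling in bin i is the difference of adjacent exceedance rates."""
    rate = 0.0
    for i in range(len(hazard_rates) - 1):
        occurrence = hazard_rates[i] - hazard_rates[i + 1]
        rate += occurrence * fragility[i]
    rate += hazard_rates[-1] * fragility[-1]   # shaking above the top bin
    return rate

# Illustrative four-bin curves, before and after strengthening measures.
hazard = [1e-2, 3e-3, 8e-4, 1e-4]
frag_before = [0.01, 0.05, 0.20, 0.60]
frag_after = [0.002, 0.01, 0.05, 0.20]
rate_before = annual_failure_rate(hazard, frag_before)
rate_after = annual_failure_rate(hazard, frag_after)
```

Strengthening the built environment lowers the fragility curve rather than the hazard curve, which is exactly the trade-off the proposed framework weighs against the economic benefits of the inducing activity.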
Response of two identical seven-story structures to the San Fernando earthquake of February 9, 1971
DOE Office of Scientific and Technical Information (OSTI.GOV)
Freeman, S.A.; Honda, K.K.
1973-10-01
The results of the structural dynamic investigation of two seven-story reinforced concrete frame structures are presented here. The structures are both Holiday Inn motor hotels that are essentially identical: one is located about 13 miles and the other about 26 miles from the epicenter of the February 9, 1971, San Fernando earthquake. Appreciable nonstructural damage as well as some structural damage was observed. Strong-motion seismic records were obtained for the roof, an intermediate story, and the ground floor of each structure. The analyses are based on data from the structural drawings, architectural drawings, photographs, engineering reports, and seismogram records obtained before, during, and after the San Fernando earthquake. Both structures experienced motion well beyond the limits of the building code design criteria. A change in fundamental period was observed for each structure after several seconds of response to the earthquake, which indicated nonlinear response. The analyses indicated that the elastic capacity of some structural members was exceeded. Idealized linear models were constructed to approximate response at various time segments. A method for approximating the nonlinear response of each structure is presented. The effects of nonstructural elements, yielding beams, and column capacities are illustrated. Comparisons of the two buildings are made for ductility factors, dynamic response characteristics, and damage. Conclusions are drawn concerning the effects of the earthquake on the structures and their future capacities.
The Great California ShakeOut: Science-Based Preparedness Advocacy
NASA Astrophysics Data System (ADS)
Benthien, M. L.
2009-12-01
The Great Southern California ShakeOut in November 2008 was the largest earthquake drill in U.S. history, involving over 5 million southern Californians through a broad-based outreach program, media partnerships, and public advocacy by hundreds of partners. The basis of the drill was a comprehensive scenario for a magnitude 7.8 earthquake on the southern San Andreas fault, which would cause broad devastation. In early 2009 the decision was made to hold the drill statewide on the third Thursday of October each year (October 15 in 2009). Results of the 2008 and 2009 drills will be shared in this session. In addition, prospects for earthquake early warning systems will be described; such systems will one day provide the few seconds of notice before strong shaking arrives in which critical systems can be shut down and people can do what they've been practicing in the ShakeOut drills: drop, cover, and hold on. A key aspect of the ShakeOut is the integration of a comprehensive earthquake scenario (incorporating earth science, engineering, policy, economics, public health, and other disciplines) and the lessons learned from decades of social science research about why people get prepared. The result is a “teachable moment” on par with having an actual earthquake (often followed by increased interest in getting ready for earthquakes). ShakeOut creates the sense of urgency that is needed for people, organizations, and communities to get prepared, to practice what to do to be safe, and to learn what plans need to be improved.
Seismic response analysis of a 13-story steel moment-framed building in Alhambra, California
Rodgers, Janise E.; Sanli, Ahmet K.; Çelebi, Mehmet
2004-01-01
The seismic performance of steel moment-framed buildings has been of particular interest since brittle fractures were discovered at the beam-column connections of some frames following the M6.7 1994 Northridge earthquake. This report presents an investigation of the seismic behavior of an instrumented 13-story steel moment frame building located in the greater Los Angeles area of California. An extensive strong motion dataset, ambient vibration data, engineering drawings and earthquake damage reports are available for this building. The data are described and subsequently analyzed. The results of the analyses show that the building response is more complex than would be expected from its highly symmetrical geometry. The building's response is characterized by low damping in the fundamental mode, larger peak accelerations in the intermediate stories than at the roof, extended periods of vibration after the cessation of strong input shaking, beating in the response, and significant torsion during strong shaking at the top of the concrete piers which extend from the basement to the second floor. The analyses of the data and all damage detection methods employed except one method based on system identification indicate that the response of the structure was elastic in all recorded earthquakes. These findings are in general agreement with the results of intrusive inspections (meaning fireproofing and architectural finishes were removed) conducted on approximately 5 percent of the moment connections following the Northridge earthquake, which found no earthquake damage.
Feasibility study of the seismic reflection method in Amargosa Desert, Nye County, Nevada
Brocher, T.M.; Hart, P.E.; Carle, S.F.
1990-01-01
Cohen, Rebecca; Weinisch, Kevin
2015-01-01
United States regulations require nuclear power plants (NPPs) to estimate the time needed to evacuate the emergency planning zone (EPZ, a circle with an approximate 10-mile radius centered at the NPP). These evacuation time estimate (ETE) studies are to be used by emergency personnel in the event of a radiological emergency. ETE studies are typically done using traffic simulation and evacuation models, based on traffic engineering algorithms that reflect congestion and delay, and typically assume that all evacuation routes are traversable. As witnessed in the Great East Japan Earthquake in March 2011, an earthquake and the ensuing tsunami can cause an incident at an NPP that requires an evacuation of the public. The earthquake and tsunami can also damage many of the available bridges and roadways and, therefore, impede evacuation and put people at risk of radiation exposure. This article presents a procedure, using traffic simulation and evacuation models, to estimate the impact on ETE of bridge and roadway damage caused by a major earthquake or similar hazardous event. The results of this analysis are used by emergency personnel to make protective action decisions that will minimize the public's exposure to radiation. Additionally, the results allow emergency planners to ensure that proper equipment and personnel are available for these types of events. Emergency plans are revised to ensure prompt response and recovery action during critical times.
NASA Astrophysics Data System (ADS)
Sharifi Mood, M.; Olsen, M. J.; Gillins, D. T.; Javadnejad, F.
2016-12-01
The Cascadia Subduction Zone (CSZ) is capable of generating earthquakes as powerful as moment magnitude 9, causing a great amount of damage to structures and facilities in Oregon. A series of deterministic earthquake analyses was performed for M9.0, M8.7, M8.4, and M8.1 scenarios, representing persistent, long-lasting shaking accompanied by other geological threats such as landslides, liquefaction-induced ground deformation, fault-rupture vertical displacement, and tsunamis. These ground deformations endanger urban structures, foundations, bridges, roadways, pipelines, and other lifelines. Lifeline providers in Oregon, including the private and public entities responsible for transportation, electric and gas utilities, water and wastewater, fuel, airports, and harbors, face an aging infrastructure that was built prior to a full understanding of this extreme seismic risk. As recently experienced in Chile and Japan, the three-to-five-minute-long earthquake expected in Oregon necessitates a wholly different method of risk mitigation for these major lifelines than those created for the shorter shaking of crustal earthquakes. A web-based geographic information system (GIS) tool was developed to assess the potential hazard from the multiple threats posed by Cascadia subduction zone earthquakes in the region. The purpose of this website is to provide easy access over the web to the latest and best available hazard information, including work completed in the recent Oregon Resilience Plan (ORP) (OSSPAC, 2013) and other work completed by the Department of Geology and Mineral Industries (DOGAMI) and the United States Geological Survey (USGS). The tool is designed for engineers, planners, geologists, and others who need this information to make appropriate decisions, and it requires only minimal knowledge of GIS.
Reflections from the interface between seismological research and earthquake risk reduction
NASA Astrophysics Data System (ADS)
Sargeant, S.
2012-04-01
Scientific understanding of earthquakes and their attendant hazards is vital for the development of effective earthquake risk reduction strategies. Within the global disaster reduction policy framework (the Hyogo Framework for Action, overseen by the UN International Strategy for Disaster Reduction), the anticipated role of science and scientists is clear, with respect to risk assessment, loss estimation, space-based observation, early warning and forecasting. The importance of information sharing and cooperation, cross-disciplinary networks and developing technical and institutional capacity for effective disaster management is also highlighted. In practice, the degree to which seismological information is successfully delivered to and applied by individuals, groups or organisations working to manage or reduce the risk from earthquakes is variable. The challenge for scientists is to provide fit-for-purpose information that can be integrated simply into decision-making and risk reduction activities at all levels of governance and at different geographic scales, often by a non-technical audience (i.e. people without any seismological/earthquake engineering training). The interface between seismological research and earthquake risk reduction (defined here in terms of both the relationship between the science and its application, and the scientist and other risk stakeholders) is complex. This complexity is a function of a range of issues relating to communication, multidisciplinary working, politics, organisational practices, inter-organisational collaboration, working practices, sectoral cultures, individual and organisational values, worldviews and expectations. These factors can present significant obstacles to scientific information being incorporated into the decision-making process.
The purpose of this paper is to present some personal reflections on the nature of the interface between the worlds of seismological research and risk reduction, and the implications for scientists and information delivery.
NASA Astrophysics Data System (ADS)
Bora, Kritika; Pande, Ravindra K.
2017-07-01
"Earthquakes do not kill people; it is buildings that kill people." An earthquake is a sudden event below the surface of the earth that generates vertical and horizontal waves causing destruction. The main aim of this research is to bring to light the unplanned and non-engineered construction practices growing in urban areas. Lack of space and continuous migration from the hills has resulted in multistorey construction. The present study is based on primary data collected through Rapid Visual Screening for the assessment of building vulnerability. Haldwani-Kathgodam, a new Municipal Corporation located in the foothills of the Himalayas, faces the same problem; seismic zonation places the area in zone 4 of damage risk, so an assessment of the risk to the built environment is important. This paper presents a systematic and useful way of assessing the physical vulnerability of buildings and shows how growing pressure on urban areas tends to make the built environment vulnerable to seismic activity. The challenge today is to make our living environment safe. The ever-growing population pressure on urban areas, driven by migration trends in developing countries, is leading to high-rise buildings and reckless, unplanned construction. To save money, people often do not obtain approval from a structural engineer. This unplanned and haphazard construction proves non-resistant to earthquakes and brings lives and property to ruin. The total number of households in the study area is 543, with a total population of 2,497 (2011). The recent formation of the Himalayas makes the area more sensitive to seismic events, and its closeness to the Main Boundary Thrust places it in zone 4 of the Seismic Zonation of India, i.e., the High Damage Risk Zone.
Special Issue "Natural Hazards' Impact on Urban Areas and Infrastructure" in Natural Hazards
NASA Astrophysics Data System (ADS)
Bostenaru Dan, M.
2009-04-01
In 2006 and 2007, at the 3rd and 4th General Assemblies of the European Geosciences Union respectively, the session on "Natural Hazards' Impact on Urban Areas and Infrastructure" was convened by Maria Bostenaru Dan, then at the Istituto Universitario di Studi Superiori di Pavia, ROSE School, Italy, who conducts research on earthquake management, and Heidi Kreibich of GFZ Potsdam, Germany, who conducts research on flood hazards; in 2007 it was co-convened by Agostino Goretti of the Civil Protection in Rome, Italy. The session initially grew from an idea of Friedemann Wenzel of the Universität Karlsruhe (TH), Germany, the former speaker of the SFB 461 "Strong earthquakes". Maria Bostenaru also graduated from and worked at this university, which together with GFZ Potsdam runs CEDIM, the Center for Disaster Management and Risk Reduction Technology. Selected papers from these two sessions, as well as invited papers from other specialists, were gathered for a special issue published in the journal "Natural Hazards" under the guest editorship of Heidi Kreibich and Maria Bostenaru Dan. Unlike the former special issue, this one contains a well-balanced mixture of many hazards: climate change, floods, mountain hazards like avalanches, volcanoes, and earthquakes. The aim of the issue was to enlarge the prospects for co-operation between the geosciences and other professions in the field of natural hazards. Earthquake engineering and engineering seismology co-operate ever more frequently, but in the field of natural hazards there is also a need to co-operate with urban planners and, looking to the future, in the field of integrated conservation, which implies co-operation between architecture and urban planning for the preservation of our environment. Integrated conservation has been stipulated since the 1970s, the years when participatory approaches, and thus the involvement of the social sciences, began.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rohay, Alan C.; Sweeney, Mark D.; Hartshorn, Donald C.
The Hanford Seismic Assessment Program (HSAP) provides an uninterrupted collection of high-quality raw and processed seismic data from the Hanford Seismic Network for the U.S. Department of Energy and its contractors. The HSAP is responsible for locating and identifying sources of seismic activity and monitoring changes in the historical pattern of seismic activity at the Hanford Site. The data are compiled, archived, and published for use by the Hanford Site for waste management, natural phenomena hazards assessments, and engineering design and construction. In addition, the HSAP works with the Hanford Site Emergency Services Organization to provide assistance in the event of a significant earthquake on the Hanford Site. The Hanford Seismic Network and the Eastern Washington Regional Network consist of 44 individual sensor sites and 15 radio relay sites maintained by the Hanford Seismic Assessment Team. The Hanford Seismic Network recorded 23 local earthquakes during the third quarter of FY 2010. Sixteen earthquakes were located at shallow depths (less than 4 km), five at intermediate depths (between 4 and 9 km), most likely in the pre-basalt sediments, and two at depths greater than 9 km, within the basement. Geographically, twelve earthquakes were located in known swarm areas, three occurred near a geologic structure (the Saddle Mountain anticline), and eight were classified as random events. The highest-magnitude event (3.0 Mc) was recorded on May 8, 2010 at a depth of 3.0 km, with its epicenter located near the Saddle Mountain anticline. Later in the quarter (May 24 and June 28), two additional earthquakes were recorded at nearly the same location. These events are not considered unusual in that earthquakes have previously been recorded at this location, for example, in October 2006 (Rohay et al., 2007).
Six earthquakes were detected in the vicinity of Wooded Island, located about eight miles north of Richland just west of the Columbia River. The Wooded Island events recorded this quarter were a continuation of the swarm events observed during the 2009 and 2010 fiscal years and reported in previous quarterly and annual reports (Rohay et al., 2009a, 2009b, 2009c, 2010a, and 2010b). All events were considered minor (coda-length magnitude [Mc] less than 1.0) with a maximum depth estimated at 1.7 km. Based upon this quarter's activity, it is likely that the Wooded Island swarm has subsided. Pacific Northwest National Laboratory (PNNL) will continue to monitor for activity at this location.
Permeability, storage and hydraulic diffusivity controlled by earthquakes
NASA Astrophysics Data System (ADS)
Brodsky, E. E.; Fulton, P. M.; Xue, L.
2016-12-01
Earthquakes can increase permeability in fractured rocks. In the far field, such permeability increases are attributed to seismic waves and can last for months after the initial earthquake. Laboratory studies suggest that unclogging of fractures by the transient flow driven by seismic waves is a viable mechanism. These dynamic permeability increases may contribute to permeability enhancement in the seismic clouds accompanying hydraulic fracking. Permeability enhancement by seismic waves could potentially be engineered, and the experiments suggest the process will be most effective at a preferred frequency. We have recently observed similar processes inside active fault zones after major earthquakes. A borehole observatory in the fault that generated the M9.0 2011 Tohoku earthquake reveals a sequence of temperature pulses during the secondary aftershock sequence of an M7.3 aftershock. The pulses are attributed to fluid advection by a flow through a zone of transiently increased permeability. Directly after the M7.3 earthquake, the newly damaged fault zone is highly susceptible to further permeability enhancement, but ultimately heals within a month and is no longer as sensitive. The observation suggests that the newly damaged fault zone is more prone to fluid pulsing than would be expected based on the long-term permeability structure. Even longer-term healing is seen inside the fault zone of the 2008 M7.9 Wenchuan earthquake. The competition between damage and healing (or clogging and unclogging) results in dynamically controlled permeability, storage, and hydraulic diffusivity. Recent measurements of in situ fault zone architecture at the 1-10 meter scale suggest that active fault zones often have hydraulic diffusivities near 10^-2 m^2/s. This uniformity holds even within the damage zone of the San Andreas fault, where permeability and storage increases balance each other to achieve this value of diffusivity over a 400 m wide region.
We speculate that fault zones may evolve to a preferred diffusivity in a dynamic equilibrium.
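For reference, hydraulic diffusivity can be written as hydraulic conductivity divided by specific storage, D = K/Ss. The sketch below uses illustrative parameter values (not measurements from the cited fault zones) chosen to reproduce the ~10^-2 m^2/s diffusivity mentioned in the abstract:

```python
# Hydraulic diffusivity controls how quickly pore-pressure perturbations
# spread through rock: D = K / Ss, where K is hydraulic conductivity (m/s)
# and Ss is specific storage (1/m).
def hydraulic_diffusivity(K_mps, Ss_per_m):
    return K_mps / Ss_per_m

# Illustrative fault-zone values (assumed, for demonstration only):
print(round(hydraulic_diffusivity(1e-5, 1e-3), 6))  # → 0.01 m^2/s
```

The balance described in the abstract is visible in this ratio: if damage raises both K (permeability) and Ss (storage) proportionally, D stays fixed.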
Regional Moment Tensor Analysis of Earthquakes in Iran for 2010 to 2017 Using In-Country Data
NASA Astrophysics Data System (ADS)
Graybeal, D.; Braunmiller, J.
2017-12-01
Located in the middle of the Arabia-Eurasia continental collision, Iran is one of the most tectonically diverse and seismically active countries in the world. Until recently, however, seismic source parameter studies had to rely on teleseismic data or on data from temporary local arrays, which limited the scope of investigations. Relatively new broadband seismic networks operated by the Iranian Institute of Engineering Seismology (IIEES) and the Iranian Seismological Center (IRSC) currently consist of more than 100 stations and allow, for the first time, routine three-component full-waveform regional moment tensor analysis of the numerous M≥4.0 earthquakes that occur throughout the country. We use openly available, in-country data and include data from nearby permanent broadband stations available through IRIS and EIDA to improve azimuthal coverage for events in border regions. For the period from 2010 to 2017, we have obtained about 500 moment tensors for earthquakes ranging from Mw=3.6 to 7.8. The resulting database provides a unique, detailed view of deformation styles and earthquake depths in Iran. Overall, we find mainly thrust and strike-slip mechanisms as expected considering the convergent tectonic setting. Our magnitudes (Mw) are slightly smaller than ML and mb but comparable to Mw as reported in global catalogs (USGS ANSS). Event depths average about 3 km shallower than in global catalogs and are well constrained considering the capability of regional waveforms to resolve earthquake depth. Our dataset also contains several large magnitude main shock-aftershock sequences from different tectonic provinces, including the 2012 Ahar-Varzeghan (Mw=6.4), 2013 Kaki (Mw=6.5), and 2014 Murmuri (Mw=6.2) earthquakes. The most significant result in terms of seismogenesis and seismic hazard is that the vast majority of earthquakes occur at shallow depth, not in deeper basement. 
Our findings indicate that more than 80% of crustal seismicity in Iran likely occurs at depths of 12 km or less.
Shallow P- and S-wave velocities and site resonances in the St. Louis region, Missouri-Illinois
Williams, R.A.; Odum, J.K.; Stephenson, W.J.; Herrmann, Robert B.
2007-01-01
As part of the seismic hazard-mapping efforts in the St. Louis metropolitan area we determined the compressional and shear-wave velocities (Vp and Vs) to about a 40-m depth at 17 locations in this area. The Vs measurements were made using high-resolution seismic refraction and reflection methods. We find a clear difference in the Vs profiles between sites located on the river floodplains and those located in the upland urban areas of St. Louis. Vs30 (average Vs to 30-m depth) values in floodplain areas range from 200 to 290 m/s (NEHRP category D) and contrast with sites on the upland areas of St. Louis, which have Vs30 values ranging from 410 to 785 m/s (NEHRP categories C and B). The lower Vs30 values and earthquake recordings in the floodplains suggest a greater potential for stronger and more prolonged ground shaking in an earthquake. Spectral analysis of a M3.6 earthquake recorded on the St. Louis-area ANSS seismograph network indicates stronger shaking and potentially damaging S-wave resonant frequencies at NEHRP category D sites compared to ground motions at a rock site located on the Saint Louis University campus. © 2007, Earthquake Engineering Research Institute.
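The Vs30 values quoted in this abstract are time-averaged velocities: total depth divided by the total vertical S-wave travel time through the layers, not a simple mean of layer velocities. A minimal sketch with a hypothetical layered profile (not one of the 17 measured sites):

```python
# Vs30: time-averaged shear-wave velocity of the top 30 m, defined as
# total depth divided by total vertical S-wave travel time through the layers.
def vs30(thicknesses_m, velocities_mps):
    travel_time = sum(h / v for h, v in zip(thicknesses_m, velocities_mps))
    return sum(thicknesses_m) / travel_time

# Hypothetical floodplain profile (layers summing to 30 m): slow
# near-surface layers pull the average into the NEHRP category D range.
print(round(vs30([5, 10, 15], [150, 250, 400]), 1))  # → 270.7
```

Note that the travel-time averaging weights slow shallow layers heavily, which is why floodplain sites fall into category D even when deeper layers are stiff.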
Boore, David M.
1999-01-01
Displacements derived from the accelerogram recordings of the 1999 Chi-Chi, Taiwan earthquake at stations TCU078 and TCU129 show drifts when only a simple baseline derived from the pre-event portion of the record is removed from the records. The appearance of the velocity and displacement records suggests that changes in the zero-level of the acceleration are responsible for these drifts. The source of the shifts in zero-level are unknown, but might include tilts in the instruments or the response of the instruments to strong shaking. This note illustrates the effect on the velocity, displacement, and response spectra of several schemes for accounting for these baseline shifts. The most important conclusion for earthquake engineering purposes is that the response spectra for periods less than about 20 sec are unaffected by the baseline correction. The results suggest, however, that static displacements estimated from the instruments should be used with caution. Although limited to the analysis of only two recordings, the results may have more general significance both for the many other recordings of this earthquake and for data that will be obtained in the future from similar high-quality accelerograph networks now being installed or soon to be installed in many parts of the world.
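The drift mechanism described here can be demonstrated with a generic pre-event baseline correction on synthetic data: a small uncorrected offset in the acceleration zero-level integrates into a linear drift in velocity. This is only a sketch under assumed conditions, not the processing actually applied to the TCU078 and TCU129 records:

```python
import numpy as np

# Synthetic record: a constant zero-level offset plus 2 s of "shaking".
dt = 0.01                                   # sample interval, s
t = np.arange(0.0, 20.0, dt)
acc = np.full_like(t, 0.05)                 # assumed baseline offset, m/s^2
shaking = slice(500, 700)                   # strong motion from 5 s to 7 s
acc[shaking] += np.sin(2.0 * np.pi * t[shaking])

# Uncorrected integration: the offset produces a linear drift in velocity.
vel_raw = np.cumsum(acc) * dt

# Simple correction: subtract the mean of the pre-event portion (t < 4 s).
zero_level = acc[:400].mean()
vel_cor = np.cumsum(acc - zero_level) * dt

print(round(vel_raw[-1], 2), round(abs(vel_cor[-1]), 2))  # → 1.0 0.0
```

The uncorrected velocity ends at 0.05 m/s² × 20 s = 1.0 m/s of spurious drift, while the corrected trace returns to zero; real records are messier because the zero-level shift occurs during shaking, which is why the note compares several correction schemes.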
Boore, D.M.; Stephens, C.D.; Joyner, W.B.
2002-01-01
Residual displacements for large earthquakes can sometimes be determined from recordings on modern digital instruments, but baseline offsets of unknown origin make it difficult in many cases to do so. To recover the residual displacement, we suggest tailoring a correction scheme by studying the character of the velocity obtained by integration of zeroth-order-corrected acceleration and then seeing if the residual displacements are stable when the various parameters in the particular correction scheme are varied. For many seismological and engineering purposes, however, the residual displacements are of lesser importance than ground motions at periods less than about 20 sec. These ground motions are often recoverable with simple baseline correction and low-cut filtering. In this largely empirical study, we illustrate the consequences of various correction schemes, drawing primarily from digital recordings of the 1999 Hector Mine, California, earthquake. We show that with simple processing the displacement waveforms for this event are very similar for stations separated by as much as 20 km. We also show that a strong pulse on the transverse component was radiated from the Hector Mine earthquake and propagated with little distortion to distances exceeding 170 km; this pulse leads to large response spectral amplitudes around 10 sec.
Recordings from the deepest borehole in the New Madrid Seismic Zone
Wang, Z.; Woolery, E.W.
2006-01-01
The recordings at the deepest vertical strong-motion array (VSAS) from three small events, the 21 October 2004 Tiptonville, Tennessee, earthquake; the 10 February 2005 Arkansas earthquake; and the 2 June 2005 Ridgely, Tennessee, earthquake show some interesting wave-propagation phenomena through the soils: the S-wave is attenuated from 260 m to 30 m depth and amplified from 30 m to the surface. The S-wave arrival times from the three events yielded different shear-wave velocity estimates for the soils. These different estimates may be the result of different incident angles of the S-waves due to different epicentral distances. The epicentral distances are about 22 km, 110 km, and 47 km for the Tiptonville, Arkansas, and Ridgely earthquakes, respectively. These recordings show the usefulness of the borehole strong-motion array. The vertical strong-motion arrays operated by the University of Kentucky have started to accumulate recordings that will provide a database for scientists and engineers to study the effects of the near-surface soils on the strong ground motion in the New Madrid Seismic Zone. More information about the Kentucky Seismic and Strong-Motion Network can be found at www.uky.edu/KGS/geologichazards. The digital recordings are available at ftp://kgsweb.uky.edu.
NASA Astrophysics Data System (ADS)
Benfedda, A.; Abbes, K.; Bouziane, D.; Bouhadad, Y.; Slimani, A.; Larbes, S.; Haddouche, D.; Bezzeghoud, M.
2017-03-01
On August 1st, 2014, a moderate-sized earthquake struck the capital city of Algiers at 05:11:17.6 (GMT+1). The earthquake caused the death of six people and injured 420, mainly following a panic movement among the population. Following the main shock, we surveyed the aftershock activity using a portable seismological network (short period), installed from August 2nd, 2014 to August 21st, 2015. In this work, first, we determined the main shock epicenter using the accelerograms recorded by the Algerian accelerograph network (under the coordination of the National Center of Applied Research in Earthquake Engineering-CGS). We calculated the focal mechanism of the main shock, using the inversion of the accelerograph waveforms in displacement, which indicates a reverse fault with a slight right-lateral component of slip and a compression axis striking NNW-SSE. The obtained scalar seismic moment (Mo = 1.25 × 10^17 Nm) corresponds to a moment magnitude of Mw = 5.3. Second, the analysis of the aftershock swarm obtained during the survey suggests an offshore, ENE-WSW-trending, NNW-dipping causative active fault in the bay of Algiers, which may likely correspond to a previously unknown offshore segment of the Sahel active fault.
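The moment-to-magnitude conversion quoted in this abstract follows the standard relation Mw = (2/3)(log10 M0 − 9.1), with M0 in N·m. A quick check of the abstract's numbers:

```python
import math

# Moment magnitude from scalar seismic moment M0 (in N·m), using the
# standard relation Mw = (2/3) * (log10(M0) - 9.1).
def moment_magnitude(m0_nm):
    return (2.0 / 3.0) * (math.log10(m0_nm) - 9.1)

# The abstract's value: M0 = 1.25e17 N·m should give Mw ≈ 5.3.
print(round(moment_magnitude(1.25e17), 1))  # → 5.3
```

The logarithmic scale means the reported one-decimal magnitude is insensitive to modest uncertainty in M0: doubling the moment raises Mw by only 0.2.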
Sand Volcano Following Earthquake
NASA Technical Reports Server (NTRS)
1989-01-01
Sand boil or sand volcano measuring 2 m (6.6 ft.) in length erupted in the median of Interstate Highway 80 west of the Bay Bridge toll plaza when ground shaking transformed a loose water-saturated deposit of subsurface sand into a sand-water slurry (liquefaction) in the October 17, 1989, Loma Prieta earthquake. Vented sand contains marine-shell fragments. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions such as earthquakes or when powders are handled in industrial processes. Mechanics of Granular Materials (MGM) experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grain materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. (Credit: J.C. Tinsley, U.S. Geological Survey)
Loss estimates for a Puente Hills blind-thrust earthquake in Los Angeles, California
Field, E.H.; Seligson, H.A.; Gupta, N.; Gupta, V.; Jordan, T.H.; Campbell, K.W.
2005-01-01
Based on OpenSHA and HAZUS-MH, we present loss estimates for an earthquake rupture on the recently identified Puente Hills blind-thrust fault beneath Los Angeles. Given a range of possible magnitudes and ground motion models, and presuming a full fault rupture, we estimate the total economic loss to be between $82 and $252 billion. This range is not only considerably higher than a previous estimate of $69 billion, but also implies the event would be the costliest disaster in U.S. history. The analysis has also provided the following predictions: 3,000-18,000 fatalities, 142,000-735,000 displaced households, 42,000-211,000 in need of short-term public shelter, and 30,000-99,000 tons of debris generated. Finally, we show that the choice of ground motion model can be more influential than the earthquake magnitude, and that reducing this epistemic uncertainty (e.g., via model improvement and/or rejection) could reduce the uncertainty of the loss estimates by up to a factor of two. We note that a full Puente Hills fault rupture is a rare event (once every ~3,000 years), and that other seismic sources pose significant risk as well. © 2005, Earthquake Engineering Research Institute.
1989-10-17
Ground shaking triggered liquefaction in a subsurface layer of water-saturated sand, producing differential lateral and vertical movement in an overlying carapace of unliquefied sand and silt, which moved from right to left towards the Pajaro River. This mode of ground failure, termed lateral spreading, was a principal cause of liquefaction-related damage in the Oct. 17, 1989, Loma Prieta earthquake. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions, such as earthquakes, or when powders are handled in industrial processes. Mechanics of Granular Materials (MGM) experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grain materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. Credit: S.D. Ellen, U.S. Geological Survey
Rumours about the Po Valley earthquakes of 20th and 29th May 2012
NASA Astrophysics Data System (ADS)
La Longa, Federica; Crescimbene, Massimo; Camassi, Romano; Nostro, Concetta
2013-04-01
The history of rumours is as old as human history. Even in remote antiquity, rumours, gossip and hoaxes were always in circulation, in good or bad faith, to influence human affairs. Today, with the development of mass media and the rise of the internet and social networks, rumours are ubiquitous. Earthquakes, because of their strong emotional impact and unpredictability, are among the natural events most likely to generate and spread rumours. For this reason, the earthquakes that occurred in the Po Valley on 20th and 29th May 2012 generated, and still continue to generate, a wide variety of rumours regarding the earthquake, its effects, its possible causes, and future predictions. As had occurred during the L'Aquila earthquake sequence in 2009, following the events of May 2012 in Emilia Romagna a complex training and information initiative was created that, at various stages between May and September 2012, involved the population, partly present in the camps, and then the school staff of the municipalities affected by the earthquake. This experience was organized and managed by the Department of Civil Protection (DPC), the National Institute of Geophysics and Volcanology (INGV), and the Emilia Romagna region, in collaboration with the Network of University Laboratories for Earthquake Engineering (RELUIS), the Emilia Romagna Regional Health Service, and voluntary civil protection organizations in the area. Within this initiative, over 240 rumours were collected and catalogued in the period June-September 2012. In this work, the rumours of the Po Valley are studied in their specific characteristics, and strategies and methods to fight them are also discussed. This work of collecting and discussing the rumours was particularly important for promoting good communication strategies and countering the spread of the rumours.
Only in this way was it possible to create a full intervention able to support both the local institutions and the individuals involved in adequately addressing the organizational and social problems arising from the earthquake.
Preparing a population for an earthquake like Chi-Chi: The Great Southern California ShakeOut
Jones, Lucile M.
2009-01-01
The Great Southern California ShakeOut was a week of special events featuring the largest earthquake drill in United States history. On November 13, 2008, over 5 million southern Californians pretended that a magnitude-7.8 earthquake had occurred and practiced actions that could reduce its impact on their lives. The primary message of the ShakeOut is that what we do now, before a big earthquake, will determine what our lives will be like after. The drill was based on a scenario of the impacts and consequences of such an earthquake on the Southern San Andreas Fault, developed by over 300 experts led by the U.S. Geological Survey in partnership with the California Geological Survey, the Southern California Earthquake Center, Earthquake Engineering Research Institute, lifeline operators, emergency services and many other organizations. The ShakeOut campaign was designed and implemented by earthquake scientists, emergency managers, sociologists, art designers and community participants. The means of communication were developed using results from sociological research on what encouraged people to take action. This was structured around four objectives: 1) consistent messages – people are more inclined to believe something when they hear the same thing from multiple sources; 2) visual reinforcement – people are more inclined to do something they see other people doing; 3) encourage “milling” or discussing contemplated action – people need to discuss an action with others they care about before committing to undertaking it; and 4) focus on concrete actions – people are more likely to prepare for a set of concrete consequences of a particular hazard than for an abstract concept of risk. The goals of the ShakeOut were established in Spring 2008 and were: 1) to register 5 million people to participate in the drill; 2) to change the culture of earthquake preparedness in southern California; and 3) to reduce earthquake losses in southern California. 
All of these goals were met. The final registration at www.shakeout.org for the 2008 ShakeOut was 5.47 million people, or one-quarter of the population of the region. A survey conducted with the registered participants showed that the messages they took from the ShakeOut were the concepts intended, including the importance of “Drop, Cover, Hold On”, the interdependency of earthquake risk (“We are all in this together”) and the possibility of reducing losses through preparation and mitigation. Sales data from the Home Depot hardware stores in southern California showed a 260% increase in the sale of earthquake safety products during the month of the ShakeOut, November 2008.
NASA Astrophysics Data System (ADS)
Motosaka, M.
2009-12-01
This paper first presents the development of an integrated regional earthquake early warning (EEW) system with an on-line structural health monitoring (SHM) function in Miyagi prefecture, Japan. The system makes it possible to provide more accurate, reliable and immediate earthquake information for society by combining it with the national (JMA/NIED) EEW system, based on advanced real-time communication technology. The author has planned to install the EEW/SHM system in public buildings around Sendai, a city of one million in north-eastern Japan. The system has so far been implemented in two buildings; one is in Sendai, and the other in Oshika, a front site on the Pacific Ocean coast facing the approaching Miyagi-ken Oki earthquake. The data from the front site and the on-site buildings are processed by the analysis system installed at the analysis center of the Disaster Control Research Center, Tohoku University. The real-time earthquake information from JMA is also received at the analysis center. The utilization of the integrated EEW/SHM system is addressed together with future perspectives. Examples of the obtained data are also described, including the amplitude-dependent dynamic characteristics of the building in Sendai before, during, and after the 2008/6/14 Iwate-Miyagi Nairiku Earthquake, together with the historical change of its dynamic characteristics over 40 years. Second, this paper presents an advanced methodology based on Artificial Neural Networks (ANN) for forward forecasting of ground motion parameters (not only PGA and PGV, but also spectral information) before S-wave arrival, using the initial part of the P-waveform at a front site. The estimated ground motion information can be used as a warning alarm for earthquake damage reduction. The Fourier Amplitude Spectra (FAS) estimated with high accuracy before strong shaking can be used for advanced engineering applications, e.g. feed-forward structural control of a building of interest.
The validity and applicability of the method have been verified using observation data sets from the K-NET sites for 39 earthquakes that occurred in the Miyagi-Oki area. The initial part of the P-waveform data at the Oshika site (MYG011) of K-NET was used as the front-site waveform data. The earthquake observation data for 35 of the 39 earthquakes, as well as the positional information and site repartition information, were used as training data to construct the ANN structure. The data sets for the remaining 4 earthquakes were used as test data in the blind prediction of PGA and PGV at 4 sites, namely, Sendai (MYG013), Taiwa (MYG009), Shiogama (MYG012), and Ishinomaki (MYG010).
Suitability of rapid energy magnitude determinations for emergency response purposes
NASA Astrophysics Data System (ADS)
Di Giacomo, Domenico; Parolai, Stefano; Bormann, Peter; Grosser, Helmut; Saul, Joachim; Wang, Rongjiang; Zschau, Jochen
2010-01-01
It is common practice in the seismological community to use, especially for large earthquakes, the moment magnitude Mw as a unique magnitude parameter to evaluate an earthquake's damage potential. However, as a static measure of earthquake size, Mw does not provide direct information about the released seismic wave energy and its high-frequency content, which is the more relevant information both for engineering purposes and for a rapid assessment of the earthquake's shaking potential. Therefore, we recommend providing disaster management organizations, besides Mw, with sufficiently accurate energy magnitude determinations as soon as possible after large earthquakes. We developed and extensively tested a rapid method for calculating the energy magnitude Me within about 10-15 min after an earthquake's occurrence. The method is based on pre-calculated spectral amplitude decay functions obtained from numerical simulations of Green's functions. After empirical validation, the procedure was applied offline to a large data set of 767 shallow earthquakes, grouped according to their type of mechanism (strike-slip, normal faulting, thrust faulting, etc.). The suitability of the proposed approach is discussed by comparing our rapid Me estimates with Mw published by GCMT, as well as with Mw and Me reported by the USGS. Mw is on average slightly larger than our Me for all types of mechanisms. No clear dependence on source mechanism is observed for our Me estimates. In contrast, Me from the USGS is generally larger than Mw for strike-slip earthquakes and generally smaller for the other source types. For ~67 per cent of the event data set, our Me differs by <= +/-0.3 magnitude units (m.u.) from the respective Me values published by the USGS. However, larger discrepancies (up to 0.8 m.u.) may occur for strike-slip events. A reason for that may be the overcorrection of the energy flux applied by the USGS for this type of earthquake.
We follow the original definition of magnitude scales, which does not apply a priori mechanism corrections to measured amplitudes, also because reliable fault-plane solutions are rarely available within 10-15 min after the earthquake origin time. It is notable that our uncorrected Me data show a better linear correlation and less scatter with respect to Mw than the Me of the USGS. Finally, by analysing the recordings of representative recent pairs of strong and great earthquakes, we emphasize the importance of combining Mw and Me in the rapid characterization of the seismic source. They are related to different aspects of the source and may occasionally differ by even more than 1 m.u. This highlights the usefulness and importance of providing these two magnitude estimates together for a better assessment of an earthquake's shaking and/or tsunamigenic potential.
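For reference, the energy magnitude discussed here is conventionally derived from the radiated seismic energy Es via the Choy-Boatwright relation; a minimal sketch (not the authors' implementation, and the function name is illustrative):

```python
import math

def energy_magnitude(es_joules: float) -> float:
    """Energy magnitude Me from radiated seismic energy Es (in joules),
    using the Choy & Boatwright convention Me = (2/3)*log10(Es) - 2.9."""
    return (2.0 / 3.0) * math.log10(es_joules) - 2.9

# An event radiating 1e15 J of seismic energy:
print(round(energy_magnitude(1e15), 1))  # 7.1
```

Because Me scales with log10(Es), a 1 m.u. difference between Me and Mw, as mentioned above, corresponds to a factor of ~31.6 in radiated energy relative to what the static moment would suggest.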
Neo-Deterministic and Probabilistic Seismic Hazard Assessments: a Comparative Analysis
NASA Astrophysics Data System (ADS)
Peresan, Antonella; Magrin, Andrea; Nekrasova, Anastasia; Kossobokov, Vladimir; Panza, Giuliano F.
2016-04-01
Objective testing is the key issue for any reliable seismic hazard assessment (SHA). Different earthquake hazard maps must demonstrate their capability in anticipating ground shaking from future strong earthquakes before being appropriately used for different purposes, such as engineering design, insurance, and emergency management. Quantitative assessment of map performance is also an essential step in the scientific process of their revision and possible improvement. Cross-checking of probabilistic models against available observations and independent physics-based models is recognized as a major validation procedure. The existing maps from classical probabilistic seismic hazard analysis (PSHA), as well as those from neo-deterministic analysis (NDSHA), which have already been developed for several regions worldwide (including Italy, India and North Africa), are considered to exemplify the possibilities of cross-comparative analysis in spotting the limits and advantages of the different methods. Where the data permit, a comparative analysis against the documented seismic activity observed in reality is carried out, showing how available observations about past earthquakes can contribute to assessing the performance of the different methods. Neo-deterministic refers to a scenario-based approach, which allows for consideration of a wide range of possible earthquake sources as the starting point for scenarios constructed via full waveform modeling. The method does not make use of empirical attenuation models (i.e. Ground Motion Prediction Equations, GMPE) and naturally supplies realistic time series of ground shaking (i.e. complete synthetic seismograms), readily applicable to complete engineering analysis and other mitigation actions. The standard NDSHA maps provide reliable envelope estimates of maximum seismic ground motion from a wide set of possible scenario earthquakes, including the largest deterministically or historically defined credible earthquake.
In addition, the flexibility of NDSHA allows for generation of ground shaking maps at specified long-term return times, which may permit a straightforward comparison between NDSHA and PSHA maps in terms of average rates of exceedance for specified time windows. The comparison of NDSHA and PSHA maps, particularly for very long recurrence times, may indicate to what extent probabilistic ground shaking estimates are consistent with those from physical models of seismic wave propagation. A systematic comparison over the territory of Italy is carried out exploiting the uniqueness of the Italian earthquake catalogue, a data set covering more than a millennium (a time interval about ten times longer than that available in most regions worldwide) with a satisfactory completeness level for M>5, which underpins the results of the analysis. By analysing in some detail the seismicity of the Vrancea region, we show that well-constrained macroseismic field information for individual earthquakes may provide useful information about the reliability of ground shaking estimates. Finally, in order to generalise the observations, the comparative analysis is extended to further regions where both standard NDSHA and PSHA maps are available (e.g. the State of Gujarat, India). The final Global Seismic Hazard Assessment Program (GSHAP) results and the most recent version of the Seismic Hazard Harmonization in Europe (SHARE) project maps, along with other national-scale probabilistic maps, all obtained by PSHA, are considered for this comparative analysis.
NASA Astrophysics Data System (ADS)
Cramer, C. H.; Kutliroff, J.; Dangkua, D.
2010-12-01
A five-year Next Generation Attenuation (NGA) East project to develop new ground motion prediction equations for stable continental regions (SCRs), including eastern North America (ENA), has begun at the Pacific Earthquake Engineering Research (PEER) Center, funded by the Nuclear Regulatory Commission (NRC), the U.S. Geological Survey (USGS), the Electric Power Research Institute (EPRI), and the Department of Energy (DOE). The initial effort focused on database design and the collection of appropriate M>4 ENA broadband and accelerograph records to populate the database. Ongoing work has focused on adding records from smaller ENA earthquakes and from other SCRs such as Europe, Australia, and India. Currently, over 6500 horizontal- and vertical-component records from 60 ENA earthquakes have been collected and prepared (instrument response removed, filtering to the acceptable-signal band, determination of peak and spectral parameter values, quality assurance, etc.) for the database. Geological Survey of Canada (GSC) strong motion recordings, previously not available, have also been added to the NGA East database. The additional earthquakes increase the number of ground motion recordings in the 10-100 km range, particularly from the 2008 M5.2 Mt. Carmel, IL event, and the 2005 M4.7 Riviere du Loup and 2010 M5.0 Val des Bois earthquakes in Quebec, Canada. The goal is to complete the ENA database and make it available in 2011, followed by an SCR database in 2012. Comparisons of ground motion observations from four recent M5 ENA earthquakes with current ENA ground motion prediction equations (GMPEs) suggest that current GMPEs, as a group, reasonably agree with M5 observations at short periods, particularly at distances less than 200 km. However, at one second, current GMPEs overpredict M5 ground motion observations. The 2001 M7.6 Bhuj, India, earthquake provides some constraint at large magnitudes, as its geology and regional attenuation are analogous to ENA.
Cramer and Kumar, 2003, have shown that ENA GMPEs generally agree with the Bhuj dataset within 300 km at short and long periods. However, the Bhuj earthquake does not exhibit the intermediate-period spectral sag (Atkinson, 1993) of larger ENA earthquakes, and thus the Bhuj ground motions may be larger than what would be expected at one second for M7 events in ENA.
Seismic hazard map of the western hemisphere
Shedlock, K.M.; Tanner, J.G.
1999-01-01
Vulnerability to natural disasters increases with urbanization and development of associated support systems (reservoirs, power plants, etc.). Catastrophic earthquakes account for 60% of worldwide casualties associated with natural disasters. Economic damage from earthquakes is increasing, even in technologically advanced countries with some level of seismic zonation, as shown by the 1989 Loma Prieta, CA ($6 billion), 1994 Northridge, CA ($25 billion), and 1995 Kobe, Japan (>$100 billion) earthquakes. The growth of megacities in seismically active regions around the world often includes the construction of seismically unsafe buildings and infrastructures, due to an insufficient knowledge of existing seismic hazard. Minimization of the loss of life, property damage, and social and economic disruption due to earthquakes depends on reliable estimates of seismic hazard. National, state, and local governments, decision makers, engineers, planners, emergency response organizations, builders, universities, and the general public require seismic hazard estimates for land use planning, improved building design and construction (including adoption of building construction codes), emergency response preparedness plans, economic forecasts, housing and employment decisions, and many more types of risk mitigation. The seismic hazard map of the Americas is the concatenation of various national and regional maps, involving a suite of approaches. The combined maps and documentation provide a useful global seismic hazard framework and serve as a resource for any national or regional agency for further detailed studies applicable to their needs. This seismic hazard map depicts Peak Ground Acceleration (PGA) with a 10% chance of exceedance in 50 years for the western hemisphere.
PGA, a short-period ground motion parameter that is proportional to force, is the most commonly mapped ground motion parameter because current building codes that include seismic provisions specify the horizontal force a building should be able to withstand during an earthquake. This seismic hazard map of the Americas depicts the likely level of short-period ground motion from earthquakes in a fifty-year window. Short-period ground motions affect short-period structures (e.g., one- to two-story buildings). The largest seismic hazard values in the western hemisphere generally occur in areas that have been, or are likely to be, the sites of the largest plate boundary earthquakes. Although the largest earthquakes ever recorded are the 1960 Chile and 1964 Alaska subduction zone earthquakes, the largest seismic hazard (PGA) value in the Americas is in Southern California (U.S.), along the San Andreas fault.
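The "10% chance of exceedance in 50 years" criterion used for this map corresponds to a mean return period under the usual Poisson assumption; a small sketch of the arithmetic:

```python
import math

def return_period(p_exceed: float, t_years: float) -> float:
    """Mean return period (years) for exceedance probability p_exceed
    over t_years, assuming Poisson occurrence: T = -t / ln(1 - p)."""
    return -t_years / math.log(1.0 - p_exceed)

# 10% in 50 years gives the familiar ~475-year return period:
print(round(return_period(0.10, 50.0)))  # 475
```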
Missair, Andres; Pretto, Ernesto A; Visan, Alexandru; Lobo, Laila; Paula, Frank; Castillo-Pedraza, Catalina; Cooper, Lebron; Gebhard, Ralf E
2013-10-01
All modalities of anesthetic care, including conscious sedation, general, and regional anesthesia, have been used to manage earthquake survivors who require urgent surgical intervention during the acute phase of medical relief. Consequently, we felt that a review of epidemiologic data from major earthquakes in the context of urgent intraoperative management was warranted to optimize anesthesia disaster preparedness for future medical relief operations. The primary outcome measure of this study was to identify the predominant preoperative injury pattern (anatomic location and pathology) of survivors presenting for surgical care immediately after major earthquakes during the acute phase of medical relief (0-15 days after disaster). The injury pattern is of significant relevance because it closely relates to the anesthetic techniques available for patient management. We discuss our findings in the context of evidence-based strategies for anesthetic management during the acute phase of medical relief after major earthquakes and the associated obstacles of devastated medical infrastructure. To identify reports on acute medical care in the aftermath of natural disasters, a query was conducted using MEDLINE/PubMed, Embase, CINAHL, as well as an online search engine (Google Scholar). The search terms were "disaster" and "earthquake" in combination with "injury," "trauma," "surgery," "anesthesia," and "wounds." Our investigation focused only on studies of acute traumatic injury that specified surgical intervention among survivors in the acute phase of medical relief. A total of 31 articles reporting on 15 major earthquakes (between 1980 and 2010) and the treatment of more than 33,410 patients met our specific inclusion criteria. The mean incidence of traumatic limb injury per major earthquake was 68.0%. The global incidence of traumatic limb injury was 54.3% (18,144/33,410 patients). 
The pooled estimate of the proportion of limb injuries was calculated to be 67.95%, with a 95% confidence interval of 62.32% to 73.58%. Based on this analysis, early disaster surgical intervention will focus on surviving patients with limb injury. All anesthetic techniques have been safely used for medical relief. While regional anesthesia may be an intuitive choice based on these findings, in the context of collapsed medical infrastructure, provider experience may dictate the available anesthetic techniques for earthquake survivors requiring urgent surgery.
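The confidence interval quoted for the pooled proportion can be illustrated with the standard normal-approximation (Wald) interval; the sketch below applies it to the global limb-injury counts reported above (the paper's 67.95% pooled estimate uses study-level weighting, so it differs from this simple aggregate):

```python
import math

def wald_ci(successes: int, n: int, z: float = 1.96):
    """Normal-approximation (Wald) 95% CI for a binomial proportion."""
    p = successes / n
    half = z * math.sqrt(p * (1.0 - p) / n)
    return p - half, p + half

# Global incidence of limb injury: 18,144 of 33,410 patients (~54.3%)
lo, hi = wald_ci(18144, 33410)
```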
10 CFR Appendix S to Part 50 - Earthquake Engineering Criteria for Nuclear Power Plants
Code of Federal Regulations, 2010 CFR
2010-01-01
... under part 50, or a design certification, combined license, design approval, or manufacturing license... license, design approval, or manufacturing license is required by §§ 50.34(a)(12), 50.34(b)(10), or 10 CFR... design for a nuclear power facility. Manufacturing license means a license, issued under subpart F of...
NGA East | Pacific Earthquake Engineering Research Center (PEER)
the Geotechnical and Vertical WGs shown in Figure 1. The different groups and participants essentially play the role of Resource Experts, and the sub-award researchers and contractors play the role of Specialty Contractors. Some individuals from these two groups will also play a Proponent Expert role at
Earthquake Engineering Support
1999-11-01
[Table fragment: Nevada sand specimens (ESB #2), tests 3c-3e, with relative densities of roughly 49-54% (loose) to 79-80% (dense).] The pore pressure transducers used in the experiments were manufactured by Druck, and are widely used in centrifuge modelling.
NASA Astrophysics Data System (ADS)
Quigley, Mark C.; Hughes, Matthew W.; Bradley, Brendon A.; van Ballegooy, Sjoerd; Reid, Catherine; Morgenroth, Justin; Horton, Travis; Duffy, Brendan; Pettinga, Jarg R.
2016-03-01
Seismic shaking and tectonic deformation during strong earthquakes can trigger widespread environmental effects. The severity and extent of a given effect relates to the characteristics of the causative earthquake and the intrinsic properties of the affected media. Documentation of earthquake environmental effects in well-instrumented, historical earthquakes can enable seismologic triggering thresholds to be estimated across a spectrum of geologic, topographic and hydrologic site conditions, and implemented into seismic hazard assessments, geotechnical engineering designs, palaeoseismic interpretations, and forecasts of the impacts of future earthquakes. The 2010-2011 Canterbury Earthquake Sequence (CES), including the moment magnitude (Mw) 7.1 Darfield earthquake and Mw 6.2, 6.0, 5.9, and 5.8 aftershocks, occurred on a suite of previously unidentified, primarily blind, active faults in the eastern South Island of New Zealand. The CES is one of Earth's best recorded historical earthquake sequences. The location of the CES proximal to and beneath a major urban centre enabled rapid and detailed collection of vast amounts of field, geospatial, geotechnical, hydrologic, biologic, and seismologic data, and allowed incremental and cumulative environmental responses to seismic forcing to be documented throughout a protracted earthquake sequence. The CES caused multiple instances of tectonic surface deformation (≥ 3 events), surface manifestations of liquefaction (≥ 11 events), lateral spreading (≥ 6 events), rockfall (≥ 6 events), cliff collapse (≥ 3 events), subsidence (≥ 4 events), and hydrological (10s of events) and biological shifts (≥ 3 events). The terrestrial area affected by strong shaking (e.g. peak ground acceleration (PGA) ≥ 0.1-0.3 g), and the maximum distances between earthquake rupture and environmental response (Rrup), both generally increased with increased earthquake Mw, but were also influenced by earthquake location and source characteristics. 
However, the severity of a given environmental response at any given site related predominantly to ground shaking characteristics (PGA, peak ground velocities) and site conditions (water table depth, soil type, geomorphic and topographic setting) rather than earthquake Mw. In most cases, the most severe liquefaction, rockfall, cliff collapse, subsidence, flooding, tree damage, and biologic habitat changes were triggered by proximal, moderate magnitude (Mw ≤ 6.2) earthquakes on blind faults. CES environmental effects will be incompletely preserved in the geologic record and variably diagnostic of spatial and temporal earthquake clustering. Liquefaction feeder dikes in areas of severe and recurrent liquefaction will provide the best preserved and potentially most diagnostic CES features. Rockfall talus deposits and boulders will be well preserved and potentially diagnostic of the strong intensity of CES shaking, but challenging to decipher in terms of single versus multiple events. Most other phenomena will be transient (e.g., distal groundwater responses), not uniquely diagnostic of earthquakes (e.g., flooding), or more ambiguous (e.g. biologic changes). Preliminary palaeoseismic investigations in the CES region indicate recurrence of liquefaction in susceptible sediments of 100 to 300 yr, recurrence of severe rockfall event(s) of ca. 6000 to 8000 yr, and recurrence of surface rupturing on the largest CES source fault of ca. 20,000 to 30,000 yr. These data highlight the importance of utilising multiple proxy datasets in palaeoearthquake studies. The severity of environmental effects triggered during the strongest CES earthquakes was as great as or equivalent to any historic or prehistoric effects recorded in the geologic record. We suggest that the shaking caused by rupture of local blind faults in the CES comprised a 'worst case' seismic shaking scenario for parts of the Christchurch urban area. 
Moderate Mw blind fault earthquakes may contribute the highest proportion of seismic hazard, be the most important drivers of landscape evolution, and dominate the palaeoseismic record in some locations on Earth, including locations distal from any identified active faults. A high scientific priority should be placed on improving the spatial extent and quality of 'off-fault' shaking records of strong earthquakes, particularly near major urban centres.
Determination of the Attenuation Equation of Strong Motion in the Michoacán State
NASA Astrophysics Data System (ADS)
Vazquez Rosas, R.; Aguirre, J.; Ramirez-Guzman, L.
2014-12-01
Several attenuation relationships have been developed for Mexico, mostly after the September 19, 1985 earthquake, which marked a watershed in the development of Mexican seismological engineering. Since 1985, the number of seismic stations has increased significantly, especially between the coast of Guerrero and Mexico City, because of the large amplifications that have occurred at lake-zone and hard-ground sites in Mexico City. Some studies have analyzed how the seismic waves are attenuated or amplified from the Pacific coast towards the continent. The attenuation relationship used for seismic hazard assessment in Mexico is that of Ordaz (1989), obtained from data of the Guerrero acceleration network. Another, more recent study is that conducted by García et al. (2005) with newer data from the Guerrero acceleration network, considering intraplate earthquakes. It is important to note that all these relations cover only part of the Mexican subduction zone, and for some types of seismic sources they may not be suitable for studying earthquake risk in other regions of Mexico. For this work we consider the state of Michoacán, because it hosts one of the most important seismogenic zones in Mexico. Within the state there are three different kinds of seismic sources, including volcanic-tectonic earthquakes and those caused by local faults in the region. It is therefore vital to study seismic wave propagation within the state. We installed a temporary network of 9 accelerographic stations, located at Faro de Brucerías, Aguililla, Apatzingán, Taretán, Uruapan, Nueva Italia, Pátzcuaro, Morelia, and Maravatío, Michoacán. The stations formed a line perpendicular to the coast, with a total length of 366 km; the distance between stations varies from 60 to 80 km. Among all the seismic events recorded, we selected 7 located along the Michoacán coastline, with magnitudes from 4.1 to 5.1 Mw.
With those records, the quality factor Q(f) = 107.215 f^0.74 was calculated for frequencies between 0.1 and 10 Hz, since those are the frequencies important for earthquake engineering. The preliminary results show significantly larger attenuation compared with the attenuation laws for the states of Guerrero and Colima.
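The quality factor above follows the common power-law form Q(f) = Q0·f^n. A minimal sketch of how such a Q model translates into anelastic amplitude decay; the decay formula is the standard exponential form, and the distance and shear wavespeed in the example are illustrative values, not from the study:

```python
import math

def q_factor(f_hz, q0=107.215, n=0.74):
    """Power-law quality factor from the abstract: Q(f) = 107.215 * f**0.74."""
    return q0 * f_hz ** n

def anelastic_decay(f_hz, r_km, beta_kms=3.5):
    """Standard anelastic amplitude ratio A(r)/A(0) = exp(-pi*f*r / (Q(f)*beta)).
    The shear wavespeed beta is an illustrative value."""
    return math.exp(-math.pi * f_hz * r_km / (q_factor(f_hz) * beta_kms))

for f in (0.5, 1.0, 5.0, 10.0):
    print(f"f={f:5.1f} Hz  Q={q_factor(f):7.1f}  A(100 km)/A(0)={anelastic_decay(f, 100.0):.3f}")
```

Because n < 1, Q grows more slowly than f, so higher frequencies are attenuated more strongly over the same path, which is why the 0.1-10 Hz band matters for engineering.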
Lienkaemper, James J.; DeLong, Stephen B.; Domrose, Carolyn J.; Rosa, Carla M.
2016-01-01
The M6.0, 24 Aug. 2014 South Napa, California, earthquake exhibited unusually large slip for a California strike-slip event of its size with a maximum coseismic surface slip of 40-50 cm in the north section of the 15 km-long rupture. Although only minor (<10 cm) surface slip occurred coseismically in the southern 9-km section of the rupture, there was considerable postseismic slip, so that the maximum total slip one year after the event approached 40-50 cm, about equal to the coseismic maximum in the north. We measured the accumulation of postseismic surface slip on four, ~100-m-long alignment arrays for one year following the event. Because prolonged afterslip can delay reconstruction of fault-damaged buildings and infrastructure, we analyzed its gradual decay to estimate when significant afterslip would likely end. This forecasting of Napa afterslip suggests how we might approach the scientific and engineering challenges of afterslip from a much larger M~7 earthquake anticipated on the nearby, urban Hayward Fault. However, we expect its afterslip to last much longer than one year.
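Afterslip decay of this kind is often modeled with a logarithmic function of time, d(t) = a·ln(1 + t/τ); fitting such a curve to alignment-array measurements lets one forecast when the slip rate falls below a chosen threshold. A hedged sketch of that forecasting step, with invented parameters a and tau, not the fitted Napa values:

```python
import math

def afterslip_cm(t_days, a=8.0, tau=5.0):
    """Logarithmic afterslip model d(t) = a*ln(1 + t/tau); a, tau illustrative."""
    return a * math.log(1.0 + t_days / tau)

def slip_rate_cm_per_day(t_days, a=8.0, tau=5.0):
    """Time derivative of the model: d'(t) = a / (tau + t)."""
    return a / (tau + t_days)

def days_until_rate_below(threshold, a=8.0, tau=5.0):
    """Solve a/(tau + t) = threshold for t: the forecast end of significant afterslip."""
    return a / threshold - tau

print(f"modeled slip after 1 year: {afterslip_cm(365.0):.1f} cm")
print(f"rate drops below 0.01 cm/day after {days_until_rate_below(0.01):.0f} days")
```

The key property the abstract exploits is that the rate a/(τ + t) decays monotonically, so a decay curve fitted early in the sequence yields a defensible forecast of when reconstruction can proceed.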
NASA Astrophysics Data System (ADS)
Daniell, J. E.; Khazai, B.; Wenzel, F.; Kunz-Plapp, T.; Vervaeck, A.; Muehr, B.; Markus, M.
2012-04-01
The Van earthquake in 2011 hit at 10:41 GMT (13:41 local) on Sunday, October 23rd, 2011. It was a Mw7.1-7.3 event located at a depth of around 10 km, with the epicentre directly between Ercis (pop. 75,000) and Van (pop. 370,000). Since then, the CEDIM Forensic Analysis Group (a team of seismologists, engineers, sociologists and meteorologists) and www.earthquake-report.com have reported on and analysed the Van event. In addition, many damaging aftershocks occurring after the main event were analysed, including a major aftershock centered in Van-Edremit on November 9th, 2011, which caused considerable additional losses. The province of Van has around 1.035 million people as of the last census. It is one of the poorest provinces in Turkey and has much inequality between the rural and urban centers, with an average HDI (Human Development Index) around that of Bhutan or Congo. The earthquakes are estimated to have caused 604 deaths (23 October) and 40 deaths (9 November), mostly due to falling debris and house collapse. In addition, between 1 billion TRY and 4 billion TRY (approx. 555 million USD - 2.2 billion USD) is estimated as total economic losses. This represents around 17 to 66% of the provincial GDP of Van Province (approx. 3.3 billion USD) as of 2011. From the CATDAT Damaging Earthquakes Database, major earthquakes such as this one have occurred before: in the year 1111, an earthquake of magnitude around 6.5-7 caused major damage. In the year 1646 or 1648, Van was again struck by a M6.7 quake, killing around 2000 people. In 1881, a M6.3 earthquake near Van killed 95 people. Again, in 1941, a M5.9 earthquake affected Ercis and Van, killing between 190 and 430 people. 1945-1946 as well as 1972 brought further damaging and casualty-bearing earthquakes to the Van province. In 1976, the Van-Muradiye earthquake struck the border region with a M7, killing around 3840 people and leaving around 51,000 people homeless. 
Key immediate lessons from similar historic earthquakes in eastern Turkey were developed in terms of the mass shelter and post-earthquake housing needs of the displaced population of Van. This included an analysis of shelter and reconstruction requirements under winter weather conditions; community resourcefulness in coping with housing needs through indigenous methods; and issues with in-place sheltering versus relocation and resettlement. A summary of the losses and implications on the GDP, economic dynamics, capital stock, social structure shelter and housing needs of the region is discussed. In addition, a quick comparison to past similar earthquakes is undertaken through the use of CATDAT.
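The loss-to-GDP figures quoted in the abstract can be checked with direct arithmetic (the 2.2B/3.3B ratio is two-thirds, which the abstract rounds to 66%):

```python
# Economic loss range and provincial GDP from the abstract (USD).
loss_low, loss_high = 0.555e9, 2.2e9
gdp_van = 3.3e9

share_low = loss_low / gdp_van    # 0.555 / 3.3, about 17%
share_high = loss_high / gdp_van  # 2.2 / 3.3, exactly two-thirds

print(f"losses span {share_low:.0%} to {share_high:.0%} of provincial GDP")
```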
Wald, D.J.; Jaiswal, K.S.; Marano, K.D.; Bausch, D.
2011-01-01
With the advent of the USGS prompt assessment of global earthquakes for response (PAGER) system, which rapidly assesses earthquake impacts, U.S. and international earthquake responders are reconsidering their automatic alert and activation levels and response procedures. To help facilitate rapid and appropriate earthquake response, an Earthquake Impact Scale (EIS) is proposed on the basis of two complementary criteria. One, based on the estimated cost of damage, is most suitable for domestic events; the other, based on estimated ranges of fatalities, is generally more appropriate for global events, particularly in developing countries. Simple thresholds, derived from the systematic analysis of past earthquake impact and associated response levels, are quite effective in communicating predicted impact and response needed after an event through alerts of green (little or no impact), yellow (regional impact and response), orange (national-scale impact and response), and red (international response). Corresponding fatality thresholds for yellow, orange, and red alert levels are 1, 100, and 1,000, respectively. For damage impact, yellow, orange, and red thresholds are triggered by estimated losses reaching $1M, $100M, and $1B, respectively. The rationale for a dual approach to earthquake alerting stems from the recognition that relatively high fatalities, injuries, and homelessness predominate in countries in which local building practices typically lend themselves to high collapse and casualty rates, and these impacts lead to prioritization for international response. In contrast, financial and overall societal impacts often trigger the level of response in regions or countries in which prevalent earthquake-resistant construction practices greatly reduce building collapse and resulting fatalities. Any newly devised alert, whether economic- or casualty-based, should be intuitive and consistent with established lexicons and procedures. 
Useful alerts should also be both specific (although allowably uncertain) and actionable. In this analysis, an attempt is made at both simple and intuitive color-coded alerting criteria; yet the necessary uncertainty measures by which one can gauge the likelihood for the alert to be over- or underestimated are preserved. The essence of the proposed impact scale and alerting is that actionable loss information is now available in the immediate aftermath of significant earthquakes worldwide on the basis of quantifiable loss estimates. Utilizing EIS, PAGER's rapid loss estimates can adequately recommend alert levels and suggest appropriate response protocols, despite the uncertainties; demanding or awaiting observations or loss estimates with a high level of accuracy may increase the losses. © 2011 American Society of Civil Engineers.
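The dual alerting criteria described above reduce to a threshold lookup. A sketch using only the thresholds stated in the text; the function names are mine, and PAGER's actual implementation differs (it works with probability distributions over loss, not point estimates):

```python
# EIS thresholds from the text: yellow/orange/red trigger at
# 1/100/1,000 estimated fatalities, or $1M/$100M/$1B estimated losses.
FATALITY_THRESHOLDS = [(1_000, "red"), (100, "orange"), (1, "yellow")]
LOSS_THRESHOLDS = [(1e9, "red"), (1e8, "orange"), (1e6, "yellow")]
SEVERITY = {"green": 0, "yellow": 1, "orange": 2, "red": 3}

def alert_from(value, thresholds):
    """Return the highest alert color whose threshold the value reaches."""
    for threshold, color in thresholds:
        if value >= threshold:
            return color
    return "green"

def eis_alert(est_fatalities, est_loss_usd):
    """Combine the two criteria by taking the more severe alert."""
    a = alert_from(est_fatalities, FATALITY_THRESHOLDS)
    b = alert_from(est_loss_usd, LOSS_THRESHOLDS)
    return a if SEVERITY[a] >= SEVERITY[b] else b

print(eis_alert(0, 5e5))    # below all thresholds
print(eis_alert(150, 5e7))  # fatality criterion dominates
print(eis_alert(20, 2e9))   # loss criterion dominates
```

Taking the maximum of the two criteria reflects the paper's rationale: casualty-driven alerts capture events in vulnerable building stock, while loss-driven alerts capture costly events in well-engineered regions.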
Earthquake Early Warning and Public Policy: Opportunities and Challenges
NASA Astrophysics Data System (ADS)
Goltz, J. D.; Bourque, L.; Tierney, K.; Riopelle, D.; Shoaf, K.; Seligson, H.; Flores, P.
2003-12-01
Development of an earthquake early warning capability and pilot project were objectives of TriNet, a 5-year (1997-2001) FEMA-funded project to develop a state-of-the-art digital seismic network in southern California. In parallel with research to assemble a protocol for rapid analysis of earthquake data and transmission of a signal by TriNet scientists and engineers, the public policy, communication and educational issues inherent in implementation of an earthquake early warning system were addressed by TriNet's outreach component. These studies included: 1) a survey that identified potential users of an earthquake early warning system and how an earthquake early warning might be used in responding to an event, 2) a review of warning systems and communication issues associated with other natural hazards and how lessons learned might be applied to an alerting system for earthquakes, 3) an analysis of organization, management and public policy issues that must be addressed if a broad-based warning system is to be developed and 4) a plan to provide earthquake early warnings to a small number of organizations in southern California as an experimental prototype. These studies provided needed insights into the social and cultural environment in which this new technology will be introduced, an environment with opportunities to enhance our response capabilities but also an environment with significant barriers to overcome to achieve a system that can be sustained and supported. In this presentation we will address the main public policy issues that were subjects of analysis in these studies. They include a discussion of the possible division of functions among organizations likely to be the principal partners in the management of an earthquake early warning system. 
Drawing on lessons learned from warning systems for other hazards, we will review the potential impacts of false alarms and missed events on warning system credibility, the acceptability of fully automated warning systems and equity issues associated with possible differential access to warnings. Finally, we will review the status of legal authorities and liabilities faced by organizations that assume various warning system roles and possible approaches to setting up a pilot project to introduce early warning. Our presentation will suggest that introducing an early warning system requires multi-disciplinary and multi-agency cooperation and thoughtful discussion among organizations likely to be providers and participants in an early warning system. Recalling our experience with earthquake prediction, we will look at early warning as a promising but unproven technology and recommend moving forward with caution and patience.
Initial source and site characterization studies for the U.C. Santa Barbara campus
DOE Office of Scientific and Technical Information (OSTI.GOV)
Archuleta, R.; Nicholson, C.; Steidl, J.
1997-12-01
The University of California Campus-Laboratory Collaboration (CLC) project is an integrated 3-year effort involving Lawrence Livermore National Laboratory (LLNL) and four UC campuses - Los Angeles (UCLA), Riverside (UCR), Santa Barbara (UCSB), and San Diego (UCSD) - plus additional collaborators at San Diego State University (SDSU), at Los Alamos National Laboratory, and in industry. The primary purpose of the project is to estimate potential ground motions from large earthquakes and to predict site-specific ground motions for one critical structure on each campus. This project thus combines the disciplines of geology, seismology, geodesy, soil dynamics, and earthquake engineering into a fully integrated approach. Once completed, the CLC project will provide a template to evaluate other buildings at each of the four UC campuses, as well as a methodology for evaluating seismic hazards at other critical sites in California, including other UC locations at risk from large earthquakes. Another important objective of the CLC project is the education of students and other professionals in the application of this integrated, multidisciplinary, state-of-the-art approach to the assessment of earthquake hazard. For each campus targeted by the CLC project, the seismic hazard study will consist of four phases: Phase I - initial source and site characterization; Phase II - drilling, logging, seismic monitoring, and laboratory dynamic soil testing; Phase III - modeling of predicted site-specific earthquake ground motions; and Phase IV - calculations of 3D building response. This report covers Phase I for the UCSB campus and includes results through March 1997.
Faris, Allison T.; Seed, Raymond B.; Kayen, Robert E.; Wu, Jiaer
2006-01-01
During the 1906 San Francisco Earthquake, liquefaction-induced lateral spreading and the resulting ground displacements damaged bridges, buried utilities and lifelines, conventional structures, and other developed works. This paper presents an improved engineering tool for the prediction of maximum displacement due to liquefaction-induced lateral spreading. A semi-empirical approach is employed, combining mechanistic understanding and data from laboratory testing with data and lessons from full-scale earthquake field case histories. The principle of the strain potential index, based primarily on correlation of cyclic simple shear laboratory testing results with in-situ Standard Penetration Test (SPT) results, is used as an index to characterize the deformation potential of soils after they liquefy. A Bayesian probabilistic approach is adopted for development of the final predictive model, in order to take fullest advantage of the available data and to deal with the uncertainties intrinsic to the back-analysis of field case histories. A case history from the 1906 San Francisco Earthquake is used to demonstrate the ability of the resulting semi-empirical model to estimate maximum horizontal displacement due to liquefaction-induced lateral spreading.
Ma, Jiaqi; Zhou, Maigeng; Li, Yanfei; Guo, Yan; Su, Xuemei; Qi, Xiaopeng; Ge, Hui
2009-05-01
To describe the design and application of an emergency response mobile phone-based information system for infectious disease reporting. Software engineering and business modeling were used to design and develop the emergency response mobile phone-based information system for infectious disease reporting. Seven days after the initiation of the reporting system, the reporting rate in the earthquake zone reached the level of the same period in 2007, using the mobile phone-based information system. Surveillance of the weekly report on morbidity in the earthquake zone after the initiation of the mobile phone reporting system showed the same trend as the previous three years. The emergency response mobile phone-based information system for infectious disease reporting was an effective solution to transmit urgently needed reports and manage communicable disease surveillance information. This assured the consistency of disease surveillance and facilitated sensitive, accurate, and timely disease surveillance. It is an important backup for the internet-based direct reporting system for communicable disease. © 2009 Blackwell Publishing Asia Pty Ltd and Chinese Cochrane Center, West China Hospital of Sichuan University.
Toward Exascale Earthquake Ground Motion Simulations for Near-Fault Engineering Analysis
Johansen, Hans; Rodgers, Arthur; Petersson, N. Anders; ...
2017-09-01
Modernizing SW4 for massively parallel time-domain simulations of earthquake ground motions in 3D earth models increases resolution and provides ground motion estimates for critical infrastructure risk evaluations. Simulations of ground motions from large (M ≥ 7.0) earthquakes require domains on the order of 100 to 500 km and spatial granularity on the order of 1 to 5 m, resulting in hundreds of billions of grid points. Surface-focused structured mesh refinement (SMR) allows for more constant grid-points-per-wavelength scaling in typical Earth models, where wavespeeds increase with depth. In fact, SMR allows simulations to double the frequency content relative to a fixed-grid calculation on a given resource. The authors report improvements to the SW4 algorithm developed while porting the code to the Cori Phase 2 (Intel Xeon Phi) systems at the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory. Investigations of the performance of the innermost loop of the calculations found that reorganizing the order of operations can improve performance for massive problems.
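The grid-point counts quoted above follow from simple arithmetic on domain size and spacing. The helper below is a back-of-the-envelope sketch for a uniform grid; the example domain dimensions are illustrative, not taken from an SW4 run, and SMR reduces the count substantially by coarsening with depth:

```python
def uniform_grid_points(x_km, y_km, z_km, h_m):
    """Number of points in a uniform grid of spacing h_m over an x*y*z km box."""
    nx = int(x_km * 1000 / h_m) + 1
    ny = int(y_km * 1000 / h_m) + 1
    nz = int(z_km * 1000 / h_m) + 1
    return nx * ny * nz

# Even a modest 100 x 100 x 25 km domain at 10 m spacing
# already reaches hundreds of billions of points:
n = uniform_grid_points(100, 100, 25, 10.0)
print(f"{n:.2e} grid points")
```

Halving the spacing multiplies the count by eight (and, for an explicit time-stepping scheme, also halves the stable time step), which is why mesh refinement that coarsens with depth is essential at these scales.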
NASA Astrophysics Data System (ADS)
Aguiar, Roberto; Rivas-Medina, Alicia; Caiza, Pablo; Quizanga, Diego
2017-03-01
The Metropolitan District of Quito is located on or very close to segments of reverse blind faults, Puengasí, Ilumbisí-La Bota, Carcelen-El Inca, Bellavista-Catequilla and Tangahuilla, making it one of the most seismically dangerous cities in the world. The city is divided into five areas: south, south-central, central, north-central and north. For each of the urban areas, elastic response spectra are presented in this paper, determined by utilizing some of the new models of the Pacific Earthquake Engineering Research Center (PEER) NGA-West2 program. These spectra are calculated considering the maximum magnitude that could be generated by the rupture of each fault segment, and taking into account the soil type at different points of the city according to the Norma Ecuatoriana de la Construcción (2015). Subsequently, the recurrence period of high-magnitude earthquakes on each fault segment is determined from the physical parameters of the fault segments (size of the fault plane and slip rate), using a Gutenberg-Richter recurrence model with doubly truncated magnitudes (Mmin and Mmax).
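The doubly truncated Gutenberg-Richter model mentioned above has a closed-form magnitude distribution, from which recurrence periods follow directly. A sketch; the b-value, rate, and magnitude bounds in the example are placeholders, not the Quito fault-segment parameters:

```python
import math

def gr_ccdf(m, m_min, m_max, b=1.0):
    """P(M >= m) under the doubly truncated Gutenberg-Richter law on [m_min, m_max]."""
    beta = b * math.log(10.0)
    if m <= m_min:
        return 1.0
    if m >= m_max:
        return 0.0
    num = math.exp(-beta * (m - m_min)) - math.exp(-beta * (m_max - m_min))
    den = 1.0 - math.exp(-beta * (m_max - m_min))
    return num / den

def recurrence_years(m, annual_rate_above_min, m_min, m_max, b=1.0):
    """Mean recurrence period of events with M >= m, given the annual rate of M >= m_min."""
    return 1.0 / (annual_rate_above_min * gr_ccdf(m, m_min, m_max, b))

# Placeholder example: 0.5 events/yr with M >= 4.5 on a segment capped at Mmax = 6.4
print(f"recurrence of M >= 6.0: {recurrence_years(6.0, 0.5, 4.5, 6.4):.0f} years")
```

The upper truncation Mmax is what ties the recurrence model to the fault physics: it is set from the rupture area of each segment, while the overall rate is constrained by the slip rate.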
Seismic and tsunami hazard in Puerto Rico and the Virgin Islands
Dillon, William P.; Frankel, Arthur D.; Mueller, Charles S.; Rodriguez, Rafael W.; ten Brink, Uri S.
1999-01-01
Executive Summary
Puerto Rico and the Virgin Islands are located at an active plate boundary between the North American plate and the northeast corner of the Caribbean plate. The region was subject in historical times to large-magnitude earthquakes and devastating tsunamis. A major downward tilt of the sea floor north of Puerto Rico and the Virgin Islands, large submarine rockslides, and an unusually large negative gravity anomaly are also indicative of a tectonically active region. Scientists have so far been unable to explain the deformation of this region in as coherent and predictable a picture as exists for California, and this has hampered their ability to assess seismic and tsunami hazards in the region. The NE corner of the Caribbean is unique among the seismically active regions of the United States in that it is mostly covered by water. This fact presents an additional challenge for seismic and tsunami hazard assessment and mitigation. The workshop, convened in San Juan on March 23-24, 1999, was "historic" in that it brought together for the first time a broad spectrum of scientists, engineers, and public- and private-sector officials who deal with such diverse questions as tectonic models, probabilistic assessment of seismic hazard, prediction of tsunami runup, strong ground motion, building codes, stability of man-made structures, and the public's preparedness for natural disasters. It was an opportunity for all the participants to find out how their own activity fits into the broad picture of science and how it aids society in hazard assessment and mitigation. In addition, the workshop was offered as a continuing education course at the Colegio de Ingenieros y Agrimensores de Puerto Rico, which assured a rapid dissemination of the results to the local community. 
A news conference which took place during the workshop alerted the public to the efforts of the USGS, other Federal agencies, the Commonwealth of Puerto Rico, universities, and the private sector. During the first day of the workshop, participants from universities, federal institutions, and consulting firms in Puerto Rico, the Virgin Islands, the continental U.S., the Dominican Republic, and Europe reviewed the present state of knowledge, including a review and discussion of present plate models, recent GPS and seismic reflection data, seismicity, paleoseismology, and tsunamis. The state of earthquake/tsunami studies in Puerto Rico was presented by several faculty members from the University of Puerto Rico at Mayaguez. A preliminary seismic hazard map was presented by the USGS, and previous hazard maps and economic loss assessments were considered. During the second day, the participants divided into working groups and prepared specific recommendations for future activities in the region along the six topics below. Highlights of these recommended activities are:
Marine geology and geophysics - Acquire deep-penetration seismic reflection and refraction data, deploy temporary ocean bottom seismometer arrays to record earthquakes, collect high-resolution multibeam bathymetry and side scan sonar data of the region, in particular the near-shore region, and conduct focused high-resolution seismic studies around faults. Determine slip rates of specific offshore faults. Assemble a GIS database for available marine geological and geophysical data.
Paleoseismology and active faults - Field reconnaissance aimed at identifying Quaternary faults and determining their paleoseismic chronology and slip rates, as well as identifying and dating paleoliquefaction features from large earthquakes. Quaternary mapping of marine terraces, fluvial terraces and basins, beach ridges, etc., to establish a framework for understanding neotectonic deformation of the island. Interpretation of aerial photography to identify possible Quaternary faults.
Earthquake seismology - Determine an empirical seismic attenuation function using observations from local seismic networks and recently installed broad-band stations. Evaluate existing earthquake catalogs from local networks and regional stations, and complete the catalogs. Transcribe the pre-1991 network data from 9-track tape onto more stable archival media. Calibrate instruments of local networks. Use GPS measurements to constrain deformation rates used in seismic-hazard maps.
Engineering - Prepare liquefaction susceptibility maps for the urban areas. Update and improve databases for types of site conditions. Collect site-effect observations and near-surface geophysical measurements for future local (urban-area) hazard maps. Expand the number of instruments in the strong motion program. Develop fragility curves for Puerto Rico construction types and details, and carry out laboratory testing on selected types of mass-produced construction. Consider tsunami design in shoreline construction projects.
Tsunami hazard - Extract tsunami observations from archives and develop a Caribbean historical tsunami database. Analyze prehistoric tsunami deposits. Collect accurate, up-to-date, near-shore topography and bathymetry for accurate inundation models. Prepare tsunami flooding and evacuation maps. Establish a Caribbean Tsunami Warning System for Puerto Rico and the Virgin Islands. Evaluate local, regional, national, and global seismic networks and equipment, and their role in a tsunami warning system.
Societal concerns - Prepare warning messages, protocols, and evacuation routes for earthquake, tsunami, and landslide hazards for Puerto Rico and the U.S. Virgin Islands. Advocate enforcement of existing building codes. Prepare non-technical hazard assessment maps for political and educational uses. Raise the awareness of potentially affected populations by presentations at elementary schools, by the production of a tsunami video, and by distribution of earthquake preparedness manuals in newspaper supplements. Promote partnerships at the state and federal level for long-term earthquake and tsunami hazard mitigation. This partnership should also include the private sector, such as the insurance industry, telecommunication companies, and the engineering community.
The following reports of the various working groups are the cumulative recommendations of the community of scientists, engineers, and public officials who participated in the workshop. The list of participants and the workshop's agenda are given in the appendix.
Marine Geology and Geophysics Working Group
Paleoseismology and Active Faults Working Group
Joint Working Group for Earthquake Seismology and Engineering
Tsunami Working Group
Societal Concerns Working Group
APPALACHIAN FOLDS, LATERAL RAMPS, AND BASEMENT FAULTS: A MODERN ENGINEERING PROBLEM?
Pohn, Howard A.
1987-01-01
Field studies and analysis of radar data have shown that cross-strike faulting in the central and southern Appalachians has affected geologic structures at the surface. These basement faults appear to have been active through much of geologic time. Indeed, more than 45 percent of modern earthquakes occur along these narrow zones, here termed 'lateral ramps.' Because of this seismic activity, these lateral ramps are likely to be zones that are prone to slope failure. The engineer should be aware of the presence of such zones and the higher landslide potential along them.
Vulnerability of housing buildings in Bucharest, Romania
NASA Astrophysics Data System (ADS)
Bostenaru, M.
2009-04-01
The author participates in the World Housing Encyclopedia project (www.world-housing.net), an internet-based database of housing buildings in earthquake-prone areas of the world. This is a voluntary project run by the Earthquake Engineering Research Institute, Oakland, California, and the International Association for Earthquake Engineering, with financial means available only for the website where the information is shared. For broader dissemination, a summary publication of the reports to date was published in 2004. The database can be queried for various parameters and browsed by geographic distribution. Participation is open to any housing expert. Between 2003 and 2006 the author was also a member of the editorial board. The author contributed numerous reports about building types in Romania, and one each about building types in Germany and Switzerland. This presentation is about the contributed reports on building types in Romania. Eight reports on building types from Bucharest were contributed to the Encyclopedia, while in further research by the author one more type was similarly described regarding vulnerability and seismic retrofit. The selection of these types was made considering the historic development of the built substance in Bucharest from 1850 on, the time from which a representative amount of housing buildings that can be classified into typologies can be found in Bucharest. While the structural types are not necessarily characteristic of the architectural style, since a style has other time limits, often appearing before the type became common and remaining in practice after another style gained ground, a historic succession can be seen in this case as well. 
The nine types considered can be grouped into seven time categories:
- 1850-1880: a vernacular housing type with masonry load-bearing walls and timber floors;
- 1880-1920: the type of two-storey or multi-storey house with masonry walls and timber floors (in which stylistically the "national style" flourished);
- 1920-1940: the type with a reinforced concrete skeleton for gravitational loads only (in which the "interwar style", or Romanian Modernism, flourished);
- the time immediately after 1940 (when a strong earthquake struck Bucharest), roughly 1940-1947, when the former structural type was continued but with some improvements, for which a type with reinforced concrete diagonals was considered;
- 1947-1977, before the strong earthquake of 1977, when cast-in-situ reinforced concrete structural wall buildings were widespread. Two types are considered: one that displayed low earthquake vulnerability and one that displayed high earthquake vulnerability;
- 1977-1989, after the strong earthquake of 1977 and before the fall of the communist regime, when, citing the strong earthquake, the regime started to implement another type of building, structurally often still of the reinforced concrete structural wall type, but prefabricated;
- the time after 1989, when moment resisting frames were built for more flexibility, and some of the unfinished moment resisting frame buildings were completed.
To have such a complete description of all the building types in a country is not common for the World Housing Encyclopedia, and having them for Romania was due to a particular effort by the author. At the same time, the database allows finding similar types in other parts of the world. Broadly speaking, each report includes two sections: the first, more extended, on the vulnerability of the buildings, and the second on seismic retrofit. 
The reports contain completed checklists, descriptions of the structural system, photographs, and drawings. The emphasis in this presentation is on the identification of seismic deficiencies and earthquake-resilient features, and the associated typical damage, all of which describe the vulnerability.
NASA Astrophysics Data System (ADS)
Al-Homoud, A.
2003-04-01
On March 11, 2002, at midnight, the Fujairah Masafi region in the UAE was shaken by an earthquake of shallow depth and local magnitude m = 5.1 on the Richter scale. The earthquake occurred on the Dibba fault in the UAE, with the epicenter 20 km NW of Fujairah city. The focal depth was just 10 km. The earthquake was felt in most parts of the northern emirates: Dubai, Sharjah, Ajman, Ras Al-Khaima, and Um-Qwain. The "main shock" was followed in the following weeks by more than twenty-five earthquakes with local magnitudes ranging from m = 4 to m = 4.8. Those earthquakes were located along the Zagros reverse faulting system on the Iranian side of the Arabian Gulf, opposite the shores of the UAE. Most of these earthquakes were also shallow and were actually felt by the people. However, there was another strong earthquake in early April 2002 in the same Masafi region with local magnitude m = 5.1 and, owing to its greater focal depth of 30 km, it was not felt by the northern emirates' residents. No major structural damage to buildings and lifeline systems was reported in the several cities located in the vicinity of the earthquake epicenter. The very small ground accelerations were not enough to test the structural integrity of tall buildings and major infrastructure. Future major earthquakes anticipated in close vicinity of the northern emirates, once they occur, and considering the noticeable local site effects of the emirates' sandy soils with high water table levels, will surely put these newly constructed buildings to a real test. 
Although there were no casualties in the March 11th event, there was major fear as a result of the loud sound of rock rupture heard in the mountains close to Masafi, the noticeable disturbance of animals and birds minutes before and during the incident, cracks in a good number of Masafi buildings, and major damage to "old" buildings of the Fujairah Masafi area, the closest city to the epicenter of the earthquake. Indeed, the March 11, 2002 earthquake and its aftershocks scared the citizens of Masafi and the surrounding regions and drew the attention of the public and government to earthquake hazard, especially since this earthquake came one year after the nearby destructive m = 6.5 earthquake in India. The recent destructive m = 6.2 earthquake of June 22 that hit northwest Iran has again reminded the UAE public and government of the need to take quick and concrete measures to mitigate any anticipated earthquake hazard. This study reflects in some detail on the following aspects of the region and its vicinity: geological and tectonic setting, seismicity, earthquake activity database, and seismic hazard assessment. Moreover, it documents the following aspects of the March 11, 2002 earthquake: tectonics, seismology, instrumental seismic data, aftershocks, strong motion recordings, response spectra and local site effect analysis, and geotechnical effects and structural observations in the region affected by the earthquake. The study identifies local ground amplification effects and liquefaction hazard potential in some parts of the UAE. Moreover, it reflects on the coverage of the incident in the media, the public and government response, the state of earthquake engineering practice in the UAE construction industry, and national preparedness and public awareness issues.
However, it is concluded for this event that the mild damage that occurred in the Masafi region was due to poor construction quality and underestimation of the design base shear. Practical recommendations are suggested for the authorities to avoid damage to newly constructed buildings and lifelines in future, stronger earthquakes, in addition to recommendations for a national earthquake hazard mitigation strategy for the UAE, which is still missing. The recommendations include the development and implementation of a design code for earthquake loading in the UAE, development of macro- and micro-seismic hazard maps, development of local site effect and liquefaction hazard maps, installation of a national earthquake monitoring network, assessment of the vulnerability of critical structures and lifeline facilities, public awareness, training of civil defense rescue teams, etc.
Wald, David J.; Lin, Kuo-wan; Kircher, C.A.; Jaiswal, Kishor; Luco, Nicolas; Turner, L.; Slosky, Daniel
2017-01-01
The ShakeCast system is an openly available, near real-time post-earthquake information management system. ShakeCast is widely used by public and private emergency planners and responders, lifeline utility operators, and transportation engineers to automatically receive and process ShakeMap products for situational awareness, inspection priority, or damage assessment of their own infrastructure or building portfolios. The success of ShakeCast to date and its broad, critical-user base mandate improved software usability and functionality, including improved engineering-based damage and loss functions. In order to make the software more accessible to novice users—while still utilizing advanced users’ technical and engineering background—we have developed a “ShakeCast Workbook”, a well-documented, Excel spreadsheet-based user interface that allows users to input notification and inventory data and export the XML files requisite for operating the ShakeCast system. Users will be able to select structures based on a minimum set of user-specified facility characteristics (building location, size, height, use, construction age, etc.). “Expert” users will be able to import user-modified structural response properties into the facility inventory associated with the HAZUS Advanced Engineering Building Modules (AEBM). The goal of the ShakeCast system is to provide simplified real-time potential impact and inspection metrics (i.e., green, yellow, orange and red priority ratings) to allow users to institute customized earthquake response protocols.
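The priority-rating logic described above can be sketched as a simple threshold lookup. This is an illustrative assumption, not ShakeCast's actual implementation: the threshold values and the use of PGA as the sole intensity measure are placeholders.

```python
# Hypothetical sketch of a ShakeCast-style inspection-priority rating:
# compare a ShakeMap intensity measure at a facility against damage-state
# thresholds. Threshold values below are illustrative, not ShakeCast defaults.

PRIORITY_THRESHOLDS = [  # (minimum PGA in %g, rating), checked high to low
    (40.0, "red"),
    (20.0, "orange"),
    (10.0, "yellow"),
]

def inspection_priority(pga_pct_g: float) -> str:
    """Return a green/yellow/orange/red inspection rating for a facility."""
    for threshold, rating in PRIORITY_THRESHOLDS:
        if pga_pct_g >= threshold:
            return rating
    return "green"
```

In the real system, notification and inventory data drive which facilities are evaluated and who is notified; this sketch only shows the rating step.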
Previously, fragilities were approximated using individual ShakeMap intensity measures (IMs, specifically PGA and 0.3 s and 1 s spectral accelerations) for each facility, but we are now performing capacity-spectrum damage state calculations using a more robust characterization of spectral demand. We are also developing methods for the direct import of ShakeMap’s multi-period spectra in lieu of the assumed three-domain design spectrum (0.3 s for constant acceleration; 1 s or 3 s for constant velocity; and constant displacement at very long response periods). As part of ongoing ShakeCast research and development, we will also explore the use of ShakeMap IM uncertainty estimates and evaluate the assumption of employing multiple response spectral damping values rather than the single 5%-damped value currently employed. Developing and incorporating advanced fragility assignments into the ShakeCast Workbook requires related software modifications and database improvements; these enhancements are part of an extensive rewrite of the ShakeCast application.
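The three-domain design spectrum mentioned above can be written as a simple piecewise function of period: a constant-acceleration plateau at short periods, a 1/T constant-velocity branch, and a 1/T² constant-displacement branch at long periods. The amplitude values in this sketch are illustrative placeholders, not ShakeMap or code values.

```python
def three_domain_spectrum(T, Sa_short=1.0, Sa_1s=0.4, T_long=3.0):
    """Pseudo-spectral acceleration (in g) of an idealized three-domain
    design spectrum: constant acceleration at short periods, constant
    velocity (Sa ~ 1/T) at intermediate periods, and constant displacement
    (Sa ~ 1/T**2) beyond T_long. Parameter values are illustrative only.
    """
    Ts = Sa_1s / Sa_short        # transition: plateau -> constant-velocity branch
    if T <= Ts:
        return Sa_short          # constant-acceleration domain
    if T <= T_long:
        return Sa_1s / T         # constant-velocity domain
    return Sa_1s * T_long / T ** 2   # constant-displacement domain
```

The branches are continuous at both transition periods, which is the defining property of this idealization; a multi-period ShakeMap spectrum replaces all three branches with directly computed ordinates.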
Engineering and Design: Structural Deformation Surveying
2002-06-01
loading deformations. Long-term measurements are far more common and somewhat more complex given their external nature. Long-term monitoring of a...fitting of structural elements, environmental protection, and development of mitigative measures in the case of natural disasters (landslides, earthquakes...of additional localized monitoring points (i.e., points not intended for routine observation) to determine the nature and extent of large displacements
2007-10-10
Dipartimento di Meccanica Strutturale, Università degli Studi di Pavia; Dipartimento di Matematica, Università degli Studi di Pavia; European Centre for Training and Research in Earthquake Engineering, Pavia; Istituto di Matematica Applicata e Tecnologie Informatiche del CNR, Pavia. “Comparisons
1980-01-01
unverified listings were acquired from an unpublished map (1:250,000) and report compiled by the Geological Survey of Alabama (Self and others...studies. These parameters, when properly modified to account for the dynamic characteristics of the structure, define the dynamic response of an
Shotcrete for Expedient Structural Repair
1991-12-01
pp. 29-44. Selmer-Olsen, R., "Examples of the Behavior of Shotcrete Linings Underground," Proceedings, Shotcrete for Ground Support, The Engineering... Selmer-Olsen, R. PAPER TITLE: Examples of the Behavior of Shotcrete Linings Underground DESCRIPTIVE TITLE NOTE: -0- BOOK/REPORT TITLE: Proceedings...prestressed tanks, thin overlays over structural materials, repair of concrete deteriorated by fire or earthquake, rock slope stabilization, and
Reducing Disaster Vulnerability Through Science and Technology
2003-07-01
engineering design. Source: “Massive Alaska Earthquake Rocks the Mainland,” Volcano Watch, Hawaiian Volcano Observatory, November 14, 2002, http... volcanoes, and landslides ■ Disease epidemics ■ Technological disasters, including critical infrastructure threats, oil and chemical spills, and building...risk reduction can enhance protection of buildings even in these high-risk areas. Volcanoes The United States is among the most volcanically active
2000-07-01
Engineering bench system hardware for the Mechanics of Granular Materials (MGM) experiment is tested on a lab bench at the University of Colorado in Boulder. This is done in a horizontal arrangement to reduce pressure differences so the tests more closely resemble behavior in the microgravity of space. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions such as earthquakes or when powders are handled in industrial processes. MGM experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grain materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. (Credit: University of Colorado at Boulder).
Estimating structural collapse fragility of generic building typologies using expert judgment
Jaiswal, Kishor; Wald, David J.; Perkins, David M.; Aspinall, Willy P.; Kiremidjian, Anne S.
2014-01-01
The structured expert elicitation process proposed by Cooke (1991), hereafter referred to as Cooke's approach, is applied for the first time in the realm of structural collapse-fragility assessment for selected generic construction types. Cooke's approach works on the principle of objective calibration scoring of judgments coupled with hypothesis testing used in classical statistics. The performance-based scoring system reflects the combined measure of an expert's informativeness about variables in the problem area under consideration, and their ability to enumerate, in a statistically accurate way through expressing their true beliefs, the quantitative uncertainties associated with their assessments. We summarize the findings of an expert elicitation workshop in which a dozen earthquake-engineering professionals from around the world were engaged to estimate seismic collapse fragility for generic construction types. Development of seismic collapse fragility functions was accomplished by combining their judgments using weights derived from Cooke's method. Although substantial effort was needed to elicit the inputs of these experts successfully, we anticipate that the elicitation strategy described here will gain momentum in a wide variety of earthquake seismology and engineering hazard and risk analyses where physical model and data limitations are inherent and objective professional judgment can fill gaps.
Estimating structural collapse fragility of generic building typologies using expert judgment
Jaiswal, Kishor S.; Wald, D.J.; Perkins, D.; Aspinall, W.P.; Kiremidjian, Anne S.; Deodatis, George; Ellingwood, Bruce R.; Frangopol, Dan M.
2014-01-01
The structured expert elicitation process proposed by Cooke (1991), hereafter referred to as Cooke’s approach, is applied for the first time in the realm of structural collapse-fragility assessment for selected generic construction types. Cooke’s approach works on the principle of objective calibration scoring of judgments coupled with hypothesis testing used in classical statistics. The performance-based scoring system reflects the combined measure of an expert’s informativeness about variables in the problem area under consideration, and their ability to enumerate, in a statistically accurate way through expressing their true beliefs, the quantitative uncertainties associated with their assessments. We summarize the findings of an expert elicitation workshop in which a dozen earthquake-engineering professionals from around the world were engaged to estimate seismic collapse fragility for generic construction types. Development of seismic collapse fragility functions was accomplished by combining their judgments using weights derived from Cooke’s method. Although substantial effort was needed to elicit the inputs of these experts successfully, we anticipate that the elicitation strategy described here will gain momentum in a wide variety of earthquake seismology and engineering hazard and risk analyses where physical model and data limitations are inherent and objective professional judgment can fill gaps.
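The pooling step of Cooke's approach can be sketched as follows. The hard, substantive part of the method is deriving each expert's calibration and information scores from their performance on seed questions with known answers; the scores in this sketch are invented placeholders, and only the weighting and linear-pooling arithmetic is shown.

```python
# Illustrative sketch of the combination step in Cooke's classical model:
# each expert receives a performance-based weight proportional to the
# product of a calibration score and an information score, zeroed below a
# calibration cutoff alpha; the combined estimate is then a weighted
# linear pool of the experts' judgments. Scores here are made up.

def cooke_weights(calibration, information, alpha=0.05):
    """Normalized weights: w_i proportional to C_i * I_i if C_i >= alpha, else 0."""
    raw = [c * i if c >= alpha else 0.0 for c, i in zip(calibration, information)]
    total = sum(raw)
    return [r / total for r in raw]

def linear_pool(weights, estimates):
    """Weighted combination of each expert's estimate of one quantity."""
    return sum(w * q for w, q in zip(weights, estimates))

w = cooke_weights([0.30, 0.01, 0.15], [1.2, 2.0, 0.8])
# the second expert falls below the calibration cutoff and gets zero weight
median_collapse_capacity = linear_pool(w, [0.9, 2.5, 1.1])  # e.g. median capacity in g
```

In the full method the pool is formed over each expert's quantile assessments (typically 5th, 50th, 95th) rather than a single point value, and the cutoff alpha is itself optimized.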
2000-07-01
What appear to be boulders fresh from a tumble down a mountain are really grains of Ottawa sand, a standard material used in civil engineering tests and also used in the Mechanics of Granular Materials (MGM) experiment. The craggy surface shows how sand grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions such as earthquakes or when powders are handled in industrial processes. MGM uses the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grain materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. These images are from an Electron Spectroscopy for Chemical Analysis (ESCA) study conducted by Dr. Binayak Panda of IITRI for Marshall Space Flight Center (MSFC). (Credit: NASA/MSFC)
Mechanics of Granular Materials (MGM) Flight Hardware in Bench Test
NASA Technical Reports Server (NTRS)
2000-01-01
Engineering bench system hardware for the Mechanics of Granular Materials (MGM) experiment is tested on a lab bench at the University of Colorado in Boulder. This is done in a horizontal arrangement to reduce pressure differences so the tests more closely resemble behavior in the microgravity of space. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions such as earthquakes or when powders are handled in industrial processes. MGM experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grain materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. (Credit: University of Colorado at Boulder).
Hilbert-Huang transform analysis of dynamic and earthquake motion recordings
Zhang, R.R.; Ma, S.; Safak, E.; Hartzell, S.
2003-01-01
This study examines the rationale of the Hilbert-Huang transform (HHT) for analyzing dynamic and earthquake motion recordings in studies of seismology and engineering. In particular, this paper first provides the fundamentals of the HHT method, which consist of the empirical mode decomposition (EMD) and the Hilbert spectral analysis. It then uses the HHT to analyze recordings of hypothetical and real wave motion, the results of which are compared with the results obtained by the Fourier data processing technique. The analysis of the two recordings indicates that the HHT method is able to extract some motion characteristics useful in studies of seismology and engineering, which might not be exposed effectively and efficiently by the Fourier data processing technique. Specifically, the study indicates that the decomposed components in the EMD of HHT, namely the intrinsic mode function (IMF) components, contain observable, physical information inherent to the original data. It also shows that the grouped IMF components, namely the EMD-based low- and high-frequency components, can faithfully capture low-frequency pulse-like as well as high-frequency wave signals. Finally, the study illustrates that the HHT-based Hilbert spectra are able to reveal the temporal-frequency energy distribution for motion recordings precisely and clearly.
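The Hilbert spectral analysis half of HHT can be illustrated on a single component: form the analytic signal of one IMF and read off instantaneous amplitude and frequency. This sketch uses a pure sinusoid as a stand-in for an IMF and builds the analytic signal via FFT (equivalent to `scipy.signal.hilbert`); a full HHT would first sift the recording into IMFs by EMD, which is omitted here.

```python
import numpy as np

def analytic_signal(x):
    """Analytic signal of a real series via the FFT one-sided-spectrum trick."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    h[1:(n + 1) // 2] = 2.0      # double the positive frequencies
    if n % 2 == 0:
        h[n // 2] = 1.0          # Nyquist bin kept once for even n
    return np.fft.ifft(X * h)

fs = 100.0                        # sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
imf = np.sin(2 * np.pi * 5.0 * t)  # a 5 Hz test signal standing in for one IMF

z = analytic_signal(imf)
amplitude = np.abs(z)                            # instantaneous amplitude (~1 here)
phase = np.unwrap(np.angle(z))
inst_freq = np.diff(phase) / (2 * np.pi) * fs    # instantaneous frequency (~5 Hz)
```

Plotting `inst_freq` and `amplitude` against time for every IMF, with amplitude as intensity, gives the Hilbert spectrum (the temporal-frequency energy distribution the abstract describes).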
An Evaluation of Infrastructure for Tsunami Evacuation in Padang, West Sumatra, Indonesia (Invited)
NASA Astrophysics Data System (ADS)
Cedillos, V.; Canney, N.; Deierlein, G.; Diposaptono, S.; Geist, E. L.; Henderson, S.; Ismail, F.; Jachowski, N.; McAdoo, B. G.; Muhari, A.; Natawidjaja, D. H.; Sieh, K. E.; Toth, J.; Tucker, B. E.; Wood, K.
2009-12-01
Padang has one of the world’s highest tsunami risks due to its high hazard, vulnerable terrain and population density. The current strategy to prepare for tsunamis in Padang is focused on developing early warning systems, planning evacuation routes, conducting evacuation drills, and raising local awareness. Although these are all necessary, they are insufficient. Padang’s proximity to the Sunda Trench and flat terrain make reaching safe ground impossible for much of the population. The natural warning in Padang - a strong earthquake that lasts over a minute - will be the first indicator of a potential tsunami. People will have about 30 minutes after the earthquake to reach safe ground. It is estimated that roughly 50,000 people in Padang will be unable to evacuate in that time. Given these conditions, other means to prepare for the expected tsunami must be developed. With this motivation, GeoHazards International and Stanford University’s Chapter of Engineers for a Sustainable World partnered with Indonesian organizations - Andalas University and Tsunami Alert Community in Padang, Laboratory for Earth Hazards, and the Ministry of Marine Affairs and Fisheries - in an effort to evaluate the need for and feasibility of tsunami evacuation infrastructure in Padang. Tsunami evacuation infrastructure can include earthquake-resistant bridges and evacuation structures that rise above the maximum tsunami water level, and can withstand the expected earthquake and tsunami forces. The choices for evacuation structures vary widely - new and existing buildings, evacuation towers, soil berms, elevated highways and pedestrian overpasses. 
This interdisciplinary project conducted a course at Stanford University, undertook several field investigations, and concluded that: (1) tsunami evacuation structures and bridges are essential to protect the people in Padang, (2) there is a need for a more thorough engineering-based evaluation than conducted to date of the suitability of existing buildings to serve as evacuation structures, and of existing bridges to serve as elements of evacuation routes, and (3) additions to Padang’s tsunami evacuation infrastructure must carefully take into account technical matters (e.g. expected wave height, debris impact forces), social considerations (e.g. cultural acceptability, public confidence in the structure’s integrity), and political issues (e.g. land availability, cost, maintenance). Future plans include collaboration between U.S. and Indonesian engineers in developing designs for new tsunami evacuation structures, as well as providing training for Indonesian authorities on: (1) siting, designing, and constructing tsunami evacuation structures, and (2) evaluating the suitability of existing buildings to serve as tsunami evacuation shelters.
Seismic hazard and risk assessment for large Romanian dams situated in the Moldavian Platform
NASA Astrophysics Data System (ADS)
Moldovan, Iren-Adelina; Popescu, Emilia; Otilia Placinta, Anica; Petruta Constantin, Angela; Toma Danila, Dragos; Borleanu, Felix; Emilian Toader, Victorin; Moldoveanu, Traian
2016-04-01
Besides periodical technical inspections and the monitoring and surveillance of dams' related structures and infrastructure, there are further seismic-specific requirements for dam safety. The most important is seismic risk assessment, which can be accomplished by rating the dams into seismic risk classes using the theory of Bureau and Ballentine (2002) and Bureau (2003), taking into account the maximum expected peak ground motions at the dam site (values obtained using probabilistic hazard assessment approaches; Moldovan et al., 2008), the structures' vulnerability, and the downstream risk characteristics (human, economic, historic and cultural heritage, etc.) in the areas that might be flooded in the case of a dam failure. Probabilistic seismic hazard (PSH), vulnerability, and risk studies will be carried out for dams situated in the Moldavian Platform, starting from the Izvorul Muntelui Dam, downstream along the Bistrita and then the Siret River and their tributaries. The most vulnerable dams will be studied in detail and flooding maps will be drawn to find the most exposed downstream localities, both for risk assessment studies and for warnings. GIS maps that clearly indicate potentially flooded areas are sufficient for these studies, giving information on the number of inhabitants and goods that may be affected; the topography included in geospatial servers is sufficient to produce them, and further site studies are not necessary for downstream risk assessment. The results will consist of local and regional seismic information, dam-specific characteristics and locations, seismic hazard maps and risk classes for all dam sites (more than 30 dams), inundation maps (for the most vulnerable dams in the region), and possibly affected localities.
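A rating scheme in the spirit of Bureau and Ballentine (2002) can be sketched as a total risk score combining structure-related factors and a downstream-hazard factor with a vulnerability rating, mapped onto discrete risk classes. The factor scales, the combination rule, and the class breaks below are invented placeholders for illustration, not the published values.

```python
# Loose sketch of dam seismic-risk rating: structure factors (reservoir
# capacity, dam height, age) plus a downstream-hazard factor, scaled by a
# seismic vulnerability rating, give a total score that maps to a risk
# class. All numbers here are hypothetical, not from Bureau (2003).

def total_risk_score(capacity_factor, height_factor, age_factor,
                     downstream_factor, vulnerability_rating):
    """Combine illustrative risk factors into a single score."""
    return (capacity_factor + height_factor + age_factor
            + downstream_factor) * vulnerability_rating

def risk_class(score):
    """Map a total risk score to a discrete class (invented breakpoints)."""
    if score >= 30:
        return "IV (extreme)"
    if score >= 20:
        return "III (high)"
    if score >= 10:
        return "II (moderate)"
    return "I (low)"
```

In the actual methodology, the vulnerability rating is tied to the site's expected peak ground motion from the probabilistic hazard study, which is how the PSH results feed the risk classes.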
The studies realized in this paper have as their final goal to provide local emergency services with warnings of a potential dam failure and ensuing flood as a result of a large earthquake, allowing further public training for evacuation. The work is supported by PNII/PCCA 2013 Project DARING 69/2014, financed by UEFISCDI, Romania. Bureau GJ (2003) "Dams and appurtenant facilities", Earthquake Engineering Handbook, CRC Press, WF Chen and C Scawthorn (eds.), Boca Raton, pp. 26.1-26.47. Bureau GJ and Ballentine GD (2002) "A comprehensive seismic vulnerability and loss assessment of the State of Carolina using HAZUS. Part IV: Dam inventory and vulnerability assessment methodology", 7th National Conference on Earthquake Engineering, July 21-25, Boston, Earthquake Engineering Research Institute, Oakland, CA. Moldovan IA, Popescu E, Constantin A (2008) "Probabilistic seismic hazard assessment in Romania: application for crustal seismic active zones", Romanian Journal of Physics, Vol. 53, Nos. 3-4.
Seismic damage identification using multi-line distributed fiber optic sensor system
NASA Astrophysics Data System (ADS)
Ou, Jinping; Hou, Shuang
2005-06-01
Determining the actual nonlinear inelastic response mechanisms developed by civil structures such as buildings and bridges during strong earthquakes, and performing post-earthquake damage assessment of these structures, represent very difficult challenges for earthquake structural engineers. One of the main reasons is that traditional sensors cannot serve for a period long enough to cover an earthquake, and the location of seismic damage in the structure cannot be definitely predicted in advance. The seismic damage of a reinforced concrete (RC) structure can be related to the maximum response of the structure, which can in turn be related to cracks in the concrete. A distributed fiber optic sensor was developed to detect cracks in reinforced concrete structures under load. Fiber optic couplers were used in the sensor system to extend its capacity from one random detection point to more. An optical time domain reflectometer (OTDR) is employed to interrogate the sensor signal. The fiber optic sensors are attached to the surface of the concrete with epoxy glue. By choosing the strength of the epoxy, the damage state of the concrete can be related to the occurrence of Fresnel scattering in the fiber optic sensor. Experiments involved monotonic loading to failure. Finally, the experimental results in terms of crack detection capability are presented and discussed.
NASA Astrophysics Data System (ADS)
Liu, Bo-Yan; Shi, Bao-Ping; Zhang, Jian
2007-05-01
In this study, a composite source model is used to calculate realistic strong ground motions in the Beijing area caused by the 1679 MS 8.0 Sanhe-Pinggu earthquake. The results provide useful physical parameters for future seismic hazard analysis in this area. Considering the regional geological/geophysical background, we simulated the scenario earthquake and the associated ground motions in the area ranging from 39.3°N to 41.1°N in latitude and from 115.35°E to 117.55°E in longitude. Some of the key factors that could influence the characteristics of strong ground motion are discussed, and the resulting peak ground acceleration (PGA) and peak ground velocity (PGV) distributions around the Beijing area are presented. A comparison of the simulated results with results derived from the attenuation relation is made, and the advantages and disadvantages of the composite source model are also discussed. The numerical results, such as the PGA, PGV, peak ground displacement (PGD), and the three-component time histories developed for the Beijing area, have potential applications in the earthquake engineering field and in building code design, especially for the evaluation of critical constructions, government decision making, and seismic hazard assessment by financial/insurance companies.
A method for producing digital probabilistic seismic landslide hazard maps
Jibson, R.W.; Harp, E.L.; Michael, J.A.
2000-01-01
The 1994 Northridge, California, earthquake is the first earthquake for which we have all of the data sets needed to conduct a rigorous regional analysis of seismic slope instability. These data sets include: (1) a comprehensive inventory of triggered landslides, (2) about 200 strong-motion records of the mainshock, (3) 1:24 000-scale geologic mapping of the region, (4) extensive data on engineering properties of geologic units, and (5) high-resolution digital elevation models of the topography. All of these data sets have been digitized and rasterized at 10 m grid spacing using ARC/INFO GIS software on a UNIX computer. Combining these data sets in a dynamic model based on Newmark's permanent-deformation (sliding-block) analysis yields estimates of coseismic landslide displacement in each grid cell from the Northridge earthquake. The modeled displacements are then compared with the digital inventory of landslides triggered by the Northridge earthquake to construct a probability curve relating predicted displacement to probability of failure. This probability function can be applied to predict and map the spatial variability in failure probability in any ground-shaking conditions of interest. We anticipate that this mapping procedure will be used to construct seismic landslide hazard maps that will assist in emergency preparedness planning and in making rational decisions regarding development and construction in areas susceptible to seismic slope failure. © 2000 Elsevier Science B.V. All rights reserved.
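The two steps described above can be sketched directly: Newmark's rigid sliding-block analysis integrates ground acceleration in excess of the block's critical (yield) acceleration twice to obtain permanent displacement, and a fitted curve converts displacement to failure probability. The probability coefficients below are quoted from the authors' published Northridge calibration (Jibson et al., 2000) and are only valid under the conditions stated there; verify against the paper before use.

```python
# Sketch of a Newmark rigid sliding-block analysis plus the fitted
# displacement-to-failure-probability curve described in the abstract.
import math

def newmark_displacement(acc, dt, ac):
    """Permanent displacement (cm) of a rigid block on a slope.
    acc: ground acceleration series (m/s^2); dt: time step (s);
    ac: critical (yield) acceleration of the block (m/s^2)."""
    v = 0.0   # velocity of the block relative to the ground
    d = 0.0   # accumulated downslope displacement (m)
    for a in acc:
        if v > 0.0 or a > ac:        # block is sliding, or starts to slide
            v += (a - ac) * dt       # net relative acceleration
            v = max(v, 0.0)          # sliding cannot reverse direction
            d += v * dt
    return d * 100.0

def failure_probability(dn_cm):
    """Weibull-type P(failure) vs Newmark displacement Dn in cm,
    coefficients from the Northridge calibration (Jibson et al., 2000)."""
    return 0.335 * (1.0 - math.exp(-0.048 * dn_cm ** 1.565))
```

Run per grid cell, with `ac` derived from the cell's static factor of safety and slope angle and `acc` from the nearest strong-motion record, this yields the hazard map described in the abstract.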
Jibson, Randall W.; Harp, Edwin L.; Michael, John A.
1998-01-01
The 1994 Northridge, California, earthquake is the first earthquake for which we have all of the data sets needed to conduct a rigorous regional analysis of seismic slope instability. These data sets include (1) a comprehensive inventory of triggered landslides, (2) about 200 strong-motion records of the mainshock, (3) 1:24,000-scale geologic mapping of the region, (4) extensive data on engineering properties of geologic units, and (5) high-resolution digital elevation models of the topography. All of these data sets have been digitized and rasterized at 10-m grid spacing in the ARC/INFO GIS platform. Combining these data sets in a dynamic model based on Newmark's permanent-deformation (sliding-block) analysis yields estimates of coseismic landslide displacement in each grid cell from the Northridge earthquake. The modeled displacements are then compared with the digital inventory of landslides triggered by the Northridge earthquake to construct a probability curve relating predicted displacement to probability of failure. This probability function can be applied to predict and map the spatial variability in failure probability in any ground-shaking conditions of interest. We anticipate that this mapping procedure will be used to construct seismic landslide hazard maps that will assist in emergency preparedness planning and in making rational decisions regarding development and construction in areas susceptible to seismic slope failure.
U.S. Geological Survey's ShakeCast: A cloud-based future
Wald, David J.; Lin, Kuo-Wan; Turner, Loren; Bekiri, Nebi
2014-01-01
When an earthquake occurs, the U.S. Geological Survey (USGS) ShakeMap portrays the extent of potentially damaging shaking. In turn, the ShakeCast system, a freely available, post-earthquake situational awareness application, automatically retrieves earthquake shaking data from ShakeMap, compares intensity measures against users’ facilities, sends notifications of potential damage to responsible parties, and generates facility damage assessment maps and other web-based products for emergency managers and responders. ShakeCast is particularly suitable for earthquake planning and response purposes by Departments of Transportation (DOTs), critical facility and lifeline utilities, large businesses, engineering and financial services, and loss and risk modelers. Recent important developments to the ShakeCast system and its user base are described. The newly released Version 3 of the ShakeCast system encompasses advancements in seismology, earthquake engineering, and information technology applicable to the legacy ShakeCast installation (Version 2). In particular, this upgrade includes a full statistical fragility analysis framework for general assessment of structures as part of the near real-time system, direct access to additional earthquake-specific USGS products besides ShakeMap (PAGER, DYFI?, tectonic summary, etc.), significant improvements in the graphical user interface, including a console view for operations centers, and custom, user-defined hazard and loss modules. The release also introduces a new adaptation option to port ShakeCast to the "cloud". Employing Amazon Web Services (AWS), users now have a low-cost alternative to local hosting, by fully offloading hardware, software, and communication obligations to the cloud.
Other advantages of the "ShakeCast Cloud" strategy include (1) Reliability and robustness of offsite operations, (2) Scalability, naturally accommodated, (3) Serviceability, with problems reduced due to software and hardware uniformity, (4) Testability, freely available for new users, (5) Remote support, allowing expert-facilitated maintenance, (6) Adoptability, simplified with disk images, and (7) Security, built in at the very high level associated with AWS. The ShakeCast user base continues to expand and broaden. For example, Caltrans, the prototypical ShakeCast user and development supporter, has been providing guidance to other DOTs on the use of the National Bridge Inventory (NBI) database to implement fully functional ShakeCast systems in their states. A long-term goal underway is to further "connect the DOTs" via a Transportation Pooled Fund (TPF) with participating state DOTs. We also review some of the many other users and uses of ShakeCast. Lastly, on the hazard input front, we detail related ShakeMap improvements and ongoing advancements in estimating the likelihood of shaking-induced secondary hazards at structures, facilities, bridges, and along roadways due to landslides and liquefaction, as implemented within the ShakeCast framework.
Nonlinear Site Response Validation Studies Using KIK-net Strong Motion Data
NASA Astrophysics Data System (ADS)
Asimaki, D.; Shi, J.
2014-12-01
Earthquake simulations are nowadays producing realistic ground motion time series in the range of engineering design applications. Of particular significance to engineers are simulations of near-field motions and large-magnitude events, for which observations are scarce. With the engineering community slowly adopting the use of simulated ground motions, site response models need to be re-evaluated in terms of their capabilities and limitations to 'translate' the simulated time series from rock-surface output to structural-analysis input. In this talk, we evaluate three one-dimensional site response models: linear viscoelastic, equivalent linear, and nonlinear. We evaluate the performance of the models by comparing predictions to observations at 30 downhole stations of the Japanese network KiK-net that have recorded several strong events, including the 2011 Tohoku earthquake. Velocity profiles are used as the only input to all models, while additional parameters such as quality factor, density, and nonlinear dynamic soil properties are estimated from empirical correlations. We quantify the differences between ground surface predictions and observations in terms of both seismological and engineering intensity measures, including bias ratios of peak ground response, visual comparisons of elastic spectra, and inelastic-to-elastic deformation ratios for multiple ductility ratios. We observe that PGV/Vs,30, as a measure of strain, is a better predictor of site nonlinearity than PGA, and that incremental nonlinear analyses are necessary to produce reliable estimates of high-frequency ground motion components at soft sites. We finally discuss the implications of our findings for the parameterization of nonlinear amplification factors in GMPEs, and for the extensive use of equivalent linear analyses in probabilistic seismic hazard procedures.
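The PGV/Vs,30 strain proxy discussed above is simple to compute; a minimal sketch with illustrative (not site-specific) numbers:

```python
def strain_proxy(pgv_cm_s, vs30_m_s):
    """Shear-strain proxy gamma ~ PGV / Vs,30 (dimensionless);
    PGV given in cm/s is converted to m/s."""
    return (pgv_cm_s / 100.0) / vs30_m_s

# Same shaking level (PGV = 20 cm/s) at a soft and a stiff site:
soft = strain_proxy(20.0, 200.0)   # Vs,30 = 200 m/s
stiff = strain_proxy(20.0, 800.0)  # Vs,30 = 800 m/s
print(soft, stiff, soft / stiff)   # the soft site sees 4x the strain demand
```

The point of the proxy is visible immediately: for identical PGV, the soft site carries several times the strain demand of the stiff site, which is why it flags nonlinearity better than PGA alone.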
Petersen, Mark D.; Mueller, Charles; Moschetti, Morgan P.; Hoover, Susan M.; Llenos, Andrea L.; Ellsworth, William L.; Michael, Andrew J.; Rubinstein, Justin L.; McGarr, Arthur F.; Rukstales, Kenneth S.
2016-01-01
The U.S. Geological Survey (USGS) has produced a one-year (2016) probabilistic seismic-hazard assessment for the central and eastern United States (CEUS) that includes contributions from both induced and natural earthquakes, constructed with probabilistic methods using alternative data and inputs. This hazard assessment builds on our 2016 final model (Petersen et al., 2016) by adding sensitivity studies, illustrating hazard in new ways, incorporating new population data, and discussing potential improvements. The model considers short-term seismic activity rates (primarily 2014–2015) and assumes that the activity rates will remain stationary over short time intervals. The final model considers different ways of categorizing induced and natural earthquakes by incorporating two equally weighted earthquake-rate submodels that are composed of alternative earthquake inputs for catalog duration, smoothing parameters, maximum magnitudes, and ground-motion models. These alternatives represent uncertainties in how we calculate earthquake occurrence and the diversity of opinion within the science community. In this article, we also test sensitivity to the minimum moment magnitude between M 4 and M 4.7 and to the choice of applying a declustered catalog with b=1.0 rather than the full catalog with b=1.3. We incorporate two earthquake-rate submodels: in the informed submodel we classify earthquakes as induced or natural, and in the adaptive submodel we do not differentiate. Both alternative submodel hazard maps depict high hazard, and they are combined in the final model. Results depict several ground-shaking measures as well as intensity and include maps showing a high-hazard level (1% probability of exceedance in 1 year or greater). 
Ground motions reach 0.6g horizontal peak ground acceleration (PGA) in north-central Oklahoma and southern Kansas, and about 0.2g PGA in the Raton basin of Colorado and New Mexico, in central Arkansas, and in north-central Texas near Dallas–Fort Worth. The chance of experiencing ground motions corresponding to modified Mercalli intensity (MMI) VI or greater earthquake shaking is 2%–12% per year in north-central Oklahoma, southern Kansas, and New Madrid, similar to the chance of damage at sites in high-hazard portions of California caused by natural earthquakes. Hazard is also significant in the Raton basin of Colorado/New Mexico; north-central Arkansas; Dallas–Fort Worth, Texas; and a few other areas. Hazard probabilities are much lower (by about half or more) for exceeding MMI VII or VIII. Hazard is 3- to 10-fold higher near some areas of active induced earthquakes than in the 2014 USGS National Seismic Hazard Model (NSHM), which did not consider induced earthquakes. This study, in conjunction with the LandScan™ Database (2013), indicates that about 8 million people live in areas of active injection wells that have a greater than 1% chance of experiencing damaging ground shaking (MMI≥VI) in 2016. The final model has high uncertainty, and engineers, regulators, and industry should use these assessments cautiously to make informed decisions on mitigating the potential effects of induced and natural earthquakes.
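The "1% probability of exceedance in 1 year" framing and the equal weighting of the two rate submodels follow directly from Poisson hazard arithmetic; a minimal sketch, with purely illustrative rates, is:

```python
import math

def prob_exceed(rate_per_yr, t_years=1.0):
    """Poisson probability of at least one exceedance in t years."""
    return 1.0 - math.exp(-rate_per_yr * t_years)

def combine_rates(rates, weights):
    """Weighted combination of exceedance rates from alternative submodels."""
    return sum(r * w for r, w in zip(rates, weights))

# Hypothetical: the informed and adaptive submodels give different annual
# rates of MMI >= VI shaking at a site; each receives weight 0.5.
combined = combine_rates([0.08, 0.12], [0.5, 0.5])
print(round(combined, 3), round(prob_exceed(combined), 4))
```

Note that for small rates the annual exceedance probability is slightly below the rate itself, which is why one-year hazard products can quote the two almost interchangeably.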
NASA Astrophysics Data System (ADS)
Li, Linlin; Switzer, Adam D.; Wang, Yu; Chan, Chung-Han; Qiu, Qiang; Weiss, Robert
2017-04-01
Current tsunami inundation maps are commonly generated using deterministic scenarios, either for real-time forecasting or based on hypothetical "worst-case" events. Such maps are mainly used for emergency response and evacuation planning and do not include return-period information. In practice, however, probabilistic tsunami inundation maps are required in a wide variety of applications, such as land-use planning, engineering design, and insurance. In this study, we present a method to develop probabilistic tsunami inundation maps using a stochastic earthquake source model. To demonstrate the methodology, we take Macau, a coastal city on the South China Sea, as an example. Two major advances of this method are that it incorporates the most up-to-date information on seismic tsunamigenic sources along the Manila megathrust, and that it integrates a stochastic source model into a Monte Carlo-type simulation in which a broad range of slip distribution patterns is generated for large numbers of synthetic earthquake events. By aggregating the large number of inundation simulation results, we analyze the uncertainties associated with variability in earthquake rupture location and slip distribution. We also explore how tsunami hazard evolves in Macau in the context of sea level rise. Our results suggest Macau faces moderate tsunami risk due to its low-lying elevation, extensive land reclamation, high coastal population, and major infrastructure density. Macau consists of four districts: Macau Peninsula, Taipa Island, Coloane Island, and the Cotai strip. Of these, Macau Peninsula is the most vulnerable to tsunami due to its low elevation and its exposure to direct and refracted waves from the offshore region and reflected waves from the mainland. Earthquakes with magnitude larger than Mw 8.0 in the northern Manila trench would likely cause hazardous inundation in Macau. 
Using a stochastic source model, we are able to derive a spread of potential tsunami impacts for earthquakes of the same magnitude. The spread is caused by both random rupture locations and heterogeneous slip distributions. When the sea-level-rise component is added, the inundation depth caused by 1 m of sea level rise is equivalent to that caused by the 90th percentile of an ensemble of Mw 8.4 earthquakes.
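The ensemble statistic quoted above reduces to computing percentiles over many stochastic-source simulation results; a minimal sketch, with a random lognormal stand-in for the actual tsunami simulations, is:

```python
import random

def percentile(values, q):
    """Linearly interpolated q-th percentile (0 <= q <= 100)."""
    s = sorted(values)
    k = (len(s) - 1) * q / 100.0
    f = int(k)
    c = min(f + 1, len(s) - 1)
    return s[f] + (s[c] - s[f]) * (k - f)

random.seed(1)
# Hypothetical stand-in for the expensive tsunami runs: each stochastic
# Mw 8.4 slip realization yields one maximum inundation depth (m) at a site.
depths = [random.lognormvariate(0.0, 0.5) for _ in range(500)]
print(round(percentile(depths, 90), 2))  # 90th-percentile depth of the ensemble
```

In the study itself each ensemble member is a full hydrodynamic simulation; the percentile bookkeeping over the aggregated results is the same.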
Liu, Ya-hua; Yang, Hui-ning; Liu, Hui-liang; Wang, Fan; Hu, Li-bin; Zheng, Jing-chen
2013-05-01
To summarize and analyze the medical mission of the China National Earthquake Disaster Emergency Search and Rescue Team (CNESAR) in the Lushan earthquake, and to promote the effectiveness of medical rescue incorporated with search and rescue. Retrospective analysis of the medical work data recorded by CNESAR from April 21 to April 27, 2013, during the Lushan earthquake rescue, including medical staff dispatch and the wounded cases treated. The medical corps was composed of 22 members: 2 administrators; 11 doctors covering 11 specialties (emergency medicine, orthopedics of joints and limbs, spinal orthopedics, obstetrics and gynecology, gastroenterology, cardiology, ophthalmology, anesthesiology, medical rescue, health epidemic prevention, and clinical laboratory); 1 ultrasound technician; 5 nurses; 1 pharmacist; 1 medical instrument engineer; and 1 office worker for publicity. Two members held psychological consultant qualifications. The medical work was carried out in seven aspects: medical care assurance for the CNESAR members, first-aid cooperation with search and rescue on site, clinical work in the refugee camp, medical rounds for scattered villagers, evacuation of the wounded, mental health intervention, and sanitary and anti-epidemic work. The medical work covered 24 small towns, and medical staff established 3 clinics, at Taiping Town and Shuangshi Town of Lushan County and at Baoxing County. Medical rescue, mental health intervention for the elderly and children, and sanitary and anti-epidemic work were performed at these sites. The medical corps successfully evacuated 2 severely wounded patients and treated over a thousand wounded. Most of the wounded had soft tissue injuries, external injuries, respiratory tract infections, diarrhea, or heat stroke. 
Compared with the rescue operation in the 2008 Wenchuan earthquake, the assembly and departure of the rescue team in the Lushan earthquake, the traffic control in the disaster area, and the self-aid and buddy aid were all better, which helped keep casualties to a minimum. The medical mission incorporated with search and rescue showed that the manner of medical performance altered with the rescue stages, the medical staffing changed with the mission, and the focus shifted with rescue time.
NASA Astrophysics Data System (ADS)
Wilson, R. I.; Ramirez-Herrera, M. T.; Dengler, L. A.; Miller, K.; LaDuke, Y.
2017-12-01
The preliminary tsunami impacts from the September 7, 2017, M8.1 Tehuantepec Earthquake have been summarized in the following report: https://www.eeri.org/wp-content/uploads/EERI-Recon-Rpt-090717-Mexico-tsunami_fn.pdf. Although the tsunami impacts were not as significant as those from the earthquake itself (98 fatalities and 41,000 homes damaged), the following are highlights and lessons learned: The Tehuantepec earthquake was one of the largest down-slab normal faulting events ever recorded. This situation complicated the tsunami forecast, since forecast methods and pre-event modeling are primarily associated with megathrust earthquakes, where the most significant tsunamis are generated. Adding non-megathrust source modeling to the tsunami forecast databases of conventional warning systems should be considered. Offshore seismic and tsunami hazard analyses using past events should incorporate the potential for large earthquakes occurring along sources other than the megathrust boundary. From an engineering perspective, initial reports indicate there was only minor tsunami damage along the Mexico coast. There was damage at Marina Chiapas, where floating docks overtopped their piles. Increasing pile heights could reduce the potential for damage to floating docks. Tsunami warning notifications did not reach the public in time to assist with evacuation. Streamlining the messaging in Mexico from the warning system directly to the public should be considered, and, for local events, preparedness efforts should emphasize responding to feeling the earthquake rather than waiting to be notified. Although the U.S. tsunami warning centers were timely with their international and domestic messaging, there were some issues with how those messages were presented and interpreted. The use of a "Tsunami Threat" banner on the new main warning center website created confusion among emergency managers in the U.S. where no tsunami threat was expected to exist. Also, some U.S. 
states and territories in the Pacific were listed in both domestic and international messages, which caused confusion for American Samoa where these messages contained somewhat conflicting information. These issues are being addressed by the warning centers and the U.S. National Tsunami Hazard Mitigation Program.
IT Tools for Teachers and Scientists, Created by Undergraduate Researchers
NASA Astrophysics Data System (ADS)
Millar, A. Z.; Perry, S.
2007-12-01
Interns in the Southern California Earthquake Center/Undergraduate Studies in Earthquake Information Technology (SCEC/UseIT) program conduct computer science research for the benefit of earthquake scientists and have created products in growing use within the SCEC education and research communities. SCEC/UseIT comprises some twenty undergraduates who combine their varied talents and academic backgrounds to achieve a Grand Challenge that is formulated around the needs of SCEC scientists and educators and that reflects the value SCEC places on the integration of computer science and the geosciences. In meeting the challenge, students learn to work on multidisciplinary teams and to tackle complex problems with no guaranteed solutions. Meanwhile, their efforts bring fresh perspectives and insight to the professionals with whom they collaborate, and consistently produce innovative, useful tools for research and education. The 2007 Grand Challenge was to design and prototype serious games to communicate important earthquake science concepts. Interns organized themselves into four game teams, the Educational Game, the Training Game, the Mitigation Game and the Decision-Making Game, and created four diverse games with topics from elementary plate tectonics to earthquake risk mitigation, with intended players ranging from elementary students to city planners. The games were designed to be versatile, to accommodate variation in the knowledge base of the player, and extensible, to accommodate future additions. The games are played in a web browser or from within SCEC-VDO (Virtual Display of Objects). SCEC-VDO, also engineered by UseIT interns, is 4D, interactive visualization software that enables integration and exploration of datasets and models such as faults, earthquake hypocenters and ruptures, digital elevation models, satellite imagery, global isochrons, and earthquake prediction schemes. 
SCEC-VDO enables the user to create animated movies during a session, and is now part of a multi-media, general education curriculum at University of Southern California. Throughout this meeting, at the SCEC booth, UseIT interns will be demonstrating both the serious games and SCEC-VDO. SCEC/UseIT is a National Science Foundation Research Experience for Undergraduates site.
Using Geowall to Promote Undergraduate Research
NASA Astrophysics Data System (ADS)
Scec EIT Intern Team; Perry, S.; Jordan, T.
2003-12-01
The principal use of our Geowall system is to showcase the 3-D visualizations created by SCEC/EITR (Southern California Earthquake Center/Earthquake Information Technology Research) interns. These visualizations, called LA3D, are devised to educate the public, assist researchers, inspire students, and attract new interns. With the design criteria that LA3D code must be object-oriented and open-source, and that all datasets should be in internet-accessible databases, our interns have made interactive visualizations of southern California's earthquakes, faults, landforms, and other topographic features that allow unlimited additions of new datasets and map objects. The interns built our Geowall system, and made a unique contribution to the Geowall consortium when they devised a simple way to use Java3D to create and send images to the Geowall's projectors. The EIT interns are enormously proud of their accomplishments, and for most, working on LA3D has been the high point of their college careers. Their efforts have become central to testbed development of the system-level science that SCEC is orchestrating in its Community Modeling Environment. In addition, SCEC's Communication, Education and Outreach Program uses LA3D on the Geowall to communicate concepts about earthquakes and earthquake processes. Projected on the Geowall, LA3D also makes it easy to impress students from elementary to high school age with what can be accomplished if they keep learning math and science. Finally, we bring the Geowall to undergraduate research symposia and career-day open houses, to project LA3D and attract additional students to our intern program, which to date has united students in computer science, engineering, geoscience, mathematics, communication, pre-law, and cinema. (Note: distribution copies of LA3D will be available in early 2004.) 
The Southern California Earthquake Center Earthquake Information Technology Intern Team on this project: Adam Bongarzone, Hunter Francoeur, Lindsay Gordon, Nitin Gupta, Vipin Gupta, Jeff Hoeft, Shalini Jhatakia, Leonard Jimenez, Gideon Juve, Douglas Lam, Jed Link, Gavin Locke, Deepak Mehtani, Bill Paetzke, Nick Palmer, Brandee Pierce, Ryan Prose, Nitin Sharma, Ghunghroo Sinha, Jeremie Smith, Brandon Teel, Robert Weekly, Channing Wong, Jeremy Zechar.
Neo-deterministic definition of earthquake hazard scenarios: a multiscale application to India
NASA Astrophysics Data System (ADS)
Peresan, Antonella; Magrin, Andrea; Parvez, Imtiyaz A.; Rastogi, Bal K.; Vaccari, Franco; Cozzini, Stefano; Bisignano, Davide; Romanelli, Fabio; Panza, Giuliano F.; Ashish, Mr; Mir, Ramees R.
2014-05-01
The development of effective mitigation strategies requires scientifically consistent estimates of seismic ground motion; recent analysis, however, has shown that the performance of the classical probabilistic approach to seismic hazard assessment (PSHA) is very unsatisfactory in anticipating ground shaking from future large earthquakes. Moreover, due to their basic heuristic limitations, the standard PSHA estimates are by far unsuitable when dealing with the protection of critical structures (e.g. nuclear power plants) and cultural heritage, where it is necessary to consider extremely long time intervals. Nonetheless, the persistence in resorting to PSHA is often explained by the need to deal with uncertainties related to ground shaking and earthquake recurrence. We show that current computational resources and physical knowledge of the seismic wave generation and propagation processes, along with the improving quantity and quality of geophysical data, nowadays allow for viable numerical and analytical alternatives to the use of PSHA. The advanced approach considered in this study, namely NDSHA (neo-deterministic seismic hazard assessment), is based on the physically sound definition of a wide set of credible scenario events and accounts for uncertainties and earthquake recurrence in a substantially different way. The expected ground shaking due to a wide set of potential earthquakes is defined by means of full waveform modelling, based on the possibility of efficiently computing synthetic seismograms in complex, laterally heterogeneous anelastic media. In this way a set of ground motion scenarios can be defined at both the national and local scale, the latter considering the 2D and 3D heterogeneities of the medium travelled by the seismic waves. The efficiency of the NDSHA computational codes allows for the fast generation of hazard maps at the regional scale even on a modern laptop computer. 
At the scenario scale, quick parametric studies can easily be performed to understand the influence of the model characteristics on the computed ground shaking scenarios. For massive parametric tests, or for the repeated generation of large-scale hazard maps, the methodology can take advantage of more advanced computational platforms, ranging from GRID computing infrastructures to dedicated HPC clusters and Cloud computing. In such a way, scientists can deal efficiently with the variety and complexity of the potential earthquake sources and perform parametric studies to characterize the related uncertainties. NDSHA provides realistic time series of expected ground motion readily applicable for seismic engineering analysis and other mitigation actions. The methodology has been successfully applied to strategic buildings, lifelines and cultural heritage sites, and for the purpose of seismic microzoning in several urban areas worldwide. A web application is currently being developed that facilitates access to the NDSHA methodology and the related outputs for end-users who are interested in reliable territorial planning and in the design and construction of buildings and infrastructures in seismic areas. At the same time, the web application is also shaping up as an advanced educational tool to explore interactively how seismic waves are generated at the source, propagate inside structural models, and build up ground shaking scenarios. We illustrate the preliminary results obtained from a multiscale application of the NDSHA approach to the territory of India, zooming from large-scale hazard maps of ground shaking at bedrock to the definition of local-scale earthquake scenarios for selected sites in the Gujarat state (NW India). The study aims to provide the community (e.g. authorities and engineers) with advanced information for earthquake risk mitigation, which is particularly relevant to Gujarat in view of the rapid development and urbanization of the region.
NASA Astrophysics Data System (ADS)
Stein, R. S.
2012-12-01
The 2004 M=9.2 Sumatra earthquake claimed what seemed an unfathomable 228,000 lives, although because of its size, we could at least assure ourselves that it was an extremely rare event. But in the short space of 8 years, the Sumatra quake no longer looks like an anomaly, and it is no longer even the worst disaster of the century: 80,000 deaths in the 2005 M=7.6 Pakistan quake; 88,000 deaths in the 2008 M=7.9 Wenchuan, China quake; 316,000 deaths in the 2010 M=7.0 Haiti quake. In each case, poor design and construction were unable to withstand the ferocity of the shaken earth. And this was compounded by inadequate rescue, medical care, and shelter. How could the toll continue to mount despite the advances in our understanding of quake risk? The world's population is flowing into megacities, and many of these migration magnets lie astride the plate boundaries. Caught between these opposing demographic and seismic forces are 50 cities of at least 3 million people threatened by large earthquakes, the targets of chance. What we know for certain is that no one will take protective measures unless they are convinced they are at risk. Furnishing that knowledge is the animating principle of the Global Earthquake Model, launched in 2009. At the very least, everyone should be able to learn what his or her risk is. At the very least, our community owes the world an estimate of that risk. So, first and foremost, GEM seeks to raise quake risk awareness. We have no illusions that maps or models raise awareness; instead, earthquakes do. But when a quake strikes, people need a credible place to go to answer the question: how vulnerable am I, and what can I do about it? The Global Earthquake Model is being built with GEM's new open-source engine, OpenQuake. 
GEM is also assembling the global data sets without which we will never improve our understanding of where, how large, and how frequently earthquakes will strike, what impacts they will have, and how those impacts can be lessened by our actions. Using these global datasets will help to make the model as uniform as possible. The model must be built by scientists in the affected countries with GEM's support, augmented by their insights and data. The model will launch in 2014; to succeed it must be open, international, independent, and continuously tested. But the mission of GEM is not just estimating the likelihood of ground shaking, but also gauging the economic and social consequences of earthquakes, which greatly amplify the losses. For example, should the municipality of Istanbul retrofit schools, or increase its insurance reserves and recovery capacity? Should a homeowner in a high-risk area move or strengthen her building? This is why GEM is a public-private partnership. GEM's fourteen public sponsors and eight non-governmental organization members are standing for the developing world. To extend GEM into the financial world, we draw upon the expertise of companies. GEM's ten private sponsors have endorsed the acquisition of public knowledge over private gain. In a competitive world, this is a courageous act. GEM is but one link in a chain of preparedness: from earth science and engineering research, through groups like GEM, to mitigation, retrofit-or-relocate decisions, building codes and insurance, and finally to prepared hospitals, schools, and homes. But it is a link that our community can make strong.
Seismic hazard assessment: Issues and alternatives
Wang, Z.
2011-01-01
Seismic hazard and risk are two very important concepts in engineering design and other policy considerations. Although seismic hazard and risk have often been used interchangeably, they are fundamentally different. Furthermore, seismic risk is more important in engineering design and other policy considerations. Seismic hazard assessment is an effort by earth scientists to quantify seismic hazard and its associated uncertainty in time and space and to provide seismic hazard estimates for seismic risk assessment and other applications. Although seismic hazard assessment is more a scientific issue, it deserves special attention because of its significant implications for society. Two approaches, probabilistic seismic hazard analysis (PSHA) and deterministic seismic hazard analysis (DSHA), are commonly used for seismic hazard assessment. Although PSHA has been proclaimed as the best approach for seismic hazard assessment, it is scientifically flawed (i.e., the physics and mathematics that PSHA is based on are not valid). Use of PSHA could lead to either unsafe or overly conservative engineering design or public policy, each of which has dire consequences for society. On the other hand, DSHA is a viable approach for seismic hazard assessment even though it has been labeled as unreliable. The biggest drawback of DSHA is that the temporal characteristics (i.e., earthquake frequency of occurrence and the associated uncertainty) are often neglected. An alternative, seismic hazard analysis (SHA), utilizes earthquake science and statistics directly and provides a seismic hazard estimate that can be readily used for seismic risk assessment and other applications. © 2010 Springer Basel AG.
Out-of-plane (SH) soil-structure interaction: a shear wall with rigid and flexible ring foundation
NASA Astrophysics Data System (ADS)
Le, Thang; Lee, Vincent W.; Luo, Hao
2016-02-01
Soil-structure interaction (SSI) of a building and shear wall above a foundation in an elastic half-space has long been an important research subject for earthquake engineers and strong-motion seismologists. Numerous papers have been published since the early 1970s; however, very few of these papers have analytic closed-form solutions available. The soil-structure interaction problem is one of the most classic problems connecting the two disciplines of earthquake engineering and civil engineering. The interaction effect represents the mechanism of energy transfer and dissipation among the elements of the dynamic system, namely the soil subgrade, foundation, and superstructure. This interaction effect is important across many structure, foundation, and subgrade types but is most pronounced when a rigid superstructure is founded on a relatively soft lower foundation and subgrade. This effect may only be ignored when the subgrade is much harder than a flexible superstructure: for instance, a flexible moment-frame superstructure founded on a thin compacted soil layer on top of very stiff bedrock below. This paper studies the interaction effect of the subgrade and the superstructure. The analytical solution of the interaction of a shear wall, flexible-rigid foundation, and an elastic half-space is derived for incident SH waves with various angles of incidence. It is found that the flexible ring (soft layer) cannot be used as an isolation mechanism to decouple a superstructure from its substructure resting on a shaking half-space.
NASA Astrophysics Data System (ADS)
Kunz-Plapp, T.; Khazai, B.; Daniell, J. E.
2012-04-01
This paper presents a new method for modeling health impacts caused by earthquake damage which allows for integrating key social impacts on individual health and health-care systems and for implementing these impacts in quantitative systemic seismic vulnerability analysis. In current earthquake casualty estimation models, demand on health-care systems is estimated by quantifying the number of fatalities and severity of injuries based on empirical data correlating building damage with casualties. The expected number of injured people (sorted by priorities of emergency treatment) is combined with post-earthquake reduction of functionality of health-care facilities, such as hospitals, to estimate the impact on health-care systems. The aim here is to extend these models by developing a combined engineering and social science approach. Although social vulnerability is recognized as a key component of the consequences of disasters, social vulnerability as such is seldom linked to common formal and quantitative seismic loss estimates of injured people, which have a direct impact on emergency health care services. Yet there is a consensus that factors which affect vulnerability and post-earthquake health of at-risk populations include demographic characteristics such as age, education, occupation and employment, and that these factors can aggravate health impacts further. Similarly, there are different social influences on the performance of health care systems after an earthquake, both on an individual and on an institutional level. To link social impacts on health and health-care services to a systemic seismic vulnerability analysis, a conceptual model of the social impacts of earthquakes on health and the health care systems has been developed. We identified and tested appropriate social indicators for individual health impacts and for health care impacts based on literature research, using available European statistical data. 
The results will be used to develop a socio-physical model of systemic seismic vulnerability that enhances the further understanding of societal seismic risk by taking into account social vulnerability impacts for health and health-care system, shelter, and transportation.
Uncertainties in Earthquake Loss Analysis: A Case Study From Southern California
NASA Astrophysics Data System (ADS)
Mahdyiar, M.; Guin, J.
2005-12-01
Probabilistic earthquake hazard and loss analyses play important roles in many areas of risk management, including earthquake-related public policy and insurance ratemaking. Rigorous loss estimation for portfolios of properties is difficult since there are various types of uncertainties in all aspects of modeling and analysis. It is the objective of this study to investigate the sensitivity of earthquake loss estimation to uncertainties in regional seismicity, earthquake source parameters, ground motions, and sites' spatial correlation for typical property portfolios in Southern California. Southern California is an attractive region for such a study because it has a large population concentration exposed to significant levels of seismic hazard. During the last decade, there have been several comprehensive studies of most regional faults and seismogenic sources. There have also been detailed studies on regional ground motion attenuation and on regional and local site responses to ground motions. This information has been used by engineering seismologists to conduct regional seismic hazard and risk analysis on a routine basis. However, one of the more difficult tasks in such studies is the proper incorporation of uncertainties in the analysis. From the hazard side, there are uncertainties in the magnitudes, rates, and mechanisms of the seismic sources, local site conditions, and ground motion site amplifications. From the vulnerability side, there are considerable uncertainties in estimating the state of damage of buildings under different earthquake ground motions. From the analytical side, there are challenges in capturing the spatial correlation of ground motions and building damage, and in integrating thousands of loss distribution curves with different degrees of correlation. 
In this paper we propose to address some of these issues by conducting loss analyses of a typical small portfolio in southern California, taking into consideration various source and ground motion uncertainties. The approach is designed to integrate loss distribution functions with different degrees of correlation for portfolio analysis. The analysis is based on the USGS 2002 regional seismicity model.
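As an illustration of the correlation issue raised above, a toy Monte Carlo sketch (with entirely hypothetical sites, correlation length and vulnerability function) shows how spatial correlation of ground-motion residuals widens a portfolio's loss distribution:

```python
import numpy as np

# Toy sketch (hypothetical sites, correlation length and vulnerability):
# how spatial correlation of ground-motion residuals affects the loss
# distribution of a small property portfolio.

rng = np.random.default_rng(42)

# Site coordinates (km) of a 5-site portfolio.
xy = np.array([[0, 0], [5, 0], [10, 0], [0, 5], [5, 5]], dtype=float)
n = len(xy)

# Exponential spatial correlation model for within-event residuals.
corr_len = 10.0  # km, assumed correlation length
dist = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
C = np.exp(-dist / corr_len)

def portfolio_losses(corr, n_sims=20000):
    """Simulate mean portfolio loss ratios under a residual correlation matrix."""
    L = np.linalg.cholesky(corr)
    eps = rng.standard_normal((n_sims, n)) @ L.T  # correlated normal residuals
    im = np.exp(np.log(0.3) + 0.5 * eps)          # lognormal intensity, median 0.3 g
    loss_ratio = np.clip(im, 0.0, 1.0)            # crude linear vulnerability, capped
    return loss_ratio.mean(axis=1)

corr_losses = portfolio_losses(C)
indep_losses = portfolio_losses(np.eye(n))

# Correlation leaves the mean loss nearly unchanged but fattens the tail.
for name, x in (("correlated", corr_losses), ("independent", indep_losses)):
    print(f"{name}: mean={x.mean():.3f}  std={x.std():.3f}  "
          f"p99={np.percentile(x, 99):.3f}")
```

The correlated case has the heavier tail because residuals at nearby sites tend to move together, so extreme portfolio losses become more likely even though the expected loss is unchanged.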
Examining the Use of the Cloud for Seismic Data Centers
NASA Astrophysics Data System (ADS)
Yu, E.; Meisenhelter, S.; Clayton, R. W.
2011-12-01
The Southern California Earthquake Data Center (SCEDC) archives seismic and station sensor metadata related to earthquake activity in southern California. It currently archives nearly 8400 data streams continuously from over 420 stations in near real time, at a rate of 584 GB/month, into a repository approximately 18 TB in size. Triggered waveform data from an average of 12,000 earthquakes/year are also archived. Data are archived on mirrored disk arrays that are maintained and backed up locally. These data are served over the Internet to scientists and the general public in many countries. The data demand has a steady component, largely needed for ambient noise correlation studies, and an impulsive component driven by earthquake activity. Designing a reliable, cost-effective system architecture equipped to handle periods of relatively low steady demand punctuated by unpredictable sharp spikes in demand immediately following a felt earthquake remains a major challenge. To explore an alternative paradigm, we have put one month of the data in the "cloud" and have developed a user interface with Google App Engine. The purpose is to assess the modifications in data structures that are necessary to make searches efficient. To date we have determined that the database schema must be "denormalized" to take advantage of the dynamic computational capabilities, and that it is likely advantageous to preprocess the waveform data to remove overlaps, gaps, and other artifacts. The final purpose of this study is to compare the cost of the cloud with that of ground-based centers. The major motivations for this study are the security and dynamic load capabilities of the cloud. In the cloud, multiple copies of the data are held in distributed centers, thus eliminating the single point of failure associated with one center. The cloud can dynamically increase the level of computational resources during an earthquake, and the major tasks of managing a disk farm are eliminated.
The center can also be managed from anywhere and is not bound to a particular location.
Establishment of a Taiwan Marine cable hosted observatory (Ma-Cho project)
NASA Astrophysics Data System (ADS)
Lee, C.; Hsu, S.; Shin, T.
2006-12-01
Taiwan is located at a junction between the Philippine Sea Plate and the Eurasian Plate. Because of the active convergence, numerous earthquakes occur in and around Taiwan. On average, there are about two earthquakes greater than magnitude 6 each year, and 80% of earthquakes occur in the offshore area. Because of the subduction of the Philippine Sea Plate beneath the western end of the Ryukyu Arc and northern Taiwan, both the tectonics and the seismic activity are intense. The 2004 Sumatra earthquake induced a giant tsunami that struck the coastal countries of South Asia. Because Taiwan sits in a similar geodynamic context, the Sumatra event aroused the attention of the Taiwan government. Soon afterward, Taiwanese earth scientists and ocean engineers teamed up to discuss the potential and mitigation of natural hazards from the western end of the Ryukyu subduction zone, and the construction of a submarine cable observatory off eastern Taiwan (the Ma-Cho project) was proposed. Ma-Cho is the name of a sea goddess who protects people at sea. The purpose of the Ma-Cho project is severalfold. Firstly, extending the on-land seismic network to the offshore area can increase the resolution of earthquake locations. Secondly, the offshore stations may provide tens of seconds of warning before destructive seismic waves arrive on land, or tens of minutes before the arrival of a giant tsunami, which is helpful for earthquake or tsunami early warning. Thirdly, the seafloor scientific station can monitor the active volcanoes in the Okinawa Trough, which is directly adjacent to the Ilan plain in northeastern Taiwan. Fourthly, the seafloor observatory can be used to continuously study the Kuroshio Current off eastern Taiwan. The Ma-Cho project has been granted funding for its first year. From 2007, we will start with a submarine route survey and the construction of the submarine cable land station. The main submarine cable frame and the connection of scientific instruments to cable nodes will be finished in 2009.
NASA Astrophysics Data System (ADS)
Wen, Strong; Chang, Yi-Zen; Yeh, Yu-Lien; Wen, Yi-Ying
2017-04-01
Due to its complicated geomorphology and geological conditions, southwestern (SW) Taiwan suffers from various natural disasters, such as landslides and mud flows, and especially from the threat of strong earthquakes resulting from convergence between the Eurasian and Philippine Sea plates. Several disastrous earthquakes have occurred in this area and often caused serious damage. Therefore, it is fundamentally important to understand the correlation between seismic activity and seismogenic structures in SW Taiwan. Previous studies have indicated that, before rock strength fails, the behavior of micro-earthquakes can provide essential clues for investigating the process of rock deformation. Thus, monitoring micro-earthquake activity plays an important role in studying fault rupture or crustal deformation before the occurrence of a large earthquake. Because micro-earthquake activity can last for years, this phenomenon can be used to indicate changes in the physical properties of the crust, such as crustal stress changes or fluid migration. The main purpose of this research is to perform a nonlinear waveform inversion to investigate the source parameters of micro-earthquakes, including the non-double-couple components, since shear rupture is usually associated with complex morphology as well as tectonic fault systems. We applied a nonlinear waveform procedure to investigate the local stress state and the source parameters of micro-earthquakes that occurred in SW Taiwan. Previous studies have shown that microseismic fracture behavior is controlled by the non-double-couple components, which can lead to crack generation and fluid migration, in turn changing rock volume and producing partial compensation. Our results not only give a better understanding of the seismogenic structures in SW Taiwan, but also allow us to detect variations in physical parameters caused by cracks propagating in the strata.
Thus, the derived source parameters can serve as a detailed description of the physical state (such as fluid migration, fault geometry, and the pressure at the leading edge of the rupture) with which to investigate the characteristics of seismogenic structures more precisely. In addition, the regional stress field obtained in this study can also be used to test the tectonic models previously proposed for SW Taiwan, which will help to properly assess seismic hazard for major engineering construction projects in the urban area.
Green, R.A.; Obermeier, S.F.; Olson, S.M.
2005-01-01
The greatest impediments to the widespread acceptance of back-calculated ground motion characteristics from paleoliquefaction studies typically stem from three uncertainties: (1) the significance of changes in the geotechnical properties of post-liquefied sediments (e.g., "aging" and density changes), (2) the selection of appropriate geotechnical soil indices from individual paleoliquefaction sites, and (3) the methodology for integration of back-calculated results of strength of shaking from individual paleoliquefaction sites into a regional assessment of paleoseismic strength of shaking. Presented herein are two case studies that illustrate the methods outlined by Olson et al. [Engineering Geology, this issue] for addressing these uncertainties. The first case study is for a site near Memphis, Tennessee, wherein cone penetration test data from side-by-side locations, one of liquefaction and the other of no liquefaction, are used to readily discern that the influence of post-liquefaction "aging" and density changes on the measured in situ soil indices is minimal. In the second case study, 12 sites that are at scattered locations in the Wabash Valley and that exhibit paleoliquefaction features are analyzed. The features are first provisionally attributed to the Vincennes Earthquake, which occurred around 6100 years BP, and are used to illustrate our proposed approach for selecting representative soil indices of the liquefied sediments. These indices are used in back-calculating the strength of shaking at the individual sites, the results from which are then incorporated into a regional assessment of the moment magnitude, M, of the Vincennes Earthquake. The regional assessment validated the provisional assumption that the paleoliquefaction features at the scattered sites were, in the main, induced by the Vincennes Earthquake, which was determined to have M ≈ 7.5. The uncertainties and assumptions used in the assessment are discussed in detail. © 2004 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Cakir, R.; Walsh, T. J.; Norman, D. K.
2017-12-01
We, the Washington Geological Survey (WGS), have been performing multi-method near-surface geophysical surveys to help assess potential earthquake damage at public schools in Washington. We have been conducting active and passive seismic surveys, estimating shear-wave velocity (Vs) profiles, and then determining the NEHRP soil classifications based on Vs30m values at school sites in Washington. The survey methods we have used include 1D and 2D MASW and MAM, P- and S-wave refraction, horizontal-to-vertical spectral ratio (H/V), and 2ST-SPAC, to measure Vs and Vp at shallow (0-70 m) and greater depths at the sites. We have also run Ground Penetrating Radar (GPR) surveys at the sites to check for possible horizontal subsurface variations along and between the seismic survey lines and the actual locations of the school buildings. The seismic survey results were then used to calculate Vs30m for determining the NEHRP soil classifications at school sites, and thus the soil amplification effects on ground motions. The resulting shear-wave velocity profiles can also be used for site response and liquefaction potential studies, as well as for efforts to improve the national Vs30m database, essential information for ShakeMap and ground motion modeling efforts in Washington and the Pacific Northwest. To estimate casualties and nonstructural and structural losses caused by potential earthquakes in the region, we used these seismic site characterization results together with structural engineering evaluations based on ASCE 41 or FEMA 154 (Rapid Visual Screening) as inputs to FEMA Hazus Advanced Engineering Building Module (AEBM) analysis. Compelling example surveys will be presented for school sites in western and eastern Washington.
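The Vs30 and NEHRP classification step described above can be sketched in a few lines; the layer profile below is illustrative, and the class boundaries are the standard NEHRP Vs30 limits.

```python
# Hedged sketch: time-averaged shear-wave velocity over the top 30 m (Vs30)
# and the corresponding NEHRP site class. The layer profile is illustrative.

def vs30(thicknesses_m, velocities_ms):
    """Vs30 = 30 m divided by the total S-wave travel time through the top 30 m."""
    travel, depth = 0.0, 0.0
    for h, v in zip(thicknesses_m, velocities_ms):
        use = min(h, 30.0 - depth)          # clip the profile at 30 m depth
        travel += use / v
        depth += use
        if depth >= 30.0:
            break
    if depth < 30.0:                        # extend the last layer if profile is short
        travel += (30.0 - depth) / velocities_ms[-1]
    return 30.0 / travel

def nehrp_class(v):
    """NEHRP site class from Vs30 (m/s)."""
    if v > 1500: return "A"
    if v > 760:  return "B"
    if v > 360:  return "C"
    if v > 180:  return "D"
    return "E"

# Example profile: 5 m of 150 m/s fill over 10 m of 300 m/s sand over stiffer soil.
v30 = vs30([5.0, 10.0, 40.0], [150.0, 300.0, 500.0])
print(f"Vs30 = {v30:.0f} m/s, NEHRP class {nehrp_class(v30)}")
```

Note that Vs30 is a harmonic (travel-time) average, so slow surface layers dominate the result even when deeper layers are much faster.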
GIA induced intraplate seismicity in northern Central Europe
NASA Astrophysics Data System (ADS)
Brandes, Christian; Steffen, Holger; Steffen, Rebekka; Wu, Patrick
2015-04-01
Though northern Central Europe is regarded as a low-seismicity area (Leydecker and Kopera, 1999), several historic earthquakes with intensities of up to VII have affected the area in the last 1200 years (Leydecker, 2011). The trigger for these seismic events has not yet been sufficiently investigated. Based on the combination of historic earthquake epicentres with the most recent fault maps, we show that the historic seismicity concentrated at major reverse faults. There is no evidence for significant historic earthquakes along normal faults in northern Central Europe. The spatial and temporal distribution of earthquakes (clusters that shift from time to time) implies that northern Central Europe behaves like a typical intraplate tectonic region, as demonstrated for other intraplate settings (Liu et al., 2011). We utilized finite element models that describe the process of glacial isostatic adjustment to analyse the fault behaviour. We use the change in Coulomb Failure Stress (dCFS) to represent the minimum stress required to reach faulting. A negative dCFS value indicates that the fault is stable, while a positive value means that GIA stress is potentially available to induce faulting or cause fault instability or failure unless released temporarily by an earthquake. The results imply that many faults in Central Europe are postglacial faults, though they developed outside the glaciated area. This is supported by the characteristics of the dCFS graphs, which indicate the likelihood that an earthquake is related to GIA. Almost all graphs show a change from negative to positive values during the deglaciation phase. This observation sheds new light on the distribution of post-glacial faults in general. Based on field data and the numerical simulations, we developed the first consistent model that can explain the occurrence of deglaciation seismicity and more recent historic earthquakes in northern Central Europe.
Based on our model, the historic seismicity in northern Central Europe can be regarded as a kind of aftershock sequence of the GIA-induced seismicity. References: Leydecker, G. and Kopera, J.R. Seismological hazard assessment for a site in Northern Germany, an area of low seismicity. Engineering Geology 52, 293-304 (1999). Leydecker, G. Erdbebenkatalog für die Bundesrepublik Deutschland mit Randgebieten für die Jahre 800-2008 [Earthquake catalogue for the Federal Republic of Germany and adjacent areas for the years 800-2008]. Geologisches Jahrbuch Reihe E, 198 pp. (2011). Liu, M., Stein, S. and Wang, H. 2000 years of migrating earthquakes in north China: How earthquakes in midcontinents differ from those at plate boundaries. Lithosphere 3, 128-132 (2011).
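The dCFS stability criterion used above can be sketched as follows; the stress changes and effective friction coefficient are illustrative assumptions, not outputs of the authors' finite element models.

```python
# Minimal sketch of the Coulomb Failure Stress change (dCFS) criterion:
# positive dCFS promotes failure, negative indicates stability. Values and
# the effective friction coefficient are illustrative, not model outputs.
# Normal stress is taken tension-positive, so unclamping (reduced
# compression) gives a positive d_sigma_n.

def dcfs(d_shear_mpa, d_normal_mpa, mu_eff=0.6):
    """dCFS = d_tau + mu' * d_sigma_n (MPa, tension-positive normal stress)."""
    return d_shear_mpa + mu_eff * d_normal_mpa

# Under the ice load the fault is clamped (extra compression): stable.
glacial = dcfs(d_shear_mpa=0.5, d_normal_mpa=-2.0)
# After deglaciation, rebound stresses unclamp the fault: failure promoted.
postglacial = dcfs(d_shear_mpa=1.5, d_normal_mpa=0.5)

print(f"glacial dCFS = {glacial:+.2f} MPa (stable)")
print(f"postglacial dCFS = {postglacial:+.2f} MPa (promotes failure)")
```

The sign change from negative to positive across deglaciation mirrors the behaviour the abstract describes for the dCFS graphs.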
Updating the Seismic Hazard Determination in southeastern Brazil
NASA Astrophysics Data System (ADS)
Franca, G. S.; Algarte, K. T.
2012-12-01
This work presents an update of the research by Berrocal in 1996 on the determination of seismic hazard for southeastern Brazil, based on the earthquake catalog compiled at the Instituto de Astronomia e Geofisica, Universidade de Sao Paulo, and the bulletin of the Seismological Observatory, Universidade de Brasilia, covering the period from 1767 until May 2012. Southeastern Brazil has a level of seismic activity that is considered low, typical of intraplate regions. Our database has a total of 3726 events; however, 1242 events do not have an estimated magnitude, 1638 events have magnitudes between 0.1 and 1.9, and 819 events have magnitudes from 2.0 to 3.9. The largest earthquake in the region occurred on February 28, 1955, with magnitude 6.1 mb (Assumpção, 2000), with its epicenter about 400 km from the coast; it was felt in small cities, especially in Espirito Santo State. An intensity of VIII-IX MM was estimated by Berrocal et al. (1984). The database also has four events with magnitude above 5.0 mb that occurred in the region during the past 215 years, and a little more than twenty earthquakes with magnitude between 4.0 and 5.0 mb. Instrumental data are available since the 1970s, when the station network was installed in Brasilia. Several other short-period vertical stations have been installed in the region. We used data from the same area defined in the previous survey, located between parallels 15S-32S and longitudes 35W-52W. It contains the most developed area of Brazil, including the major cities and industrial centers of the country (São Paulo, Rio de Janeiro and Belo Horizonte). Major engineering works, hydroelectric plants and a nuclear power plant (Angra dos Reis) are also in this area. Therefore, the results can be applied to the planning and construction of large engineering works within that region.
Using GIS and seismological tools, we calculated the frequency-magnitude relation for earthquakes mb > 3.0, estimated the b-value with the maximum-likelihood method, and from the resulting recurrence curves estimated the seismic hazard and seismic attenuation in the region.
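The b-value estimation step can be sketched with the standard Aki (1965) maximum-likelihood formula; the catalogue below is synthetic, and the completeness magnitude is an illustrative assumption, not the study's value.

```python
import math
import random

# Hedged sketch of the maximum-likelihood b-value estimate (Aki, 1965; with
# Utsu's binning correction) used in frequency-magnitude analysis.
# The catalogue below is synthetic, not the Brazilian catalogue.

def b_value_mle(mags, m_c, dm=0.0):
    """b = log10(e) / (mean(M) - (Mc - dm/2)) for M >= Mc;
    dm is the magnitude binning width (0 for unbinned magnitudes)."""
    m = [x for x in mags if x >= m_c]
    mean_m = sum(m) / len(m)
    return math.log10(math.e) / (mean_m - (m_c - dm / 2.0))

# Draw a synthetic catalogue from a Gutenberg-Richter law with b = 1.0
# above a completeness magnitude Mc = 3.0 (exponential magnitudes).
random.seed(0)
beta = 1.0 * math.log(10.0)
catalogue = [3.0 + random.expovariate(beta) for _ in range(5000)]

b = b_value_mle(catalogue, m_c=3.0)
print(f"estimated b-value: {b:.2f}")
```

With 5000 events the estimator recovers the input b = 1.0 to within a few percent, which is why catalogue completeness (the choice of Mc) matters so much in practice.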
Site correction of stochastic simulation in southwestern Taiwan
NASA Astrophysics Data System (ADS)
Lun Huang, Cong; Wen, Kuo Liang; Huang, Jyun Yan
2014-05-01
The peak ground acceleration (PGA) of a disastrous earthquake is of concern in both civil engineering and seismology. Presently, ground motion prediction equations are widely used by engineers for PGA estimation. However, the local site effect is another important factor in strong motion prediction. For example, in 1985 Mexico City, 400 km from the epicenter, suffered massive damage due to seismic wave amplification by the local alluvial layers (Anderson et al., 1986). Past studies have used the stochastic method and shown good performance in simulating ground motion at rock sites (Beresnev and Atkinson, 1998a; Roumelioti and Beresnev, 2003). In this study, site correction was conducted using the empirical transfer function relative to the rock-site response from stochastic point-source (Boore, 2005) and finite-fault (Boore, 2009) methods. The errors between the simulated and observed Fourier spectra and PGA are calculated. We further compared the estimated PGA with the result calculated from a ground motion prediction equation. The earthquake data used in this study were recorded by the Taiwan Strong Motion Instrumentation Program (TSMIP) from 1991 to 2012; the study area is located in southwestern Taiwan. The empirical transfer function was generated by calculating the spectral ratio between an alluvial site and a rock site (Borcherdt, 1970). Due to the lack of a reference rock-site station in this area, the rock-site ground motion was generated with a stochastic point-source model instead. Several target events were then chosen for stochastic point-source simulation of the half-space response. The empirical transfer function for each station was then multiplied by the simulated half-space response. Finally, we focused on two target events: the 1999 Chi-Chi earthquake (Mw=7.6) and the 2010 Jiashian earthquake (Mw=6.4).
Considering that a large event may involve a complex rupture mechanism, the asperity and delay time of each sub-fault must be considered. Both the stochastic point-source and the finite-fault model were used to check the result of our correction.
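The site-correction chain described above (an empirical transfer function from a soil/rock spectral ratio, then multiplication onto a simulated rock-site spectrum) can be sketched as follows; the waveforms are synthetic stand-ins, not TSMIP records, and the 2-4 Hz "site response" is invented for illustration.

```python
import numpy as np

# Synthetic sketch of the site-correction chain: build an empirical transfer
# function (ETF) as the soil/rock spectral ratio (Borcherdt, 1970), then
# multiply it onto a simulated rock-site spectrum. The records here are
# random stand-ins; the 2-4 Hz amplification is invented.

fs = 100.0                                   # sampling rate (Hz)
t = np.arange(0.0, 20.0, 1.0 / fs)
rng = np.random.default_rng(1)

rock = rng.standard_normal(t.size)           # stand-in rock-site record
spec = np.fft.rfft(rock)
f = np.fft.rfftfreq(t.size, 1.0 / fs)

# Fake alluvial response: amplify the 2-4 Hz band by a factor of 3.
gain = np.where((f >= 2.0) & (f <= 4.0), 3.0, 1.0)
soil = np.fft.irfft(spec * gain, n=t.size)   # stand-in alluvial-site record

# Empirical transfer function, lightly smoothed to stabilise the ratio.
ratio = np.abs(np.fft.rfft(soil)) / np.abs(np.fft.rfft(rock))
etf = np.convolve(ratio, np.ones(11) / 11, mode="same")

# Apply the ETF to a simulated half-space (rock) spectrum for site correction.
sim_rock_spec = np.fft.rfft(rng.standard_normal(t.size))
sim_soil = np.fft.irfft(sim_rock_spec * etf, n=t.size)

band = (f >= 2.5) & (f <= 3.5)
print(f"mean ETF in the 2.5-3.5 Hz band: {etf[band].mean():.2f}")
```

In practice the ratio is averaged over many events, and smoothing is essential because single-event spectral ratios are unstable at frequencies where the reference spectrum is small.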
The Global Earthquake Model and Disaster Risk Reduction
NASA Astrophysics Data System (ADS)
Smolka, A. J.
2015-12-01
Advanced, reliable and transparent tools and data to assess earthquake risk are inaccessible to most, especially in less developed regions of the world, while few, if any, globally accepted standards currently allow a meaningful comparison of risk between places. The Global Earthquake Model (GEM) is a collaborative effort that aims to provide models, datasets and state-of-the-art tools for transparent assessment of earthquake hazard and risk. As part of this goal, GEM and its global network of collaborators have developed the OpenQuake engine (an open-source software for hazard and risk calculations), the OpenQuake platform (a web-based portal making GEM's resources and datasets freely available to all potential users), and a suite of tools to support modelers and other experts in the development of hazard, exposure and vulnerability models. These resources are being used extensively across the world in hazard and risk assessment, from individual practitioners to local and national institutions, and in regional projects to inform disaster risk reduction. Practical examples of how GEM is bridging the gap between science and disaster risk reduction are: - Several countries, including Switzerland, Turkey, Italy, Ecuador, Papua New Guinea and Taiwan (with more to follow), are computing national seismic hazard using the OpenQuake engine. In some cases these results are used for the definition of actions in building codes. - Technical support, tools and data for the development of hazard, exposure, vulnerability and risk models for regional projects in South America and Sub-Saharan Africa. - Going beyond physical risk, GEM's scorecard approach evaluates local resilience by bringing together neighborhood/community leaders and the risk reduction community as a basis for designing risk reduction programs at various levels of geography. Actual case studies are Lalitpur in the Kathmandu Valley in Nepal, and Quito, Ecuador.
In agreement with GEM's collaborative approach, all projects are undertaken with strong involvement of local scientific and risk reduction communities. Open-source software and careful documentation of the methodologies create full transparency of the modelling process, so that results can be reproduced any time by third parties.
NASA Astrophysics Data System (ADS)
Peruzza, Laura; Azzaro, Raffaele; Gee, Robin; D'Amico, Salvatore; Langer, Horst; Lombardo, Giuseppe; Pace, Bruno; Pagani, Marco; Panzera, Francesco; Ordaz, Mario; Suarez, Miguel Leonardo; Tusa, Giuseppina
2017-11-01
This paper describes the model implementation and presents results of a probabilistic seismic hazard assessment (PSHA) for the Mt. Etna volcanic region in Sicily, Italy, considering local volcano-tectonic earthquakes. Working in a volcanic region presents new challenges not typically faced in standard PSHA, which are broadly due to the nature of the local volcano-tectonic earthquakes, the cone shape of the volcano and the attenuation properties of seismic waves in the volcanic region. These have been accounted for through the development of a seismic source model that integrates data from different disciplines (historical and instrumental earthquake datasets, tectonic data, etc.; presented in Part 1, by Azzaro et al., 2017) and through the development and software implementation of original tools for the computation, such as a new ground-motion prediction equation and magnitude-scaling relationship specifically derived for this volcanic area, and the capability to account for the surficial topography in the hazard calculation, which influences source-to-site distances. Hazard calculations have been carried out after updating the most recent releases of two widely used PSHA software packages (CRISIS, as in Ordaz et al., 2013; the OpenQuake engine, as in Pagani et al., 2014). Results are computed for short- to mid-term exposure times (10% probability of exceedance in 5 and 30 years, Poisson and time-dependent) and spectral amplitudes of engineering interest. A preliminary exploration of the impact of site-specific response is also presented for the densely inhabited eastern flank of Etna, and the resulting change in expected ground motion is commented on. These results do not account for M > 6 regional seismogenic sources, which control the hazard at long return periods.
However, by focusing on the impact of M < 6 local volcano-tectonic earthquakes, which dominate the hazard at the short- to mid-term exposure times considered in this study, we present a different viewpoint that, in our opinion, is relevant for retrofitting existing buildings and for guiding upcoming risk-reduction interventions.
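As a small worked example of the exposure times quoted above, the "10% probability of exceedance in t years" of Poissonian PSHA maps onto an equivalent return period T = -t / ln(1 - p):

```python
import math

# Worked example of the Poisson relation behind the quoted exposure times:
# an exceedance probability p in t years corresponds to a return period
# T = -t / ln(1 - p).

def return_period(p_exceed, t_years):
    """Equivalent return period (years), assuming Poissonian occurrence."""
    return -t_years / math.log(1.0 - p_exceed)

for t in (5, 30):
    print(f"10% in {t} yr -> T = {return_period(0.10, t):.0f} yr")
```

So 10% in 5 years corresponds to a return period of roughly 47 years, and 10% in 30 years to roughly 285 years, far shorter than the 475-year return period of the conventional 10%-in-50-years design level.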
Global Dynamic Exposure and the OpenBuildingMap
NASA Astrophysics Data System (ADS)
Schorlemmer, D.; Beutin, T.; Hirata, N.; Hao, K. X.; Wyss, M.; Cotton, F.; Prehn, K.
2015-12-01
Detailed understanding of local risk factors regarding natural catastrophes requires in-depth characterization of the local exposure. Current exposure capture techniques have to find a balance between resolution and coverage. We aim at bridging this gap by employing a crowd-sourced approach to exposure capturing, focusing on risk related to earthquake hazard. OpenStreetMap (OSM), the rich and constantly growing geographical database, is an ideal foundation for us. More than 2.5 billion geographical nodes, more than 150 million building footprints (growing by ~100,000 per day), and a plethora of information about school, hospital, and other critical facility locations allow us to exploit this dataset for risk-related computations. We will harvest this dataset by collecting exposure and vulnerability indicators from explicitly provided data (e.g. hospital locations), implicitly provided data (e.g. building shapes and positions), and semantically derived data, i.e. interpretation applying expert knowledge. With this approach, we can increase the resolution of existing exposure models from fragility-class distributions via block-by-block specifications to building-by-building vulnerability. To increase coverage, we will provide a framework for collecting building data by any person or community. We will implement a double crowd-sourced approach to bring together the interest and enthusiasm of communities with the knowledge of earthquake and engineering experts. The first crowd-sourced approach aims at collecting building properties in a community by local people and activists. This will be supported by tailored building-capture tools for mobile devices for simple and fast capture of building properties.
The second crowd-sourced approach involves local experts in estimating building vulnerability that will provide building classification rules that translate building properties into vulnerability and exposure indicators as defined in the Building Taxonomy 2.0 developed by the Global Earthquake Model (GEM). These indicators will then be combined with a hazard model using the GEM OpenQuake engine to compute a risk model. The free/open framework we will provide can be used on commodity hardware for local to regional exposure capturing and for communities to understand their earthquake risk.
Strong Motion Recording in the United States
NASA Astrophysics Data System (ADS)
Archuleta, R. J.; Fletcher, J. B.; Shakal, A. F.
2014-12-01
The United States strong motion program began in 1932 when the Coast and Geodetic Survey (C&GS) installed eight strong motion accelerographs in California. During the March 1933 Long Beach earthquake, three of these produced the first strong motion records. With this success the C&GS expanded the number of accelerographs to 71 by 1964. With development of less expensive, mass-produced accelerographs the number of strong motion accelerographs expanded to ~575 by 1972. Responsibilities for operating the network and disseminating data were transferred to the National Oceanic and Atmospheric Administration in 1970 and then to the U.S. Geological Survey in 1973. In 1972 the California Legislature established the California Strong Motion Instrumentation Program (CSMIP). CSMIP operates accelerographs at 812 ground stations, with multi-channel accelerographs in 228 buildings, 125 lifelines and 37 geotechnical arrays, in California. The USGS and the ANSS effort operate accelerographs at 1584 ground stations, 96 buildings, 14 bridges, 70 dams, and 15 multi-channel geotechnical arrays. The USC Los Angeles array has 78 ground stations; UCSB operates 5 geotechnical arrays; other government and private institutions also operate accelerographs. Almost all accelerographs are now digital with a sampling rate of 200 Hz. Most of the strong motion data can be downloaded from the Center for Engineering Strong Motion Data (http://strongmotioncenter.org). As accelerographs have become more sophisticated, the concept of what constitutes strong motion has blurred because small earthquakes (M ~3) are well recorded on accelerometers as well as seismometers. However, when accelerations are over ~10%g and velocities over ~1 cm/s, the accelerometers remain on scale, providing the unclipped data necessary to analyze the ground motion and its consequences. 
Strong motion data are essential to the development of ground motion prediction equations, understanding structural response, performance-based engineering, soil response, and inversions for earthquake rupture parameters. While a substantial number of stations have been installed, many areas of the US remain significantly deficient; e.g., recordings were obtained from only 2 stations within 60 km of the Mineral earthquake that damaged the nation's capital and other areas.
NASA Astrophysics Data System (ADS)
Karabulut, Savaş
2018-03-01
The study area is located in the northern part of Izmir, western Turkey, which is subject to an active extensional tectonic regime and includes typical features of sedimentary basins: horsts and grabens surrounded by a series of normal and strike-slip faults. In September 1939 the Dikili (Kabakum) earthquake, with a magnitude of Mw 6.6, occurred, and afterwards residents moved from the west of Dikili to the east (i.e., from soft sediments to relatively firm rock areas). A proper estimate of the earthquake-related hazard for the area is the main objective of this study. The site effects and soil engineering problems involved in estimating hazard parameters at the soil surface need to be carefully analyzed for seismic site classification and for geo-engineering problems like soil liquefaction, soil settlement, soil bearing capacity and soil amplification. To address these static and dynamic soil problems, shear-wave velocities have been used in a joint interpretation: Multichannel Analysis of Surface Waves (MASW) and Refraction Microtremor (ReMi) analyses were conducted at 121 sites on a 300 × 300 m grid covering an area of 60 km2. Considering the Gutenberg-Richter model, the probability of an earthquake with a magnitude of Mw 6 occurring within 10 years is estimated at 64%, which puts the region under significant earthquake risk. The estimated Vs30 values are ≤180 m/s in the central and northernmost parts of the study area, indicating a type E soil in the NEHRP classification, where alluvial deposits are dominant. Vs30 values in the north and central parts between 180 and 360 m/s suggest a type D soil. In the southernmost part of the study area, where volcanic rocks are widely distributed, Vs30 values range between 360 and 908 m/s, corresponding to type C and type B soils.
The results show that liquefaction-induced settlement and soil amplification are the most important problems in the south and the northernmost part of the study area, which is densely populated and encompasses the urbanized part of the study region.
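The quoted 64% probability of a Mw ≥ 6 event in 10 years follows from combining a Gutenberg-Richter recurrence rate with the Poisson occurrence model; the a- and b-values below are illustrative assumptions chosen to reproduce a similar rate, not the study's fitted values.

```python
import math

# Sketch of how a Gutenberg-Richter recurrence rate maps onto a Poisson
# probability of a Mw >= 6 event within 10 years. The a- and b-values are
# illustrative assumptions tuned to give a similar rate, not fitted values.

def annual_rate(a, b, m):
    """Gutenberg-Richter: log10 N(>=m) = a - b*m, so N = 10**(a - b*m) per year."""
    return 10.0 ** (a - b * m)

def prob_at_least_one(rate_per_year, t_years):
    """Poisson probability of at least one event in t years."""
    return 1.0 - math.exp(-rate_per_year * t_years)

rate = annual_rate(a=4.41, b=0.9, m=6.0)      # ~0.102 events/yr (assumed a, b)
p10 = prob_at_least_one(rate, 10.0)
print(f"annual rate of Mw >= 6: {rate:.3f}/yr, P(>=1 in 10 yr) = {p10:.0%}")
```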
NASA Astrophysics Data System (ADS)
Marmureanu, Gheorghe; Ortanza Cioflan, Carmen; Marmureanu, Alexandru
2010-05-01
Nonlinear effects in ground motion during large earthquakes have long been a controversial issue between seismologists and geotechnical engineers. Aki wrote in 1993: "Nonlinear amplification at sediment sites appears to be more pervasive than seismologists used to think… Any attempt at seismic zonation must take into account the local site condition and this nonlinear amplification" (Local site effects on weak and strong ground motion, Tectonophysics, 218, 93-111). In other words, seismological detection of nonlinear site effects requires a simultaneous understanding of the effects of the earthquake source, the propagation path, and the local geological site conditions. The difficulty for seismologists in demonstrating nonlinear site effects has been that the effect is overshadowed by the overall patterns of shock generation and path propagation. To provide quantitative evidence of large nonlinear effects, researchers from the National Institute for Earth Physics introduced the spectral amplification factor (SAF), defined as the ratio between the maximum spectral absolute acceleration (Sa), relative velocity (Sv), or relative displacement (Sd) from response spectra at a given fraction of critical damping and fundamental period, and the corresponding peak values of acceleration (a-max), velocity (v-max), and displacement (d-max) from the processed strong-motion record; they pointed out that there is a strong nonlinear dependence on earthquake magnitude and site conditions. The spectral amplification factors (SAF) are computed for absolute accelerations at a 5% fraction of critical damping (β = 5%) at five seismic stations: Bucharest-INCERC (soft soil, Quaternary layers with a total thickness of 800 m); Bucharest-Magurele (dense sand and loess over 350 m); the Cernavoda Nuclear Power Plant site (marl, loess, and limestone over 270 m); Bacau (gravel and loess over 20 m); and Iassy (loess, sand, clay, and gravel over 60 m), for the last strong, deep Vrancea earthquakes: March 4, 1977 (MGR = 7.2, h = 95 km); August 30, 1986 (MGR = 7.0, h = 130 km); May 30, 1990 (MGR = 6.7, h = 90 km); and May 31, 1990 (MGR = 6.1, h = 87 km).
With a view to understanding the characteristics of nonlinear soil behavior, its role in seismology, and its influence on hazard and risk assessment, this study examined the ways in which nonlinearity would be expected to appear in strong-motion records made on Romanian territory during the last Vrancea earthquakes. The effect of nonlinearity is very large. For example, if we maintain the same amplification factor (SAF = 5.8942) as for the relatively strong earthquake of May 31, 1990 (Ms = 6.1), then at the Bacau seismic station the peak acceleration for the earthquake of May 30, 1990 (MGR = 6.7) would have to be a*max = 0.154g (+14.16%), whereas the actual recorded value was only a-max = 0.135g. Likewise, for the Vrancea earthquake of August 30, 1986, the peak acceleration would have to be a*max = 0.107g (+45.57%), instead of the real value of 0.0736g recorded at the Bacau station. Moreover, the spectral amplification factors (SAF) are a function of earthquake magnitude, with a strongly nonlinear dependence. The median SAF values of the last strong Vrancea earthquakes for 5% damping are 4.16, 3.63, and 3.26, corresponding to the May 31, 1990 earthquake (Ms = 6.1), the May 30, 1990 earthquake (Ms = 6.7), and the August 30, 1986 earthquake (Ms = 7.0), respectively. At a single seismic station, for example Bacau, the 5%-damping SAF for accelerations is 5.22 for the May 31, 1990 earthquake (Ms = 6.1), 4.32 for the May 30, 1990 earthquake (Ms = 6.7), and 3.94 for the August 30, 1986 one (Ms = 7.0). Finally, a comment is made in connection with the U.S. Atomic Energy Commission Regulatory Guide 1.60, "Design Response Spectra for Seismic Design of Nuclear Power Plants," showing that the spectral amplification factors for deep Vrancea earthquakes are larger and different.
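For accelerations, the SAF defined above reduces to the peak 5%-damped spectral absolute acceleration divided by the record's PGA. A minimal sketch of that computation, using a Newmark average-acceleration integrator for the SDOF oscillator (the function names and the synthetic test record are illustrative, not the authors' code):

```python
import math

def sdof_peak_abs_accel(ag, dt, period, zeta=0.05):
    """Peak absolute acceleration of a damped SDOF oscillator under base
    acceleration ag, via the Newmark average-acceleration method (unit mass)."""
    wn = 2.0 * math.pi / period
    c, k = 2.0 * zeta * wn, wn * wn
    keff = k + 2.0 * c / dt + 4.0 / dt ** 2
    u = v = 0.0
    a = -ag[0]                                  # initial relative acceleration
    peak = 0.0
    for g in ag[1:]:
        p_hat = (-g + (4.0 / dt ** 2) * u + (4.0 / dt) * v + a
                 + c * ((2.0 / dt) * u + v))
        u1 = p_hat / keff
        a1 = (4.0 / dt ** 2) * (u1 - u) - (4.0 / dt) * v - a
        v += 0.5 * dt * (a + a1)
        u, a = u1, a1
        peak = max(peak, abs(c * v + k * u))    # |absolute accel| = |c*u' + k*u|
    return peak

def spectral_amplification_factor(ag, dt, periods, zeta=0.05):
    """SAF for accelerations: maximum spectral absolute acceleration over the
    given periods, divided by the record's peak ground acceleration."""
    pga = max(abs(g) for g in ag)
    return max(sdof_peak_abs_accel(ag, dt, T, zeta) for T in periods) / pga
```

For a long harmonic record tuned to one of the periods, the SAF approaches the resonant transmissibility of about 10 at 5% damping, well above the median values of roughly 3-5 reported for the Vrancea records.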
NASA Astrophysics Data System (ADS)
Wyss, B. M.; Wyss, M.
2007-12-01
We estimate that the city of Rangoon and the adjacent provinces (Rangoon, Rakhine, Ayeryarwady, Bago) represent an earthquake risk similar in severity to that of Istanbul and the Marmara Sea region. After the M9.3 Sumatra earthquake of December 2004, which ruptured to a point north of the Andaman Islands, the likelihood of additional ruptures in the direction of Myanmar and within Myanmar has increased. This assumption is especially plausible since M8.2 and M7.9 earthquakes in September 2007 extended the 2005 ruptures to the south. Given the dense population of the aforementioned provinces, and the fact that earthquakes of the M7.5 class have occurred there historically (in 1858, 1895, and three in 1930), it would not be surprising if similar-sized earthquakes occurred in the coming decades. Considering that we predicted the extent of human losses in the M7.6 Kashmir earthquake of October 2005 approximately correctly six months before it occurred, it seems reasonable to attempt to estimate losses in future large to great earthquakes in central Myanmar and along its coast of the Bay of Bengal. We have calculated the expected number of fatalities for two classes of events: (1) M8 ruptures offshore (between the Andaman Islands and the Myanmar coast, and along Myanmar's coast of the Bay of Bengal); (2) M7.5 repeats of the historic earthquakes that occurred in the aforementioned years. These calculations are only order-of-magnitude estimates because all necessary input parameters are poorly known. The population numbers, the condition of the building stock, the regional attenuation law, the local site amplification, and of course the parameters of future earthquakes can only be estimated within wide ranges. For this reason, we give minimum and maximum estimates, both within approximate error limits.
We conclude that the M8 earthquakes located offshore are expected to be less harmful than the M7.5 events on land: for M8 events offshore, the minimum number of fatalities is estimated as 700 ± 200 and the maximum as 13,000 ± 6,000. For repeats of the historic M7.5 or similar earthquakes, the minimum is 4,000 ± 2,000 and the maximum is 63,000 ± 27,000. An exception is a repeat of the M7.5 earthquake of 1895 beneath the capital Rangoon, which is estimated to have a population of about 4.7 million. In the case of a repeat of the 1895 event, a minimum of 100,000 and a maximum of 1 × 10^6 fatalities would have to be expected. The number of injured can in all cases be assumed to be about double the number of fatalities. Although it is not very likely that the 1895 event will be repeated in the same location, it is clear that any medium to large earthquake in the vicinity of Rangoon (at a distance similar to that of the M7.2 earthquake of May 1930) could cause a major disaster with more than 10,000 fatalities. In spite of the uncertainties in these estimates, it is clear that the capital of Myanmar and the provinces surrounding it will likely experience major earthquake disasters in the future, and the probability that these could occur during the next decades is increased. We conclude that major efforts at mitigation, using earthquake engineering techniques, and preparation of seismological early-warning capabilities should be undertaken in and near Rangoon, as well as in other cities with more than 100,000 inhabitants (e.g., Phatein, Bago and Henzada).
Scientific and non-scientific challenges for Operational Earthquake Forecasting
NASA Astrophysics Data System (ADS)
Marzocchi, W.
2015-12-01
Tracking the time evolution of seismic hazard in time windows shorter than the usual 50 years of long-term hazard models may offer additional opportunities to reduce seismic risk. This is the target of operational earthquake forecasting (OEF). During the development of OEF in Italy we identified several challenges that range from pure science to the more practical interface of science with society. From a scientific point of view, although earthquake clustering is the clearest empirical evidence about earthquake occurrence, and OEF clustering models are the most (successfully) tested hazard models in seismology, we note that some seismologists are still reluctant to accept their scientific reliability. After exploring the motivations behind these scientific doubts, we also look into an issue that is often overlooked in this discussion: in any kind of hazard analysis, we do not use a model because it is the true one, but because it is better than anything else we can think of. The non-scientific aspects are mostly related to the fact that OEF usually provides weekly probabilities of large earthquakes smaller than 1%. These probabilities are considered by some seismologists too small to be of interest or use. However, in a recent collaboration with engineers we show that such earthquake probabilities may lead to intolerable individual risk of death. Interestingly, this debate calls for a better definition of the still fuzzy boundaries among the different kinds of expertise required for the whole risk-mitigation process. The last and probably most pressing challenge is communication to the public. In fact, a wrong message can be useless or even counterproductive. Here we show some of the progress that we have made in this field, working with communication experts in Italy.
Celebi, M.
2006-01-01
An integrated seismic monitoring system with a total of 53 channels of accelerometers is now operating in, and at a nearby free-field site of, the 20-story steel-framed Atwood Building in highly seismic Anchorage, Alaska. The building has a single-story basement and a reinforced concrete foundation without piles. The monitoring system comprises a 32-channel structural array and a 21-channel site array. Accelerometers are deployed on 10 levels of the building to assess translational, torsional, and rocking motions, interstory drift (displacement) between selected pairs of adjacent floors, and average drift between floors. The site array, located approximately a city block from the building, comprises seven triaxial accelerometers, one at the surface and six in boreholes ranging in depth from 15 to 200 feet (~5-60 meters). The arrays have already recorded low-amplitude shaking responses of the building and the site caused by numerous earthquakes at distances ranging from tens to a couple of hundred kilometers. Data from an earthquake that occurred 186 km away trace the propagation of waves from the deepest borehole to the roof of the building in approximately 0.5 seconds. Fundamental structural frequencies [0.58 Hz (NS) and 0.47 Hz (EW)], low damping percentages (2-4%), mode coupling, and beating effects are identified. The fundamental site frequency of approximately 1.5 Hz is close to the second modal frequencies (1.83 Hz NS and 1.43 Hz EW) of the building, which may cause resonance of the building. Additional earthquakes confirm the repeatability of these characteristics; however, stronger shaking may alter these conclusions. © 2006, Earthquake Engineering Research Institute.
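Fundamental frequencies such as the 0.58 Hz (NS) value quoted above are typically identified from spectral peaks of the recorded motions. A hedged sketch of naive peak picking via a band-limited discrete Fourier scan (in practice one would use FFT-based spectral ratios of roof to basement records; all names and the synthetic record are illustrative):

```python
import math

def dominant_frequency(x, dt, f_lo=0.1, f_hi=5.0, df=0.01):
    """Scan a frequency band with a discrete Fourier sum and return the
    frequency of maximum amplitude (a minimal peak-picking sketch;
    O(n) per candidate frequency, fine for short records)."""
    best_f, best_amp = f_lo, -1.0
    f = f_lo
    while f <= f_hi + 1e-9:
        re = im = 0.0
        for n, xn in enumerate(x):
            ph = 2.0 * math.pi * f * n * dt
            re += xn * math.cos(ph)
            im -= xn * math.sin(ph)
        amp = math.hypot(re, im)
        if amp > best_amp:
            best_f, best_amp = f, amp
        f += df
    return best_f
```

Applied to a lightly damped free-vibration record, the picked peak sits at the (nearly undamped) modal frequency; the 2-4% damping quoted in the abstract broadens the peak only slightly.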
A Hybrid Ground-Motion Prediction Equation for Earthquakes in Western Alberta
NASA Astrophysics Data System (ADS)
Spriggs, N.; Yenier, E.; Law, A.; Moores, A. O.
2015-12-01
Estimation of ground-motion amplitudes that may be produced by future earthquakes constitutes the foundation of seismic hazard assessment and earthquake-resistant structural design. This is typically done by using a prediction equation that quantifies amplitudes as a function of key seismological variables such as magnitude, distance, and site condition. In this study, we develop a hybrid empirical prediction equation for earthquakes in western Alberta, where evaluation of seismic hazard associated with induced seismicity is of particular interest. We use peak ground motions and response spectra from recorded seismic events to model the regional source and attenuation attributes. The available empirical data are limited in the magnitude range of engineering interest (M>4). Therefore, we combine empirical data with a simulation-based model in order to obtain seismologically informed predictions for moderate-to-large magnitude events. The methodology is two-fold. First, we investigate the shape of geometrical spreading in Alberta. We supplement the seismic data with ground motions obtained from mining/quarry blasts in order to gain insight into the regional attenuation over a wide distance range. A comparison of ground-motion amplitudes for earthquakes and mining/quarry blasts shows that both event types decay at similar rates with distance and demonstrate a significant Moho-bounce effect. In the second stage, we calibrate the source and attenuation parameters of a simulation-based prediction equation to match the available amplitude data from seismic events. We model the geometrical spreading using a trilinear function with attenuation rates obtained from the first stage, and calculate coefficients of anelastic attenuation and site amplification via regression analysis. This provides a hybrid ground-motion prediction equation that is calibrated against observed motions in western Alberta and is applicable to moderate-to-large magnitude events.
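A trilinear geometrical-spreading term of the kind described, with a flat or rising middle segment representing the Moho bounce, can be sketched as follows. The hinge distances and slopes below are placeholder assumptions for illustration, not the calibrated values of this study:

```python
import math

def trilinear_spreading(r, r1=50.0, r2=140.0, b1=-1.3, b2=0.2, b3=-0.5, rref=1.0):
    """log10 geometric-spreading term of a trilinear model: decay at slope b1
    out to hinge r1 (km), slope b2 across the Moho-bounce zone (r1..r2),
    slope b3 beyond r2. Continuity is enforced at both hinges.
    All hinge distances and slopes here are illustrative assumptions."""
    if r <= r1:
        return b1 * math.log10(r / rref)
    g1 = b1 * math.log10(r1 / rref)          # value at the first hinge
    if r <= r2:
        return g1 + b2 * math.log10(r / r1)
    return g1 + b2 * math.log10(r2 / r1) + b3 * math.log10(r / r2)
```

The positive middle slope is what produces the characteristic flattening (or bump) in amplitude-versus-distance curves at Moho-bounce distances before far-field decay resumes.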
Lessons of L'Aquila for Operational Earthquake Forecasting
NASA Astrophysics Data System (ADS)
Jordan, T. H.
2012-12-01
The L'Aquila earthquake of 6 Apr 2009 (magnitude 6.3) killed 309 people and left tens of thousands homeless. The mainshock was preceded by a vigorous seismic sequence that prompted informal earthquake predictions and evacuations. In an attempt to calm the population, the Italian Department of Civil Protection (DPC) convened its Commission on the Forecasting and Prevention of Major Risk (MRC) in L'Aquila on 31 March 2009 and issued statements about the hazard that were widely received as an "anti-alarm"; i.e., a deterministic prediction that there would not be a major earthquake. On October 23, 2012, a court in L'Aquila convicted the vice-director of DPC and six scientists and engineers who attended the MRC meeting on charges of criminal manslaughter, and it sentenced each to six years in prison. A few weeks after the L'Aquila disaster, the Italian government convened an International Commission on Earthquake Forecasting for Civil Protection (ICEF) with the mandate to assess the status of short-term forecasting methods and to recommend how they should be used in civil protection. The ICEF, which I chaired, issued its findings and recommendations on 2 Oct 2009 and published its final report, "Operational Earthquake Forecasting: Status of Knowledge and Guidelines for Implementation," in Aug 2011 (www.annalsofgeophysics.eu/index.php/annals/article/view/5350). As defined by the Commission, operational earthquake forecasting (OEF) involves two key activities: the continual updating of authoritative information about the future occurrence of potentially damaging earthquakes, and the officially sanctioned dissemination of this information to enhance earthquake preparedness in threatened communities. 
Among the main lessons of L'Aquila is the need to separate the role of science advisors, whose job is to provide objective information about natural hazards, from that of civil decision-makers who must weigh the benefits of protective actions against the costs of false alarms and failures-to-predict. The best way to achieve this separation is to use probabilistic rather than deterministic statements in characterizing short-term changes in seismic hazards. The ICEF recommended establishing OEF systems that can provide the public with open, authoritative, and timely information about the short-term probabilities of future earthquakes. Because the public needs to be educated into the scientific conversation through repeated communication of probabilistic forecasts, this information should be made available at regular intervals, during periods of normal seismicity as well as during seismic crises. In an age of nearly instant information and high-bandwidth communication, public expectations regarding the availability of authoritative short-term forecasts are rapidly evolving, and there is a greater danger that information vacuums will spawn informal predictions and misinformation. L'Aquila demonstrates why the development of OEF capabilities is a requirement, not an option.
GEM - The Global Earthquake Model
NASA Astrophysics Data System (ADS)
Smolka, A.
2009-04-01
Over 500,000 people died in the last decade due to earthquakes and tsunamis, mostly in the developing world, where the risk is increasing due to rapid population growth. In many seismic regions, no hazard and risk models exist, and even where models do exist, they are intelligible only to experts, or available only for commercial purposes. The Global Earthquake Model (GEM) answers the need for an openly accessible risk management tool. GEM is an internationally sanctioned public-private partnership initiated by the Organisation for Economic Cooperation and Development (OECD) which will establish an authoritative standard for calculating and communicating earthquake hazard and risk, and will be designed to serve as the critical instrument to support decisions and actions that reduce earthquake losses worldwide. GEM will integrate developments at the forefront of scientific and engineering knowledge of earthquakes, at global, regional, and local scales. The work is organized in three modules: hazard, risk, and socio-economic impact. The hazard module calculates probabilities of earthquake occurrence and resulting shaking at any given location. The risk module calculates fatalities, injuries, and damage based on expected shaking, building vulnerability, and the distribution of population and of exposed values and facilities. The socio-economic impact module delivers tools for making educated decisions to mitigate and manage risk. GEM will be a versatile online tool, with open source code and a map-based graphical interface. The underlying data will be open wherever possible, and its modular input and output will be adapted to multiple user groups: scientists and engineers, risk managers and decision makers in the public and private sectors, and the public at large. GEM will be the first global model for seismic risk assessment at a national and regional scale, and aims to achieve broad scientific participation and independence.
Its development will occur in a coordinated global network of regional centers, with a high degree of interaction among the centers and the central secretariat. Broad acceptance of the models will be ensured by including local knowledge in all aspects of hazard and risk assessment and by securing the participation of local experts throughout development. All GEM efforts will be carried out using a common global software infrastructure and consensus standards. In accordance with principles of open-source development, and to ensure comprehensive global representation, contributions are welcomed and encouraged from a broad group of participants. To ensure uniformity and conformance with the highest scientific standards, all contributions, including models, tools, and data, will be rigorously vetted and independently tested. Recently the EUCENTRE in Pavia, Italy, was selected as the host institution of the GEM secretariat. The project will formally launch in early 2009 with the creation of the non-profit GEM foundation. While GEM serves a humanitarian imperative, it is also seen as offering a key to long-term economic development. GEM will enhance risk awareness at global, national, and local scales. Greater risk awareness is a precondition for motivating public and private parties to invest in risk reduction and loss prevention, and for promoting greater use of financial risk-transfer instruments.
Newell, Wayne L.; Stone, B.; Harrison, R.
2004-01-01
Holocene alluvium of the Pedhicos River around Lefkosia (Nicosia), Cyprus, was studied. The alluvial stratigraphy was found to comprise serial flood deposits underlying river terraces and an extensive alluvial fan. It was found that the stratigraphy and geomorphology of the alluvium can be interpreted to distinguish not only the effects of climate change, but also those of land-use change and the impact of particular engineering works. It was suggested that details of the physical properties of the flood-deposit sequences and paleosols can contribute to modeling various geophysical and engineering properties and to predicting the response to vertical acceleration during earthquakes.
Performance of Buildings in the 2009 Western Sumatra Earthquake
NASA Astrophysics Data System (ADS)
Deierlein, G.; Hart, T.; Alexander, N.; Hausler, E.; Henderson, S.; Wood, K.; Cedillos, V.; Wijanto, S.; Cabrera, C.; Rudianto, S.
2009-12-01
The M7.6 earthquake of 30 September 2009 in Western Sumatra, Indonesia caused significant damage and collapse to hundreds of buildings and the deaths of 1,117 people. In Padang City, with a population of about 900,000 people, building collapse was the primary cause of deaths and serious injuries (313 deaths and 431 serious injuries). The predominant building construction types in Padang are concrete moment frames with brick infill and masonry bearing wall systems. Concrete frames are common in multistory commercial retail buildings, offices, schools, and hotels; masonry bearing wall systems are primarily used in low-rise (usually single story) residential and school buildings. In general, buildings that collapsed did not conform to modern seismic engineering practices that are required by the current Indonesian building code and would be expected in regions of moderate to high seismicity. While collapse of multi-story concrete buildings was more prevalent in older buildings (more than 10 years old), several newer buildings also collapsed. Primary deficiencies identified in collapsed or severely damaged buildings included: (a) soft or weak stories that failed either by sidesway mechanisms or by shear failures followed by loss of axial capacity of columns, (b) lack of ductile reinforcing bar detailing in concrete beams, columns, and beam-column joints, (c) poor quality concrete and mortar materials and workmanship, (d) vulnerable building configurations and designs with incomplete or deficient load paths, and (e) out-of-plane wall failures in unreinforced (or marginally reinforced) masonry. While these deficiencies may be expected in older buildings, damage and collapse of some modern (or recently renovated) buildings indicates a lack of enforcement of building code provisions for design and construction quality assurance.
Many new buildings whose structural systems were undamaged were closed due to extensive earthquake damage to brick infill walls, glass facades, ceiling systems, and other architectural finishes. These closures demonstrated the importance of considering deformation compatibility and seismic demands in the design and detailing of architectural elements and non-structural components. Another important lesson learned from this earthquake is the critical role that buildings serve for vertical evacuation (refuge) from tsunami inundation in Padang and similar coastal cities in regions of high tsunami hazard. Severe traffic congestion immediately after the September 30 earthquake demonstrated that horizontal evacuation alone is insufficient to safely evacuate Padang City residents to high ground. Therefore, efforts must be stepped up to pre-screen, assess, and engineer buildings that can be utilized for vertical evacuation.
Building configuration and seismic design: The architecture of earthquake resistance
NASA Astrophysics Data System (ADS)
Arnold, C.; Reitherman, R.; Whitaker, D.
1981-05-01
The architecture of a building is examined in relation to its ability to withstand earthquakes. Aspects of ground motion that are significant to building behavior are discussed. Results of a survey of configuration decisions that affect the performance of buildings are provided, with a focus on the architectural aspects of configuration design. Configuration derivation, building type as it relates to seismic design, and seismic issues in the design process are examined. Case studies of the Veterans' Administration Hospital in Loma Linda, California, and the Imperial Hotel in Tokyo, Japan, are presented. The seismic design process is described, paying special attention to configuration issues. The need is stressed for guidelines, codes, and regulations to ensure design solutions that respect and balance the full range of architectural, engineering, and material influences on seismic hazards.
Porter, K.A.; Jaiswal, K.S.; Wald, D.J.; Greene, M.; Comartin, Craig
2008-01-01
The U.S. Geological Survey’s Prompt Assessment of Global Earthquakes for Response (PAGER) Project and the Earthquake Engineering Research Institute’s World Housing Encyclopedia (WHE) are creating a global database of building stocks and their earthquake vulnerability. The WHE already represents a growing, community-developed public database of global housing and its detailed structural characteristics. It currently contains more than 135 reports on particular housing types in 40 countries. The WHE-PAGER effort extends the WHE in several ways: (1) by addressing non-residential construction; (2) by quantifying the prevalence of each building type in both rural and urban areas; (3) by addressing day and night occupancy patterns; (4) by adding quantitative vulnerability estimates from judgment or statistical observation; and (5) by analytically deriving alternative vulnerability estimates using, in part, laboratory testing.
Charles Darwin's earthquake reports
NASA Astrophysics Data System (ADS)
Galiev, Shamil
2010-05-01
2009 marked the 200th anniversary of Darwin's birth and 170 years since the publication of his book Journal of Researches. During the voyage Darwin landed at Valdivia and Concepcion, Chile, just before, during, and after a great earthquake, which demolished hundreds of buildings, killing and injuring many people. The land waved, lifted, and cracked; volcanoes awoke; and giant ocean waves attacked the coast. Darwin was the first geologist to observe and describe the effects of a great earthquake during and immediately after it. These effects are sometimes repeated during severe earthquakes; but great earthquakes, like Chile 1835, and giant earthquakes, like Chile 1960, are rare and remain completely unpredictable. This is one of the few areas of science where experts remain largely in the dark. Darwin suggested that the effects were a result of ‘…the rending of strata, at a point not very deep below the surface of the earth…' and ‘…when the crust yields to the tension, caused by its gradual elevation, there is a jar at the moment of rupture, and a greater movement...'. Darwin formulated big ideas about the Earth's evolution and its dynamics. These ideas set the tone for the plate tectonic theory to come. However, plate tectonics does not completely explain why earthquakes occur within plates. Darwin emphasised that there are different kinds of earthquakes: ‘...I confine the foregoing observations to the earthquakes on the coast of South America, or to similar ones, which seem generally to have been accompanied by elevation of the land. But, as we know that subsidence has gone on in other quarters of the world, fissures must there have been formed, and therefore earthquakes...' (we cite Darwin's sentences following researchspace.auckland.ac.nz/handle/2292/4474). These thoughts agree with the results of recent publications (see Nature 461, 870-872 and 636-639; 462, 42-43 and 87-89).
About 200 years ago Darwin anticipated problems that have begun to be discussed only recently. Earthquakes often precede volcanic eruptions. According to Darwin, the earthquake-induced shock may be a common mechanism of the simultaneous eruptions of volcanoes separated by long distances. In particular, Darwin wrote that ‘… the elevation of many hundred square miles of territory near Concepcion is part of the same phenomenon, with that splashing up, if I may so call it, of volcanic matter through the orifices in the Cordillera at the moment of the shock;…'. According to Darwin, the crust is a system in which fractured zones and zones of seismic and volcanic activity interact. Darwin formulated the task of considering together the processes now studied as seismology and volcanology. However, the difficulties are such that the study of interactions between earthquakes and volcanoes began only recently, and his works on this had relatively little impact on the development of the geosciences. In this report, we discuss how the latest data on seismic and volcanic events support Darwin's observations and ideas about the 1835 Chilean earthquake. The material from researchspace.auckland.ac.nz/handle/2292/4474 is used. We show how modern mechanical tests from impact engineering and simple experiments with weakly cohesive materials also support his observations and ideas. On the other hand, we have developed a mathematical theory of earthquake-induced catastrophic wave phenomena. This theory allows us to explain the most important aspects of Darwin's earthquake reports. This is achieved through the simplification of the fundamental governing equations of the problems considered into strongly nonlinear wave equations. Solutions of these equations are constructed with the help of analytic and numerical techniques. The solutions can model different strongly nonlinear wave phenomena which arise in a variety of physical contexts.
A comparison with relevant experimental observations is also presented.
Rapid estimation of the economic consequences of global earthquakes
Jaiswal, Kishor; Wald, David J.
2011-01-01
The U.S. Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system, operational since mid-2007, rapidly estimates the most affected locations and the population exposure at different levels of shaking intensity. The PAGER system has significantly improved the way aid agencies determine the scale of response needed in the aftermath of an earthquake. For example, the PAGER exposure estimates provided reasonably accurate assessments of the scale and spatial extent of the damage and losses following the 2008 Wenchuan earthquake (Mw 7.9) in China, the 2009 L'Aquila earthquake (Mw 6.3) in Italy, the 2010 Haiti earthquake (Mw 7.0), and the 2010 Chile earthquake (Mw 8.8). Nevertheless, some engineering and seismological expertise is often required to digest PAGER's exposure estimate and turn it into estimated fatalities and economic losses. This has been the focus of PAGER's most recent development. With the new loss-estimation component of the PAGER system it is now possible to produce rapid estimates of expected fatalities for global earthquakes (Jaiswal and others, 2009). While an estimate of earthquake fatalities is a fundamental indicator of potential human consequences in developing countries (for example, Iran, Pakistan, Haiti, Peru, and many others), economic consequences often drive the responses in much of the developed world (for example, New Zealand, the United States, and Chile), where the improved structural behavior of seismically resistant buildings significantly reduces earthquake casualties. Rapid availability of estimates of both fatalities and economic losses can be a valuable resource. The total time needed to determine the actual scope of an earthquake disaster and to respond effectively varies from country to country. It can take days or sometimes weeks before the damage and consequences of a disaster can be understood both socially and economically. The objective of the U.S.
Geological Survey's PAGER system is to reduce this time gap to more rapidly and effectively mobilize response. We present here a procedure to rapidly and approximately ascertain the economic impact immediately following a large earthquake anywhere in the world. In principle, the approach presented is similar to the empirical fatality estimation methodology proposed and implemented by Jaiswal and others (2009). In order to estimate economic losses, we need an assessment of the economic exposure at various levels of shaking intensity. The economic value of all the physical assets exposed at different locations in a given area is generally not known and is extremely difficult to compile at a global scale. In the absence of such a dataset, we first estimate the total Gross Domestic Product (GDP) exposed at each shaking intensity by multiplying the per-capita GDP of the country by the total population exposed at that shaking intensity level. We then scale the total GDP estimated at each intensity by an exposure correction factor, a multiplying factor that accounts for the disparity between wealth and/or economic assets and annual GDP. The economic exposure obtained using this procedure is thus a proxy estimate for the economic value of the actual inventory that is exposed to the earthquake. The economic loss ratio, defined in terms of a country-specific lognormal cumulative distribution function of shaking intensity, is derived and calibrated against the losses from past earthquakes. This report describes the development of a country- or region-specific economic loss ratio model using economic loss data available for global earthquakes from 1980 to 2007. The proposed model is a potential candidate for directly estimating economic losses within the currently operating PAGER system.
PAGER's other loss models use indirect methods that require substantially more data (such as building/asset inventories, vulnerabilities, and the asset values exposed at the time of earthquake) to implement on a global basis and will thus take more time to develop and implement within the PAGER system.
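The loss-ratio procedure described above (per-capita GDP times exposed population, scaled by an exposure correction factor and a country-specific lognormal loss-ratio curve of shaking intensity) can be sketched as follows. The parameter values theta, beta, and the correction factor are illustrative assumptions, not PAGER's calibrated coefficients:

```python
import math

def loss_ratio(mmi, theta=12.0, beta=0.3):
    """Economic loss ratio as a lognormal CDF of shaking intensity:
    Phi(ln(mmi/theta)/beta). theta and beta are illustrative, not the
    country-specific calibrated values of the report."""
    z = math.log(mmi / theta) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def economic_loss(exposure_by_mmi, gdp_per_capita, correction=2.5):
    """Sum over intensity bins: population * per-capita GDP * exposure
    correction factor (wealth vs. annual GDP) * intensity-dependent
    loss ratio. Inputs are illustrative."""
    return sum(pop * gdp_per_capita * correction * loss_ratio(mmi)
               for mmi, pop in exposure_by_mmi.items())
```

A dictionary mapping intensity bins to exposed population, as produced by a ShakeMap-style exposure calculation, is all the procedure needs beyond the two country-level scalars.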
Maximum spectral demands in the near-fault region
Huang, Y.-N.; Whittaker, A.S.; Luco, N.
2008-01-01
The Next Generation Attenuation (NGA) relationships for shallow crustal earthquakes in the western United States predict a rotated geometric mean of horizontal spectral demand, termed GMRotI50, and not maximum spectral demand. Differences between strike-normal, strike-parallel, geometric-mean, and maximum spectral demands in the near-fault region are investigated using 147 pairs of records selected from the NGA strong motion database. The selected records are for earthquakes with moment magnitude greater than 6.5 and for closest site-to-fault distance less than 15 km. Ratios of maximum spectral demand to NGA-predicted GMRotI50 for each pair of ground motions are presented. The ratio shows a clear dependence on period and the Somerville directivity parameters. Maximum demands can substantially exceed NGA-predicted GMRotI50 demands in the near-fault region, which has significant implications for seismic design, seismic performance assessment, and the next-generation seismic design maps. Strike-normal spectral demands are a significantly unconservative surrogate for maximum spectral demands for closest distance greater than 3 to 5 km. Scale factors that transform NGA-predicted GMRotI50 to a maximum spectral demand in the near-fault region are proposed. © 2008, Earthquake Engineering Research Institute.
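The orientation dependence at the heart of the GMRotI50-versus-maximum comparison comes from rotating the two horizontal components through all azimuths. The sketch below computes RotD50/RotD100-style orientation statistics (median and maximum peak over rotation angles); GMRotI50 itself additionally uses a geometric mean of the rotated pair and a period-independent rotation, which is omitted here for brevity:

```python
import math

def rotated_peaks(x, y, n_angles=90):
    """Peak amplitude of the rotated horizontal component for each angle:
    r(t; theta) = x(t)*cos(theta) + y(t)*sin(theta)."""
    peaks = []
    for k in range(n_angles):
        th = math.pi * k / n_angles          # 0..180 deg covers all orientations
        c, s = math.cos(th), math.sin(th)
        peaks.append(max(abs(c * xi + s * yi) for xi, yi in zip(x, y)))
    return peaks

def rot_d50_and_d100(x, y):
    """Median (RotD50-like) and maximum (RotD100-like) peak over orientations."""
    p = sorted(rotated_peaks(x, y))
    n = len(p)
    med = p[n // 2] if n % 2 else 0.5 * (p[n // 2 - 1] + p[n // 2])
    return med, p[-1]
```

Applying this to oscillator response histories at each period, rather than raw accelerations, yields the orientation-dependent spectral demands whose ratios the abstract tabulates.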
A new method to assess damage to RCMRFs from period elongation and Park-Ang damage index using IDA
NASA Astrophysics Data System (ADS)
Aghagholizadeh, Mehrdad; Massumi, Ali
2016-09-01
Despite significant progress in loading and design codes for seismic-resistant structures and technological improvements in building construction, civil engineering still faces critical challenges. One such challenge is assessing the state of damage imposed on a structure by earthquakes of different intensities. Quick assessment of damage after an earthquake is crucial for determining a structure's operability and its resistance to probable future earthquakes. Present methods for calculating damage to structures are time-consuming and do not accurately quantify the degree of damage. Damage estimation is an important task in the fields of structural health monitoring and decision-making. This study examines the relationship between period elongation and the Park-Ang damage index. A dynamic non-linear analysis is performed with the IDARC program to calculate the amount of damage and the period of the structure in its current state. This new method is shown to be a quick and accurate technique for damage assessment: the period of an existing structure is easy to calculate, and changes in the period reflect changes in the stiffness matrix.
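For reference, the Park-Ang index combines a peak-deformation term with a hysteretic-energy term; the sketch below uses the standard formulation, with the numerical inputs and the beta value purely illustrative. A companion helper shows why period elongation tracks damage: for a single-degree-of-freedom idealization T is proportional to 1/sqrt(k), so a measured period ratio implies a global stiffness loss.

```python
def park_ang_index(delta_max, delta_ult, hysteretic_energy, yield_force, beta=0.1):
    """Park-Ang damage index: DI = d_max/d_ult + beta * E_h / (F_y * d_ult).

    delta_max: peak deformation demand
    delta_ult: ultimate deformation capacity under monotonic loading
    hysteretic_energy: cumulative dissipated hysteretic energy
    yield_force: yield strength
    beta: calibration parameter (values near 0.05-0.15 are typical for RC;
          the default here is illustrative)
    """
    return delta_max / delta_ult + beta * hysteretic_energy / (yield_force * delta_ult)

def stiffness_loss_from_period(t_initial, t_damaged):
    """Global stiffness loss implied by period elongation, since T ~ 1/sqrt(k):
    k_damaged / k_initial = (T_initial / T_damaged)**2."""
    return 1.0 - (t_initial / t_damaged) ** 2
```

DI near 0 indicates no damage and DI of 1.0 or more is conventionally read as collapse-level damage; the study's contribution is relating this index to the easily measured period change.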
Development of regional liquefaction-induced deformation hazard maps
Rosinski, A.; Knudsen, K.-L.; Wu, J.; Seed, R.B.; Real, C.R.; ,
2004-01-01
This paper describes part of a project to assess the feasibility of producing regional (1:24,000-scale) liquefaction hazard maps that are based on potential liquefaction-induced deformation. The study area is the central Santa Clara Valley, at the south end of San Francisco Bay in Central California. The information collected and used includes: a) detailed Quaternary geological mapping, b) over 650 geotechnical borings, c) probabilistic earthquake shaking information, and d) ground-water levels. Predictions of strain can be made using either empirical formulations or numerical simulations. In this project, lateral spread displacements are estimated and new empirical relations to estimate future volumetric and shear strain are used. Geotechnical boring data are used to: (a) develop isopach maps showing the thickness of sediment that is likely to liquefy and deform under earthquake shaking; and (b) assess the variability in engineering properties within and between geologic map units. Preliminary results reveal that late Holocene deposits are likely to experience the greatest liquefaction-induced strains, while Holocene and late Pleistocene deposits are likely to experience significantly less horizontal and vertical strain in future earthquakes. Development of maps based on these analyses is feasible.
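The deformation estimate behind such maps can be summarized as a thickness-weighted sum over the layers of a boring log; the per-layer strain values would come from empirical volumetric- and shear-strain relations like those mentioned above, and the numbers here are hypothetical.

```python
def liquefaction_deformation(layers):
    """Free-field settlement and lateral displacement as thickness-weighted
    sums over the liquefiable layers of one boring.

    layers: list of (thickness_m, volumetric_strain, shear_strain) tuples,
    strains expressed as decimal fractions (hypothetical inputs here).
    """
    settlement = sum(t * ev for t, ev, _ in layers)  # vertical: thickness x volumetric strain
    lateral = sum(t * gs for t, _, gs in layers)     # horizontal: thickness x shear strain
    return settlement, lateral
```

Mapping would repeat this calculation for each boring, then interpolate within geologic map units, which is why the between-unit variability assessment in item (b) matters.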
Simulation of ground motion using the stochastic method
Boore, D.M.
2003-01-01
A simple and powerful method for simulating ground motions is to combine parametric or functional descriptions of the ground motion's amplitude spectrum with a random phase spectrum modified such that the motion is distributed over a duration related to the earthquake magnitude and to the distance from the source. This method of simulating ground motions often goes by the name "the stochastic method." It is particularly useful for simulating the higher-frequency ground motions of most interest to engineers (generally, f>0.1 Hz), and it is widely used to predict ground motions for regions of the world in which recordings of motion from potentially damaging earthquakes are not available. This simple method has been successful in matching a variety of ground-motion measures for earthquakes with seismic moments spanning more than 12 orders of magnitude and in diverse tectonic environments. One of the essential characteristics of the method is that it distills what is known about the various factors affecting ground motions (source, path, and site) into simple functional forms. This provides a means by which the results of the rigorous studies reported in other papers in this volume can be incorporated into practical predictions of ground motion.
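A minimal realization of the random-phase idea described above: sum cosines whose amplitudes follow an omega-squared (Brune-type) source shape and whose phases are uniformly random. Path attenuation, site terms, and the magnitude- and distance-dependent duration window of the full stochastic method are omitted, and the corner frequency is an assumed input.

```python
import math
import random

def source_shape(f, f_corner):
    """Omega-squared acceleration spectral shape: proportional to f**2 below
    the corner frequency, flat above it (path and site terms omitted)."""
    return f ** 2 / (1.0 + (f / f_corner) ** 2)

def stochastic_motion(duration, dt, f_corner, seed=0):
    """Random-phase summation: a(t) = sum_k A(f_k) * cos(2*pi*f_k*t + phi_k),
    with phases uniform on [0, 2*pi). In the full method the duration is tied
    to magnitude and distance and the series is shaped by a window function."""
    rng = random.Random(seed)
    n = int(round(duration / dt))
    df = 1.0 / duration                                   # frequency spacing
    freqs = [k * df for k in range(1, n // 2)]
    phases = [rng.uniform(0.0, 2.0 * math.pi) for _ in freqs]
    return [sum(source_shape(f, f_corner) * math.cos(2.0 * math.pi * f * i * dt + p)
                for f, p in zip(freqs, phases))
            for i in range(n)]
```

Because only the phases are random, each realization has the target amplitude spectrum on average; practical implementations generate many realizations (or work with random vibration theory) to estimate expected peak motions.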
1989-10-17
An automobile lies crushed under the third story of this apartment building in the Marina District after the Oct. 17, 1989, Loma Prieta earthquake. The ground levels are no longer visible because of structural failure and sinking due to liquefaction. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions, such as earthquakes, or when powders are handled in industrial processes. Mechanics of Granular Materials (MGM) experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grain materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. Credit: J.K. Nakata, U.S. Geological Survey.
Harp, Edwin L.; Jibson, Randall W.
2002-01-01
Anomalously high concentrations of rock falls were triggered in Pacoima Canyon (Los Angeles, California) during the 1994 Northridge earthquake. Similar concentrations were also documented from the 1971 San Fernando earthquake. Using an engineering rock-mass classification that evaluates the susceptibility of rock slopes to seismic failure based on the fracture properties of a rock mass (in terms of a numerical "Q-value" that describes rock quality), the rock slopes in Pacoima Canyon were compared with rock slopes in surrounding areas where topography and lithology are similar, but rock-fall concentrations from the earthquakes were much lower. A statistical comparison of Q-values from five sites surrounding Pacoima Canyon indicates that seismic susceptibilities are similar to those within Pacoima Canyon; differences in the characteristics of rock slopes between these sites are not sufficient to account for the relatively high concentrations of rock falls within Pacoima Canyon as compared to low concentrations elsewhere. With susceptibility differences eliminated as a cause, the most likely explanation for the differences in rock-fall concentrations is anomalously high shaking levels in Pacoima Canyon, possibly resulting from topographic amplification within the canyon.
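For context, the "Q-value" in such rock-mass classifications follows the multiplicative Barton Q-system form shown below; seismic rock-slope susceptibility ratings like the one in this study are typically modifications of it, so this sketch gives only the classical formula with illustrative inputs.

```python
def barton_q(rqd, jn, jr, ja, jw, srf):
    """Barton Q-system rock-mass quality:
    Q = (RQD/Jn) * (Jr/Ja) * (Jw/SRF),
    i.e. relative block size, inter-block shear strength, and active stress.
    Lower Q means more fractured, weaker rock and hence, in seismic
    adaptations of the system, higher rock-fall susceptibility."""
    return (rqd / jn) * (jr / ja) * (jw / srf)
```

The study's statistical comparison amounts to showing that Q-value distributions inside and outside Pacoima Canyon overlap, which is what rules out fracture properties as the explanation.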
The utilization of brick walls for resisting earthquake in building technology
NASA Astrophysics Data System (ADS)
Tarigan, J.; Benedicta, C.
2018-03-01
Many structures in Indonesia use reinforced concrete frames with brick walls as infill. Engineers commonly treat brick walls as partitions and count them as non-structural elements in the structural design. However, brick walls are capable of resisting earthquakes by adding high stiffness to the structure, provided the walls are well integrated with the frames. This reduces non-structural damage, one of the most frequent impacts of earthquakes. This paper examines the effects of treating brick walls as structural elements by comparing such a structure with one in which the brick walls are treated only as partitions. The brick walls are modeled with the equivalent strut method, while the seismic analysis uses the response spectrum method. Utilizing the brick walls reduces the natural period by 42%, the structural displacements by 53% in the X-direction and 67% in the Y-direction, and the story drifts by 57% in the X-direction and 71% in the Y-direction, while increasing the base shear by only up to 3% in the X-direction and 7% in the Y-direction.
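The period reduction reported above follows directly from adding infill stiffness in parallel with the frame. The sketch below uses a single-degree-of-freedom idealization (T = 2*pi*sqrt(m/k)); the stiffness values are illustrative, not taken from the paper.

```python
import math

def natural_period(mass, stiffness):
    """SDOF natural period T = 2*pi*sqrt(m/k)."""
    return 2.0 * math.pi * math.sqrt(mass / stiffness)

def period_reduction(mass, frame_k, infill_k):
    """Fractional period drop when infill stiffness acts in parallel
    with the bare frame (stiffnesses add for parallel load paths)."""
    t_bare = natural_period(mass, frame_k)
    t_infilled = natural_period(mass, frame_k + infill_k)
    return 1.0 - t_infilled / t_bare
```

For instance, an infill stiffness about twice the bare-frame stiffness gives a period drop of 1 - 1/sqrt(3), roughly 42%, the same order as the reduction reported in the abstract.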
DOE Office of Scientific and Technical Information (OSTI.GOV)
Biasi, Glenn; Anderson, John G
The parameter kappa was defined by Anderson and Hough (1984) to describe the high-frequency spectral roll-off of the strong-motion seismic spectrum. In the work of Su et al. (1996), the numerical value of kappa estimated for sites near Yucca Mountain was small (~20 ms). The estimate obtained from these events has been applied through a rigorous methodology to develop design earthquake spectra for magnitudes over 5.0. Smaller values of kappa lead to higher estimated ground motions in the methodology used by the Probabilistic Seismic Hazard Analysis (PSHA) for Yucca Mountain. An increase of 10 ms in kappa could result in a substantial decrease in the high-frequency level of the predicted ground motions. Any parameter that plays such a critical role deserves close examination. Here, we study kappa and its associated uncertainties. The data set used by Su et al. (1996) consisted of 12 M 2.8 to 4.5 earthquakes recorded at temporary stations deployed after the June 1992 Little Skull Mountain earthquake. The kappa elements of that study were revisited by Anderson and Su (MOL.20071203.0134) and substantially confirmed. One weakness of those studies is the limited data used: few of the stations were on tuff or on Yucca Mountain itself. A decade of Southern Great Basin Digital Seismic Network (SGBDSN) recording has now yielded a larger body of on-scale, well-calibrated digital ground motion records suitable for investigating kappa. We use the SGBDSN data to check some of the original assumptions, improve the statistical confidence of the conclusions, and determine values of kappa for stations on or near Yucca Mountain. The outstanding issues in kappa analysis, as they apply to Yucca Mountain, include: 1. The number itself. The kappa estimate near 20 msec from Su et al. (1996) and Anderson and Su (MOL.20071203.0134) is markedly smaller than is considered typical in California (Silva, 1995).
The low kappa value has engineering consequences because, when it is applied in the ground motion analyses used in PSHA, it contributes to the extreme values of peak ground acceleration that the PSHA predicts. Also, in some areas precarious rock evidence indicates that no such accelerations have occurred. 2. The disagreement among analyses in the value of kappa. Previous reports indicate that the smallest earthquakes yield kappa estimates 12-20 msec larger than the average values from M 3 to M 4.5 aftershocks of the Little Skull Mountain earthquake. 3. The source of kappa. Classically and in engineering usage, kappa is attributed largely to the upper tens or hundreds of meters at the recording site. However, borehole recordings imply that a significant contribution to kappa originates below several hundred meters depth. Also, when earthquakes are considered from a small source region, a true site effect should be common to all recordings. In fact, kappa observations of LSM aftershocks at stations on Yucca Mountain and at network stations appear to vary greatly, as though much of kappa actually derives from near the seismic source. 4. The repository overburden contribution to kappa. The PSHA estimated ground motions at a hypothetical free surface at 300 meters depth with the properties of confined rock at that depth. Rock-mechanical and borehole estimates suggest that several milliseconds of the total kappa accrue between 300 meters depth and the surface. If estimates of kappa are small at the surface, little is left to reduce incident ground accelerations from the seismic source to the repository level. 5. The variability of kappa. In most cases, parametric estimates of kappa have some range of values that fit the data equally well in a statistical sense, so errors in kappa estimates must be addressed. As noted above, kappa at a station also varies significantly for events from the same source area. 6.
Are kappa values from small to moderate magnitude earthquakes appropriately applied to the larger, potentially damaging earthquakes of engineering concern? Put another way, is there a significant magnitude dependence in kappa? Questions 1 and 6 are of primary importance, but we find answers to several others in the course of our study. Data from southern Nevada are capable of resolving only some of these questions. In a global search, we identified data from the Japanese borehole accelerometer array KiK-net as most likely to address the questions of the shallow-site structural contribution to kappa and the usefulness of moderate earthquake kappa estimates for predicting strong ground motion.
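The Anderson and Hough (1984) definition referenced above makes kappa straightforward to estimate: above a threshold frequency the acceleration spectrum decays as A(f) = A0 * exp(-pi * kappa * f), so kappa is the negative slope of ln A versus f divided by pi. A least-squares sketch (the spectral values here would be synthetic or measured inputs, not SGBDSN data):

```python
import math

def estimate_kappa(freqs, amps):
    """Fit ln A(f) = ln A0 - pi * kappa * f by ordinary least squares over
    the high-frequency band and return kappa = -slope / pi.

    freqs: frequencies (Hz) above the roll-off onset frequency f_E
    amps: corresponding acceleration spectral amplitudes (positive)
    """
    n = len(freqs)
    ln_a = [math.log(a) for a in amps]
    f_mean = sum(freqs) / n
    l_mean = sum(ln_a) / n
    slope = (sum((f - f_mean) * (l - l_mean) for f, l in zip(freqs, ln_a))
             / sum((f - f_mean) ** 2 for f in freqs))
    return -slope / math.pi
```

The band-selection step (choosing f_E and the upper frequency limit) is one source of the estimate-range issue raised in item 5: different defensible bands yield different kappa values from the same record.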
Converting Advances in Seismology into Earthquake Science
NASA Astrophysics Data System (ADS)
Hauksson, Egill; Shearer, Peter; Vidale, John
2004-01-01
Federal and state agencies and university groups all operate seismic networks in California. The U.S. Geological Survey (USGS) operates seismic networks in California in cooperation with the California Institute of Technology (Caltech) in southern California, and the University of California (UC) at Berkeley in northern California. The California Geological Survey (CGS) and the USGS National Strong Motion Program (NSMP) operate dial-out strong motion instruments in the state, primarily to capture data from large earthquakes for earthquake engineering and, more recently, emergency response. The California Governor's Office of Emergency Services (OES) provides leadership for the most recent project, the California Integrated Seismic Network (CISN), to integrate all of the California efforts, and to take advantage of the emergency response capabilities of the seismic networks. The core members of the CISN are Caltech, UC Berkeley, CGS, USGS Menlo Park, and USGS Pasadena (http://www.cisn.org). New seismic instrumentation is in place across southern California, and significant progress has been made in improving instrumentation in northern California. Since 2001, these new field instrumentation efforts, data sharing, and software development for real-time reporting and archiving have been coordinated through the California Integrated Seismic Network (CISN). The CISN is also the California region of the Advanced National Seismic Network (ANSS). In addition, EarthScope deployments of USArray that will begin in early 2004 in California are coordinated with the CISN. The southern and northern California earthquake data centers (SCEDC and NCEDC) have new capabilities that enable seismologists to obtain large volumes of data with only modest effort.
Modeling the Fluid Withdraw and Injection Induced Earthquakes
NASA Astrophysics Data System (ADS)
Meng, C.
2016-12-01
We present an open source numerical code, Defmod, that allows one to model induced seismicity in an efficient and standalone manner. Fluid-withdrawal- and injection-induced earthquakes have been a great concern to industries including oil/gas production, wastewater disposal, and CO2 sequestration, and the ability to model induced seismicity numerically has long been desired. To do so, one has to consider at least two processes: a steady process that describes the inducing and aseismic stages before and between the seismic events, and an abrupt process that describes the dynamic fault rupture accompanied by seismic energy radiation during the events. The steady process can be adequately modeled by a quasi-static model, while the abrupt process has to be modeled by a dynamic model. In most published modeling works, only one of these processes is considered: geomechanicists and reservoir engineers focus more on quasi-static modeling, whereas geophysicists and seismologists focus more on dynamic modeling. The finite element code Defmod combines these two models into a hybrid model that uses a failure criterion and frictional laws to adaptively switch between the (quasi-)static and dynamic states. The code is capable of modeling episodic fault rupture driven by quasi-static loading, e.g. due to reservoir fluid withdrawal and/or injection, and by dynamic loading, e.g. due to preceding earthquakes. We demonstrate a case study of the 2013 Azle earthquake.
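The adaptive switching described above can be illustrated with a Mohr-Coulomb check. This is a conceptual sketch, not Defmod's actual interface; the stress values and friction coefficient are hypothetical.

```python
def coulomb_failure(shear, normal_eff, friction, cohesion=0.0):
    """Mohr-Coulomb criterion on a fault plane: failure when the driving
    shear stress reaches the frictional resistance on the effective normal
    stress (compression positive, pore pressure already subtracted)."""
    return abs(shear) >= cohesion + friction * normal_eff

def solver_mode(shear, normal, pore_pressure, friction, cohesion=0.0):
    """Hybrid-scheme switch: stay quasi-static while the fault holds, go
    dynamic once the failure criterion is met. Injection raises pore
    pressure and so lowers effective normal stress toward failure;
    withdrawal alters the total stresses acting on the fault."""
    if coulomb_failure(shear, normal - pore_pressure, friction, cohesion):
        return "dynamic"
    return "quasi-static"
```

In a hybrid code the check would run each quasi-static step on every fault element; once any element fails, the solver drops to a dynamic time step with inertia and a frictional (e.g. slip-weakening) law until rupture arrests, then returns to the quasi-static state.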
Celebi, M.
2006-01-01
This paper introduces the state-of-the-art, real-time, broad-band seismic monitoring network implemented for the 1206 m [3956 ft] long, cable-stayed Bill Emerson Memorial Bridge in Cape Girardeau (MO), a new Mississippi River crossing approximately 80 km from the epicentral region of the 1811-1812 New Madrid earthquakes. The bridge was designed for a strong earthquake (magnitude 7.5 or greater) during its design life. The monitoring network comprises a total of 84 channels of accelerometers deployed on the superstructure, pier foundations, and at surface and downhole free-field arrays of the bridge. The paper also presents the high-quality response data obtained from the network. These data are intended to be used by the owner, researchers, and engineers to assess the performance of the bridge, to check design parameters, including comparison of dynamic characteristics with actual response, and to better design future similar bridges. Preliminary analyses of ambient and low-amplitude small-earthquake data reveal specific response characteristics of the bridge and the free field. There is evidence of coherent tower, cable, and deck interaction that sometimes results in amplified ambient motions. Motions at the lowest tri-axial downhole accelerometers on both the MO and IL sides are practically free from any feedback from the bridge. Motions at the mid-level and surface downhole accelerometers are influenced significantly by feedback due to amplified ambient motions of the bridge. Copyright ASCE 2006.