Important Earthquake Engineering Resources
Pacific Earthquake Engineering Research Center (PEER). Important Earthquake Engineering Resources include: American Concrete Institute; …Motion Observation Systems (COSMOS); Consortium of Universities for Research in Earthquake Engineering.
10 CFR Appendix S to Part 50 - Earthquake Engineering Criteria for Nuclear Power Plants
Code of Federal Regulations, 2012 CFR
2012-01-01
Title 10 (Energy), Pt. 50, App. S — Appendix S to Part 50, Earthquake Engineering Criteria for Nuclear Power Plants; … applicant or holder whose construction permit was issued before January 10, 1997, the earthquake engineering …
(Identical entries appear in the 2011, 2013, and 2014 editions of the Code of Federal Regulations.)
MCEER, from Earthquake Engineering to Extreme Events | Home Page
Real-time earthquake monitoring using a search engine method.
Zhang, Jie; Zhang, Haijiang; Chen, Enhong; Zheng, Yi; Kuang, Wenhuan; Zhang, Xiong
2014-12-04
When an earthquake occurs, seismologists want to use recorded seismograms to infer its location, magnitude and source-focal mechanism as quickly as possible. If such information could be determined immediately, timely evacuations and emergency actions could be undertaken to mitigate earthquake damage. Current advanced methods can report the initial location and magnitude of an earthquake within a few seconds, but estimating the source-focal mechanism may require minutes to hours. Here we present an earthquake search engine, similar to a web search engine, that we developed by applying a computer fast search method to a large seismogram database to find waveforms that best fit the input data. Our method is several thousand times faster than an exact search. For an Mw 5.9 earthquake on 8 March 2012 in Xinjiang, China, the search engine can infer the earthquake's parameters in <1 s after receiving the long-period surface wave data.
PMID: 25472861
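The fast-search idea described in the abstract above can be sketched as a nearest-neighbour lookup over pre-indexed waveform fingerprints. This is purely illustrative: the index structure (a KD-tree), the fingerprint length, and the data are invented here and are not the authors' implementation.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)

# Toy "seismogram database": each row is a downsampled waveform fingerprint.
n_db, n_samp = 10_000, 64
database = rng.standard_normal((n_db, n_samp))

# Index once; subsequent queries avoid an exhaustive scan of all rows.
tree = cKDTree(database)

# Observed waveform = database entry 1234 plus a little noise.
observed = database[1234] + 0.01 * rng.standard_normal(n_samp)

dist, idx = tree.query(observed)
print(idx)  # index of the best-matching database record
```

The same retrieve-then-report pattern underlies the reported sub-second parameter estimates: the expensive work (building the database) happens offline, so the online step is only a lookup.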
Transportation Systems Modeling and Applications in Earthquake Engineering
2010-07-01
… Figure 6: PGA map of an M7.7 earthquake on all three New Madrid fault segments (g) … The NMSZ was responsible for the devastating 1811-1812 New Madrid earthquakes, the largest earthquakes ever recorded in the … Table 1: Fragility parameters for MSC steel bridge (Padgett 2007) …
NASA Astrophysics Data System (ADS)
Baytiyeh, Hoda; Naja, Mohamad K.
2014-09-01
Due to the high market demand for professional engineers in the Arab oil-producing countries, the appetite of Middle Eastern students for high-paying jobs and challenging careers in engineering has sharply increased. As a result, engineering programmes are providing opportunities for more students to enrol on engineering courses through lenient admission policies that do not compromise academic standards. This strategy has generated an influx of students who must be carefully educated to enhance their professional knowledge and social capital to assist in future earthquake-disaster risk-reduction efforts. However, the majority of Middle Eastern engineering students are unaware of the value of their acquired engineering skills and knowledge in building the resilience of their communities to earthquake disasters. As the majority of the countries in the Middle East are exposed to seismic hazards and are vulnerable to destructive earthquakes, engineers have become indispensable assets and the first line of defence against earthquake threats. This article highlights the contributions of some engineering innovations in advancing technologies and techniques for effective disaster mitigation, and it calls for the incorporation of earthquake-disaster-mitigation education into academic engineering programmes in the Eastern Mediterranean region.
Research in seismology and earthquake engineering in Venezuela
Urbina, L.; Grases, J.
1983-01-01
After the July 29, 1967, damaging earthquake (with a moderate magnitude of 6.3) caused widespread damage to the northern coastal area of Venezuela and to the Caracas Valley, the Venezuelan Government decided to establish a Presidential Earthquake Commission. This commission undertook the task of coordinating the efforts to study the after-effects of the earthquake. The July 1967 earthquake claimed numerous lives and caused extensive damage to the capital of Venezuela. In 1968, the U.S. Geological Survey conducted a seismological field study in the northern coastal area and in the Caracas Valley of Venezuela. The objective was to study the areas that sustained severe, moderate, and no damage to structures. A report entitled Ground Amplification Studies in Earthquake Damage Areas: The Caracas Earthquake of 1967 documented, for the first time, short-period seismic wave ground-motion amplifications in the Caracas Valley. Figure 1 shows the area of severe damage in the Los Palos Grandes suburb and the correlation with depth of alluvium; the Arabic numerals denote the ground amplification factor at each site in the area. The Venezuelan Government initiated many programs to study the damage sustained in detail and to investigate ongoing construction practices. These actions motivated professionals in the academic, private, and Government sectors to develop further capabilities and self-sufficiency in the fields of engineering and seismology. Funds were allocated to assist in training professionals and technicians and in developing new seismological stations and new national-level programs in earthquake engineering and seismology. A brief description of the ongoing programs in Venezuela is given below. These programs are being performed by FUNVISIS and by other national organizations listed at the end of this article.
Introduction: seismology and earthquake engineering in Mexico and Central and South America.
Espinosa, A.F.
1982-01-01
The results from seismological studies that are used by the engineering community are just one of the benefits obtained from research aimed at mitigating the earthquake hazard. In this issue of the Earthquake Information Bulletin, current programs in seismology and earthquake engineering, seismic networks, future plans and some of the cooperative programs with different international organizations are described by Latin American seismologists. The article describes the development of seismology in Latin America and the seismological interest of the OAS. -P.N. Chroston
Compendium of Abstracts on Statistical Applications in Geotechnical Engineering.
1983-09-01
research in the application of probabilistic and statistical methods to soil mechanics, rock mechanics, and engineering geology problems has grown markedly… probability, statistics, soil mechanics, rock mechanics, and engineering geology. 2. The purpose of this report is to make available to the U.S… Deformation; Dynamic Response Analysis; Seepage, Soil Permeability and Piping; Earthquake Engineering, Seismology, Settlement and Heave; Seismic Risk Analysis
NASA Astrophysics Data System (ADS)
Wu, Stephen
Earthquake early warning (EEW) systems have been rapidly developing over the past decade. The Japan Meteorological Agency (JMA) has an EEW system that was operating during the 2011 M9 Tohoku earthquake in Japan, and this increased the awareness of EEW systems around the world. While longer-term earthquake prediction still faces many challenges to becoming practical, the availability of shorter-term EEW opens a new door for earthquake loss mitigation. After an earthquake fault begins rupturing, an EEW system utilizes the first few seconds of recorded seismic waveform data to quickly predict the hypocenter location, magnitude, origin time and the expected shaking intensity level around the region. This early warning information is broadcast to different sites before the strong shaking arrives. The warning lead time of such a system is short, typically a few seconds to a minute or so, and the information is uncertain. These factors limit human intervention to activate mitigation actions, and this must be addressed for engineering applications of EEW. This study applies a Bayesian probabilistic approach along with machine learning techniques and decision theories from economics to improve different aspects of EEW operation, including extending it to engineering applications. Existing EEW systems are often based on a deterministic approach. Often, they assume that only a single event occurs within a short period of time, an assumption that led to many false alarms after the Tohoku earthquake in Japan. This study develops a probability-based EEW algorithm based on an existing deterministic model to extend the EEW system to the case of concurrent events, which are often observed during the aftershock sequence after a large earthquake. To overcome the challenge of uncertain information and the short lead time of EEW, this study also develops an earthquake probability-based automated decision-making (ePAD) framework to make robust decisions for EEW mitigation applications. A cost-benefit model that
NASA Astrophysics Data System (ADS)
Karimzadeh, Shaghayegh; Askan, Aysegul; Yakut, Ahmet
2017-09-01
Simulated ground motions can be used in structural and earthquake engineering practice as an alternative to or to augment the real ground motion data sets. Common engineering applications of simulated motions are linear and nonlinear time history analyses of building structures, where full acceleration records are necessary. Before using simulated ground motions in such applications, it is important to assess those in terms of their frequency and amplitude content as well as their match with the corresponding real records. In this study, a framework is outlined for assessment of simulated ground motions in terms of their use in structural engineering. Misfit criteria are determined for both ground motion parameters and structural response by comparing the simulated values against the corresponding real values. For this purpose, as a case study, the 12 November 1999 Duzce earthquake is simulated using stochastic finite-fault methodology. Simulated records are employed for time history analyses of frame models of typical residential buildings. Next, the relationships between ground motion misfits and structural response misfits are studied. Results show that the seismological misfits around the fundamental period of selected buildings determine the accuracy of the simulated responses in terms of their agreement with the observed responses.
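A minimal sketch of the kind of misfit criterion described above, comparing one scalar ground-motion parameter (here PGA) of a simulated record against a recorded one. Both traces and the relative-misfit measure are invented for illustration; the study's actual misfit definitions may differ.

```python
import numpy as np

def relative_misfit(simulated, observed):
    """Relative misfit of a scalar ground-motion parameter (illustrative measure)."""
    return abs(simulated - observed) / abs(observed)

dt = 0.01
t = np.arange(0, 10, dt)
# Hypothetical accelerograms: a "recorded" trace and an imperfect "simulation"
# (decaying sinusoids standing in for real and stochastic finite-fault records).
recorded = np.exp(-0.3 * t) * np.sin(2 * np.pi * 1.5 * t)
simulated = 0.9 * np.exp(-0.35 * t) * np.sin(2 * np.pi * 1.4 * t)

pga_rec = np.max(np.abs(recorded))
pga_sim = np.max(np.abs(simulated))
misfit = relative_misfit(pga_sim, pga_rec)
print(misfit)
```

The same pattern extends to structural-response misfits: run both records through a time-history analysis and compare peak drifts instead of PGA.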
Welcome to Pacific Earthquake Engineering Research Center - PEER
PEER Research Project Highlights: "… Triggering and Effects at Silty Soil Sites"; "Dissipative Base …". Upcoming events: Geotechnical Earthquake Engineering and Soil Dynamics V, June 10-13, 2018 - Call …
A smartphone application for earthquakes that matter!
NASA Astrophysics Data System (ADS)
Bossu, Rémy; Etivant, Caroline; Roussel, Fréderic; Mazet-Roux, Gilles; Steed, Robert
2014-05-01
Smartphone applications have swiftly become one of the most popular tools for rapid reception of earthquake information by the public, some having been downloaded more than 1 million times! The advantages are obvious: wherever users are, they can be automatically informed when an earthquake has struck. Just by setting a magnitude threshold and an area of interest, there is no longer any need to browse the internet, as the information arrives automatically and instantaneously! One question remains: are the provided earthquake notifications always relevant for the public? Which earthquakes really matter to laypeople? One clue may be derived from newspaper reports showing that, a while after damaging earthquakes, many eyewitnesses delete the application they installed just after the mainshock. Why? Because either the magnitude threshold is set too high and many felt earthquakes are missed, or it is set too low and the majority of the notifications relate to unfelt earthquakes, thereby only increasing anxiety among the population at each new update. Felt and damaging earthquakes are the ones that matter most to the public (and authorities). They are the ones of societal importance, even when of small magnitude. A smartphone application developed by EMSC (Euro-Med Seismological Centre) with the financial support of the Fondation MAIF aims to provide suitable notifications for earthquakes by collating different information threads covering tsunamigenic, potentially damaging and felt earthquakes. Tsunamigenic earthquakes are considered here to be those that are the subject of alert or information messages from the PTWC (Pacific Tsunami Warning Centre). Potentially damaging earthquakes are identified through an automated system called EQIA (Earthquake Qualitative Impact Assessment), developed and operated at EMSC, which rapidly assesses earthquake impact by comparing the population exposed to each expected
U.S. Geological Survey (USGS) Earthquake Web Applications
NASA Astrophysics Data System (ADS)
Fee, J.; Martinez, E.
2015-12-01
USGS Earthquake web applications provide access to earthquake information from USGS and other Advanced National Seismic System (ANSS) contributors. One of the primary goals of these applications is to provide a consistent experience for accessing both near-real-time information as soon as it is available and historic information after it is thoroughly reviewed. Millions of people use these applications every month, including people who feel an earthquake, emergency responders looking for the latest information about a recent event, and scientists researching historic earthquakes and their effects. Information from multiple catalogs and contributors is combined by the ANSS Comprehensive Catalog into one composite catalog, identifying the most preferred information from any source for each event. A web service and near-real-time feeds provide access to all contributed data, and are used by a number of users and software packages. The Latest Earthquakes application displays summaries of many events, from either near-real-time feeds or custom searches, and the Event Page application shows detailed information for each event. Because all data is accessed through the web service, it can also be downloaded by users. The applications are maintained as open-source projects on GitHub, and use mobile-first and responsive-web-design approaches to work well on both mobile devices and desktop computers. http://earthquake.usgs.gov/earthquakes/map/
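Since the web service and feeds mentioned above serve GeoJSON, consuming them can be sketched as below. The feature shown is a hand-made stand-in with only a subset of the feed's fields (magnitude, place, time, coordinates), not live USGS data.

```python
import json

# Minimal feature in the style of a GeoJSON earthquake feed;
# the values here are made up for illustration.
feed = json.loads("""
{
  "features": [
    {"properties": {"mag": 4.2, "place": "10km N of Sometown", "time": 1440000000000},
     "geometry": {"coordinates": [-122.5, 37.9, 8.0]}}
  ]
}
""")

for quake in feed["features"]:
    props = quake["properties"]
    # GeoJSON order is [longitude, latitude, depth].
    lon, lat, depth_km = quake["geometry"]["coordinates"]
    print(f"M{props['mag']} {props['place']} ({lat}, {lon}), depth {depth_km} km")
```

A real client would fetch the feed over HTTP and filter by magnitude or region before display.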
GeoMO 2008--geotechnical earthquake engineering : site response.
DOT National Transportation Integrated Search
2008-10-01
The theme of GeoMO2008 has recently become of more interest to the Midwest civil engineering community due to the perceived earthquake risks and new code requirements. The constant seismic reminder for the New Madrid Seismic Zone and new USGS hazard ...
De La Flor, Grace; Ojaghi, Mobin; Martínez, Ignacio Lamata; Jirotka, Marina; Williams, Martin S; Blakeborough, Anthony
2010-09-13
When transitioning local laboratory practices into distributed environments, the interdependent relationship between experimental procedure and the technologies used to execute experiments becomes highly visible and a focal point for system requirements. We present an analysis of ways in which this reciprocal relationship is reconfiguring laboratory practices in earthquake engineering as a new computing infrastructure is embedded within three laboratories in order to facilitate the execution of shared experiments across geographically distributed sites. The system has been developed as part of the UK Network for Earthquake Engineering Simulation e-Research project, which links together three earthquake engineering laboratories at the universities of Bristol, Cambridge and Oxford. We consider the ways in which researchers have successfully adapted their local laboratory practices through the modification of experimental procedure so that they may meet the challenges of coordinating distributed earthquake experiments.
The Lice, Turkey, earthquake of September 6, 1975; a preliminary engineering investigation
Yanev, P. I.
1976-01-01
The Fifth European Conference on Earthquake Engineering was held on September 22 through 25 in Istanbul, Turkey. The opening speech by the Honorable H. E. Nurettin Ok, Minister of Reconstruction and Resettlement of Turkey, introduced the several hundred delegates to the realities of earthquake hazards in Turkey:
NGA West 2 | Pacific Earthquake Engineering Research Center
A multi-year research program to improve Next Generation Attenuation models for active tectonic regions in earthquake engineering, including modeling of directivity and directionality; verification of NGA-West model epistemic uncertainty; and evaluation of soil amplification factors in NGA models versus NEHRP site factors.
EU H2020 SERA: Seismology and Earthquake Engineering Research Infrastructure Alliance for Europe
NASA Astrophysics Data System (ADS)
Giardini, Domenico; Saleh, Kauzar; SERA Consortium, the
2017-04-01
SERA - Seismology and Earthquake Engineering Research Infrastructure Alliance for Europe - is a new infrastructure project awarded in the last Horizon 2020 call for Integrating Activities for Advanced Communities (INFRAIA-01-2016-2017). Building on precursor projects such as NERA, SHARE, NERIES and SERIES, SERA is expected to contribute significantly to the access of data, services and research infrastructures, and to develop innovative solutions in seismology and earthquake engineering, with the overall objective of reducing exposure to risks associated with natural and anthropogenic earthquakes. For instance, SERA will revise the European Seismic Hazard reference model for input into the current revision of Eurocode 8 on Seismic Design of Buildings; we also plan to develop the first comprehensive framework for seismic risk modeling at the European scale, and new standards for future experimental observations and instruments for earthquake engineering and seismology. To that aim, SERA engages 31 institutions across Europe with leading expertise in the operation of research facilities, monitoring infrastructures, data repositories and experimental facilities in the fields of seismology, anthropogenic hazards and earthquake engineering. SERA comprises 26 activities, including 5 Networking Activities (NA) to improve the availability and access of data through enhanced community coordination and pooling of resources, 6 Joint Research Activities (JRA) aimed at creating new European standards for the optimal use of the data collected by the European infrastructures, Virtual Access (VA) to the 5 main European services for seismology and engineering seismology, and Trans-national Access (TA) to 10 high-class experimental facilities for earthquake engineering and seismology in Europe. In fact, around 50% of the SERA resources will be dedicated to virtual and transnational access. SERA and EPOS (European Platform Observing System, a European Research
Engineering applications of strong ground motion simulation
NASA Astrophysics Data System (ADS)
Somerville, Paul
1993-02-01
The formulation, validation and application of a procedure for simulating strong ground motions for use in engineering practice are described. The procedure uses empirical source functions (derived from near-source strong motion recordings of small earthquakes) to provide a realistic representation of effects such as source radiation that are difficult to model at high frequencies due to their partly stochastic behavior. Wave propagation effects are modeled using simplified Green's functions that are designed to transfer empirical source functions from their recording sites to those required for use in simulations at a specific site. The procedure has been validated against strong motion recordings of both crustal and subduction earthquakes. For the validation process we choose earthquakes whose source models (including a spatially heterogeneous distribution of slip on the fault) are independently known and which have abundant strong motion recordings. A quantitative measurement of the fit between the simulated and recorded motion in this validation process is used to estimate the modeling and random uncertainty associated with the simulation procedure. This modeling and random uncertainty is one part of the overall uncertainty in estimates of ground motions of future earthquakes at a specific site derived using the simulation procedure. The other contribution to uncertainty is that due to uncertainty in the source parameters of future earthquakes that affect the site, which is estimated from a suite of simulations generated by varying the source parameters over their ranges of uncertainty. In this paper, we describe the validation of the simulation procedure for crustal earthquakes against strong motion recordings of the 1989 Loma Prieta, California, earthquake, and for subduction earthquakes against the 1985 Michoacán, Mexico, and Valparaiso, Chile, earthquakes. We then show examples of the application of the simulation procedure to the estimation of the
NASA Astrophysics Data System (ADS)
Li, Zongchao; Chen, Xueliang; Gao, Mengtan; Jiang, Han; Li, Tiefei
2017-03-01
Earthquake engineering parameters are very important in the engineering field, especially in anti-seismic design and earthquake disaster prevention. In this study, we focus on simulating earthquake engineering parameters by the empirical Green's function method. The simulated earthquake (MJMA 6.5) occurred in Kyushu, Japan, in 1997. Horizontal ground motion is separated into fault-parallel and fault-normal components in order to assess the characteristics of these two directions. The broadband frequency range of the ground motion simulation is 0.1 to 20 Hz. By comparing observed and synthetic parameters, we analyzed the distribution characteristics of the earthquake engineering parameters. The simulated waveforms show high similarity with the observed waveforms. We found the following. (1) Near-field PGA attenuates radially with strip-like radiation patterns in the fault-parallel component, while the radiation pattern of the fault-normal component is circular; PGV shows good agreement between observed and synthetic records, but has different distribution characteristics in different components. (2) Rupture direction and terrain have a large influence on the 90% significant duration. (3) Arias intensity attenuates with increasing epicentral distance; observed values show high similarity with synthetic values. (4) The predominant period is very different in part of Kyushu in the fault-normal component; it is affected greatly by site conditions. (5) Most parameters have good reference value where the hypocentral distance is less than 35 km. (6) The GOF values of all these parameters are generally higher than 45, which indicates a good result according to Olsen's classification criterion, although not all parameters fit well. Given these synthetic ground motion parameters, seismic hazard analysis and earthquake disaster analysis can be conducted in future urban planning.
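Two of the engineering parameters discussed above have compact standard definitions that can be sketched directly: Arias intensity, I_a = (pi / 2g) * integral of a(t)^2 dt, and the significant duration, the time between fixed fractions (here 5% and 95%, rather than the 90% windowing used in the study) of the cumulative squared acceleration. The synthetic accelerogram below is invented for illustration.

```python
import numpy as np

def arias_intensity(acc, dt, g=9.81):
    """Arias intensity I_a = (pi / (2 g)) * sum(a^2) * dt  (rectangle rule)."""
    return np.pi / (2.0 * g) * np.sum(acc**2) * dt

def significant_duration(acc, dt, lo=0.05, hi=0.95):
    """Time between lo and hi fractions of the cumulative squared acceleration."""
    cum = np.cumsum(acc**2)
    cum = cum / cum[-1]                      # normalized Husid curve
    t_lo = np.searchsorted(cum, lo) * dt
    t_hi = np.searchsorted(cum, hi) * dt
    return t_hi - t_lo

dt = 0.005
t = np.arange(0, 20, dt)
# Synthetic accelerogram (m/s^2): white noise under an exponential decay envelope.
rng = np.random.default_rng(1)
acc = np.exp(-0.2 * t) * rng.standard_normal(t.size)

ia = arias_intensity(acc, dt)
d595 = significant_duration(acc, dt)
print(ia, d595)
```

Computing such parameters for both observed and synthetic records, as in the study, gives directly comparable scalar measures of shaking intensity and duration.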
PEER - National Information Service for Earthquake Engineering - NISEE
The NISEE/PEER Library is an affiliated library located at the Field Station, five miles from the main Berkeley campus. Hours: Monday - Friday, 9:00 am - 1:00 pm; open to the public. NISEE/PEER Library home page.
ERIC Educational Resources Information Center
English, Lyn D.; King, Donna; Smeed, Joanna
2017-01-01
As part of a 3-year longitudinal study, 136 sixth-grade students completed an engineering-based problem on earthquakes involving integrated STEM learning. Students employed engineering design processes and STEM disciplinary knowledge to plan, sketch, then construct a building designed to withstand earthquake damage, taking into account a number of…
NASA Astrophysics Data System (ADS)
Pioldi, Fabio; Rizzi, Egidio
2017-07-01
Output-only structural identification is developed by a refined Frequency Domain Decomposition (rFDD) approach, towards assessing the current modal properties of heavily damped buildings (a challenging identification task) under strong ground motions. Structural responses from earthquake excitations are taken as input signals for the identification algorithm. A new dedicated computational procedure, based on coupled Chebyshev Type II bandpass filters, is outlined for the effective estimation of natural frequencies, mode shapes and modal damping ratios. The identification technique is also coupled with a Gabor Wavelet Transform, resulting in an effective and self-contained time-frequency analysis framework. Simulated response signals generated by shear-type frames (with variable structural features) are used as a necessary validation condition. In this context, use is made of a complete set of seismic records taken from the FEMA P695 database, i.e. all 44 "Far-Field" (22 NS, 22 WE) earthquake signals. The modal estimates are statistically compared to their target values, proving the accuracy of the developed algorithm in providing prompt and accurate estimates of all current strong ground motion modal parameters. At this stage, the analysis tool may be employed in the realm of Earthquake Engineering, towards potential Structural Health Monitoring and damage detection purposes.
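The core FDD step (a singular-value decomposition of the output cross-spectral matrix, frequency by frequency, with modal frequencies appearing as peaks of the first singular value) can be sketched on toy two-channel data. This omits the refined bandpass filtering and wavelet stages of the rFDD approach described above; the signals and frequencies are invented.

```python
import numpy as np

fs, dur = 50.0, 40.0
t = np.arange(0, dur, 1 / fs)
rng = np.random.default_rng(2)

# Two-channel response with modes at 1.0 Hz and 3.0 Hz (toy data).
x = np.vstack([
    np.sin(2 * np.pi * 1.0 * t) + 0.5 * np.sin(2 * np.pi * 3.0 * t),
    np.sin(2 * np.pi * 1.0 * t) - 0.5 * np.sin(2 * np.pi * 3.0 * t),
]) + 0.05 * rng.standard_normal((2, t.size))

X = np.fft.rfft(x, axis=1)               # per-channel spectra
freqs = np.fft.rfftfreq(t.size, 1 / fs)

# First singular value of the 2x2 cross-spectral matrix at each frequency.
s1 = np.empty(freqs.size)
for k in range(freqs.size):
    G = np.outer(X[:, k], np.conj(X[:, k]))  # rank-one spectral matrix estimate
    s1[k] = np.linalg.svd(G, compute_uv=False)[0]

f_peak = freqs[np.argmax(s1)]
print(f_peak)  # dominant modal frequency of the toy system
```

In a full FDD implementation the spectral matrix would be averaged (e.g. Welch's method) and the singular vectors at each peak would give the mode shapes.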
Application of τc*Pd in earthquake early warning
NASA Astrophysics Data System (ADS)
Huang, Po-Lun; Lin, Ting-Li; Wu, Yih-Min
2015-03-01
Rapid assessment of the damage potential and size of an earthquake at the recording station is in high demand for onsite earthquake early warning. We study the application of τc*Pd for estimating earthquake size using 123 events recorded by the borehole stations of KiK-net in Japan. The measure of earthquake size determined by τc*Pd correlates more closely with damage potential. We find that τc*Pd provides another parameter to measure the size of an earthquake and a threshold for warning of strong ground motion.
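Using the common definitions τc = 2π/√r with r = ∫v² dt / ∫u² dt over the first seconds after the P arrival, and Pd the peak displacement in that window, the parameter pair can be sketched on a toy trace. The waveform, amplitude and window length below are invented; this is a sketch of the standard quantities, not the authors' processing chain.

```python
import numpy as np

def tau_c(disp, dt):
    """tau_c = 2*pi / sqrt(r), with r = integral(v^2) / integral(u^2)."""
    vel = np.gradient(disp, dt)              # numerical velocity
    r = np.sum(vel**2) / np.sum(disp**2)
    return 2.0 * np.pi / np.sqrt(r)

dt = 0.01
t = np.arange(0, 3, dt)                      # first 3 s after the P arrival
disp = 0.02 * np.sin(np.pi * t)              # toy displacement (m), period 2 s

tc = tau_c(disp, dt)                         # recovers the dominant period (~2 s)
Pd = np.max(np.abs(disp))                    # peak displacement in the window
print(tc, Pd, tc * Pd)
```

For a pure sinusoid, τc reduces to the signal period, which is why it serves as a proxy for earthquake size: larger events radiate longer-period initial P waves.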
Principles for selecting earthquake motions in engineering design of large dams
Krinitzsky, E.L.; Marcuson, William F.
1983-01-01
This report gives a synopsis of the various tools and techniques used in selecting earthquake ground motion parameters for large dams. It presents 18 charts giving newly developed relations for acceleration, velocity, and duration versus site earthquake intensity for near- and far-field hard and soft sites and earthquakes having magnitudes above and below 7. The material for this report is based on procedures developed at the Waterways Experiment Station. Although these procedures are suggested primarily for large dams, they may also be applicable to other facilities. Because no standard procedure exists for selecting earthquake motions in engineering design of large dams, a number of precautions are presented to guide users. The selection of earthquake motions depends on which of two types of engineering analyses is performed. A pseudostatic analysis uses a coefficient usually obtained from an appropriate contour map, whereas a dynamic analysis uses either accelerograms assigned to a site or specified response spectra. Each type of analysis requires significantly different input motions. All selections of design motions must allow for the lack of representative strong motion records, especially near-field motions from earthquakes of magnitude 7 and greater, as well as an enormous spread in the available data. Limited data must be projected and its spread bracketed in order to fill in the gaps and to assure that there will be no surprises. Because each site may have differing special characteristics in its geology, seismic history, attenuation, recurrence, interpreted maximum events, etc., an integrated approach gives the best results. Each part of the site investigation requires a number of decisions. In some cases, a 'least work' approach may be suitable, simply assuming the worst of several possibilities and testing for it. Because there are no standard procedures to follow, multiple approaches are useful. For example, peak motions at
Introduction: seismology and earthquake engineering in Central and South America.
Espinosa, A.F.
1983-01-01
Reports the state-of-the-art in seismology and earthquake engineering that is being advanced in Central and South America. Provides basic information on seismological station locations in Latin America and some of the programmes in strong-motion seismology, as well as some of the organizations involved in these activities.-from Author
Neo-deterministic definition of earthquake hazard scenarios: a multiscale application to India
NASA Astrophysics Data System (ADS)
Peresan, Antonella; Magrin, Andrea; Parvez, Imtiyaz A.; Rastogi, Bal K.; Vaccari, Franco; Cozzini, Stefano; Bisignano, Davide; Romanelli, Fabio; Panza, Giuliano F.; Ashish, Mr; Mir, Ramees R.
2014-05-01
performed to understand the influence of the model characteristics on the computed ground shaking scenarios. For massive parametric tests, or for the repeated generation of large-scale hazard maps, the methodology can take advantage of more advanced computational platforms, ranging from GRID computing infrastructures to dedicated HPC clusters up to Cloud computing. In this way, scientists can deal efficiently with the variety and complexity of the potential earthquake sources, and perform parametric studies to characterize the related uncertainties. NDSHA provides realistic time series of expected ground motion readily applicable for seismic engineering analysis and other mitigation actions. The methodology has been successfully applied to strategic buildings, lifelines and cultural heritage sites, and for the purpose of seismic microzoning in several urban areas worldwide. A web application is currently being developed that facilitates access to the NDSHA methodology and the related outputs by end-users who are interested in reliable territorial planning and in the design and construction of buildings and infrastructures in seismic areas. At the same time, the web application is also shaping up as an advanced educational tool to explore interactively how seismic waves are generated at the source, propagate inside structural models, and build up ground shaking scenarios. We illustrate the preliminary results obtained from a multiscale application of the NDSHA approach to the territory of India, zooming from large-scale hazard maps of ground shaking at bedrock to the definition of local-scale earthquake scenarios for selected sites in the Gujarat state (NW India). The study aims to provide the community (e.g. authorities and engineers) with advanced information for earthquake risk mitigation, which is particularly relevant to Gujarat in view of the rapid development and urbanization of the region.
33 CFR 222.4 - Reporting earthquake effects.
Code of Federal Regulations, 2010 CFR
2010-07-01
..., DEPARTMENT OF DEFENSE ENGINEERING AND DESIGN § 222.4 Reporting earthquake effects. (a) Purpose. This... structural integrity and operational adequacy of major Civil Works structures following the occurrence of...) Applicability. This regulation is applicable to all field operating agencies having Civil Works responsibilities...
The Electronic Encyclopedia of Earthquakes
NASA Astrophysics Data System (ADS)
Benthien, M.; Marquis, J.; Jordan, T.
2003-12-01
The Electronic Encyclopedia of Earthquakes is a collaborative project of the Southern California Earthquake Center (SCEC), the Consortia of Universities for Research in Earthquake Engineering (CUREE) and the Incorporated Research Institutions for Seismology (IRIS). This digital library organizes earthquake information online as a partner with the NSF-funded National Science, Technology, Engineering and Mathematics (STEM) Digital Library (NSDL) and the Digital Library for Earth System Education (DLESE). When complete, information and resources for over 500 Earth science and engineering topics will be included, with connections to curricular materials useful for teaching Earth Science, engineering, physics and mathematics. Although conceived primarily as an educational resource, the Encyclopedia is also a valuable portal to anyone seeking up-to-date earthquake information and authoritative technical sources. "E3" is a unique collaboration among earthquake scientists and engineers to articulate and document a common knowledge base with a shared terminology and conceptual framework. It is a platform for cross-training scientists and engineers in these complementary fields and will provide a basis for sustained communication and resource-building between major education and outreach activities. For example, the E3 collaborating organizations have leadership roles in the two largest earthquake engineering and earth science projects ever sponsored by NSF: the George E. Brown Network for Earthquake Engineering Simulation (CUREE) and the EarthScope Project (IRIS and SCEC). The E3 vocabulary and definitions are also being connected to a formal ontology under development by the SCEC/ITR project for knowledge management within the SCEC Collaboratory. The E3 development system is now fully operational, 165 entries are in the pipeline, and the development teams are capable of producing 20 new, fully reviewed encyclopedia entries each month. Over the next two years teams will
NASA Astrophysics Data System (ADS)
Filiatrault, Andre; Sullivan, Timothy
2014-08-01
With the development and implementation of performance-based earthquake engineering, harmonization of performance levels between structural and nonstructural components becomes vital. Even if the structural components of a building achieve a continuous or immediate occupancy performance level after a seismic event, failure of architectural, mechanical or electrical components can lower the performance level of the entire building system. This reduction in performance caused by the vulnerability of nonstructural components has been observed during recent earthquakes worldwide. Moreover, nonstructural damage has limited the functionality of critical facilities, such as hospitals, following major seismic events. The investment in nonstructural components and building contents is far greater than that of structural components and framing. Therefore, it is not surprising that in many past earthquakes, losses from damage to nonstructural components have exceeded losses from structural damage. Furthermore, the failure of nonstructural components can become a safety hazard or can hamper the safe movement of occupants evacuating buildings, or of rescue workers entering buildings. In comparison to structural components and systems, there is relatively limited information on the seismic design of nonstructural components. Basic research work in this area has been sparse, and the available codes and guidelines are, for the most part, based on past experience, engineering judgment and intuition, rather than on objective experimental and analytical results. Often, design engineers are forced to start almost from square one after each earthquake event: to observe what went wrong and to try to prevent repetitions. This is a consequence of the empirical nature of current seismic regulations and guidelines for nonstructural components. This review paper summarizes current knowledge on the seismic design and analysis of nonstructural building components, identifying major
NASA Technical Reports Server (NTRS)
Scholl, R. E. (Editor)
1979-01-01
Earthquake engineering research capabilities of the National Aeronautics and Space Administration (NASA) facilities at George C. Marshall Space Flight Center (MSFC), Alabama, were evaluated. The results indicate that the NASA/MSFC facilities and supporting capabilities offer unique opportunities for conducting earthquake engineering research. Specific features that are particularly attractive for large scale static and dynamic testing of natural and man-made structures include the following: large physical dimensions of buildings and test bays; high loading capacity; wide range and large number of test equipment and instrumentation devices; multichannel data acquisition and processing systems; technical expertise for conducting large-scale static and dynamic testing; sophisticated techniques for systems dynamics analysis, simulation, and control; and capability for managing large-size and technologically complex programs. Potential uses of the facilities for near and long term test programs to supplement current earthquake research activities are suggested.
Engineering models for catastrophe risk and their application to insurance
NASA Astrophysics Data System (ADS)
Dong, Weimin
2002-06-01
Internationally, earthquake insurance, like all other insurance (fire, auto), adopted an actuarial approach in the past; that is, insurance rates were determined from historical loss experience. Because an earthquake is a rare event with severe consequences, irrational determination of premium rates and a lack of understanding of the scale of potential loss left many insurance companies insolvent after the Northridge earthquake in 1994. Along with recent advances in earth science, computer science and engineering, computerized loss estimation methodologies based on first principles have been developed to the point that losses from destructive earthquakes can be quantified with reasonable accuracy using scientific modeling techniques. This paper introduces how engineering models can help quantify earthquake risk and how the insurance industry can use this information to manage its risk in the United States and abroad.
The HayWired earthquake scenario—Engineering implications
Detweiler, Shane T.; Wein, Anne M.
2018-04-18
The HayWired Earthquake Scenario—Engineering Implications is the second volume of U.S. Geological Survey (USGS) Scientific Investigations Report 2017–5013, which describes the HayWired scenario, developed by USGS and its partners. The scenario is a hypothetical yet scientifically realistic earthquake sequence that is being used to better understand hazards for the San Francisco Bay region during and after a magnitude-7 earthquake (mainshock) on the Hayward Fault and its aftershocks. Analyses in this volume suggest that (1) 800 deaths and 16,000 nonfatal injuries result from shaking alone, plus property and direct business interruption losses of more than $82 billion from shaking, liquefaction, and landslides; (2) the building code is designed to protect lives, but even if all buildings in the region complied with current building codes, 0.4 percent could collapse, 5 percent could be unsafe to occupy, and 19 percent could have restricted use; (3) people expect, prefer, and would be willing to pay for greater resilience of buildings; (4) more than 22,000 people could require extrication from stalled elevators, and more than 2,400 people could require rescue from collapsed buildings; (5) the average east-bay resident could lose water service for 6 weeks, some for as long as 6 months; (6) older steel-frame high-rise office buildings and new reinforced-concrete residential buildings in downtown San Francisco and Oakland could be unusable for as long as 10 months; (7) about 450 large fires could result in a loss of residential and commercial building floor area equivalent to more than 52,000 single-family homes and cause property (building and content) losses approaching $30 billion; and (8) combining earthquake early warning (ShakeAlert) with “drop, cover, and hold on” actions could prevent as many as 1,500 nonfatal injuries out of 18,000 total estimated nonfatal injuries from shaking and liquefaction hazards.
The next new Madrid earthquake
DOE Office of Scientific and Technical Information (OSTI.GOV)
Atkinson, W.
1988-01-01
Scientists who specialize in the study of Mississippi Valley earthquakes say that the region is overdue for a powerful tremor that will cause major damage and undoubtedly some casualties. The inevitability of a future quake and the lack of preparation by both individuals and communities provided the impetus for this book. It brings together applicable information from many disciplines: history, geology and seismology, engineering, zoology, politics and community planning, economics, environmental science, sociology, and psychology and mental health to provide a perspective of the myriad impacts of a major earthquake on the Mississippi Valley. The author addresses such basic questions as: What, actually, are earthquakes? How do they occur? Can they be predicted, perhaps even prevented? He also addresses those steps that individuals can take to improve their chances for survival both during and after an earthquake.
Rescaled earthquake recurrence time statistics: application to microrepeaters
NASA Astrophysics Data System (ADS)
Goltz, Christian; Turcotte, Donald L.; Abaimov, Sergey G.; Nadeau, Robert M.; Uchida, Naoki; Matsuzawa, Toru
2009-01-01
Slip on major faults primarily occurs during `characteristic' earthquakes. The recurrence statistics of characteristic earthquakes play an important role in seismic hazard assessment. A major problem in determining applicable statistics is the short sequences of characteristic earthquakes that are available worldwide. In this paper, we introduce a rescaling technique in which sequences can be superimposed to establish larger numbers of data points. We consider the Weibull and log-normal distributions; in both cases we rescale the data using means and standard deviations. We test our approach utilizing sequences of microrepeaters, micro-earthquakes that recur in the same location on a fault. It seems plausible to regard these earthquakes as a miniature version of the classic characteristic earthquakes. Microrepeaters are much more frequent than major earthquakes, leading to longer sequences for analysis. In this paper, we present results for the analysis of recurrence times for several microrepeater sequences from Parkfield, CA, as well as NE Japan. We find that, once the respective sequence can be considered to be of sufficient stationarity, the statistics can be well fitted by either a Weibull or a log-normal distribution. We clearly demonstrate this fact by our technique of rescaled combination. We conclude that the recurrence statistics of the microrepeater sequences we consider are similar to the recurrence statistics of characteristic earthquakes on major faults.
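The pooling idea described above (rescale each short sequence by its own mean and standard deviation, then superimpose the sequences) can be sketched on synthetic data. This is a minimal illustration, not the authors' code; the shift before fitting is our own workaround, since standardized values can be negative:

```python
import numpy as np
from scipy import stats

def rescale(seq):
    """Standardize one recurrence-time sequence by its own mean and std."""
    seq = np.asarray(seq, dtype=float)
    return (seq - seq.mean()) / seq.std()

# Three short synthetic "microrepeater" sequences of recurrence times (years),
# deliberately given different scales so pooling the raw data would not work.
rng = np.random.default_rng(0)
sequences = [rng.weibull(2.0, size=20) * scale for scale in (1.5, 3.0, 6.0)]

# Superimpose the rescaled sequences into one larger sample.
pooled = np.concatenate([rescale(s) for s in sequences])

# Fit a Weibull distribution to the pooled sample, shifted to positive
# support first (our workaround, not the authors' procedure).
shifted = pooled - pooled.min() + 1e-6
shape, loc, scale = stats.weibull_min.fit(shifted, floc=0)
print(f"pooled n = {pooled.size}, fitted Weibull shape = {shape:.2f}")
```

Because every rescaled sequence has zero mean and unit standard deviation, sequences with very different recurrence scales become directly comparable before fitting.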
Engineering uses of physics-based ground motion simulations
Baker, Jack W.; Luco, Nicolas; Abrahamson, Norman A.; Graves, Robert W.; Maechling, Phillip J.; Olsen, Kim B.
2014-01-01
This paper summarizes validation methodologies focused on enabling ground motion simulations to be used with confidence in engineering applications such as seismic hazard analysis and dynamic analysis of structural and geotechnical systems. Numerical simulation of ground motion from large earthquakes, utilizing physics-based models of earthquake rupture and wave propagation, is an area of active research in the earth science community. Refinement and validation of these models require collaboration between earthquake scientists and engineering users, as well as testing/rating methodologies that allow simulated ground motions to be used with confidence in engineering applications. This paper provides an introduction to this field and an overview of current research activities being coordinated by the Southern California Earthquake Center (SCEC). These activities are related both to advancing the science and computational infrastructure needed to produce ground motion simulations and to engineering validation procedures. Current research areas and anticipated future achievements are also discussed.
Designing an Earthquake-Proof Art Museum: An Arts- and Engineering-Integrated Science Lesson
ERIC Educational Resources Information Center
Carignan, Anastasia; Hussain, Mahjabeen
2016-01-01
In this practical arts-integrated science and engineering lesson, an inquiry-based approach was adopted to teach a class of fourth graders in a Midwest elementary school about the scientific concepts of plate tectonics and earthquakes. Lessons were prepared following the 5 E instructional model. Next Generation Science Standards (4-ESS3-2) and the…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hofmann, R.B.
1995-09-01
Analogs are used to understand complex or poorly understood phenomena for which little data may be available at the actual repository site. Earthquakes are complex phenomena, and they can have a large number of effects on the natural system, as well as on engineered structures. Instrumental data close to the source of large earthquakes are rarely obtained. The rare events for which measurements are available may be used, with modifications, as analogs for potential large earthquakes at sites where no earthquake data are available. In the following, several examples of nuclear reactor and liquefied natural gas facility siting are discussed. A potential use of analog earthquakes is proposed for a high-level nuclear waste (HLW) repository.
Real-Time Earthquake Monitoring with Spatio-Temporal Fields
NASA Astrophysics Data System (ADS)
Whittier, J. C.; Nittel, S.; Subasinghe, I.
2017-10-01
With live streaming sensors and sensor networks, increasingly large numbers of individual sensors are deployed in physical space. Sensor data streams are a fundamentally novel mechanism to deliver observations to information systems. They enable us to represent spatio-temporal continuous phenomena such as radiation accidents, toxic plumes, or earthquakes almost as instantaneously as they happen in the real world. Sensor data streams discretely sample an earthquake, while the earthquake is continuous over space and time. Programmers attempting to integrate many streams to analyze earthquake activity and scope need to write tedious application code to integrate potentially very large sets of asynchronously sampled, concurrent streams. In previous work, we proposed the field stream data model (Liang et al., 2016) for data stream engines. Abstracting the stream of an individual sensor as a temporal field, the field represents the Earth's movement at the sensor position as continuous. This simplifies analysis across many sensors significantly. In this paper, we undertake a feasibility study of using the field stream model and the open source Data Stream Engine (DSE) Apache Spark (Apache Spark, 2017) to implement real-time earthquake event detection with a subset of the 250 GPS sensor data streams of the Southern California Integrated GPS Network (SCIGN). The field-based real-time stream queries compute maximum displacement values over the latest query window of each stream, and relate spatially neighboring streams to identify earthquake events and their extent. Further, we correlate the detected events with a USGS earthquake event feed. The query results are visualized in real-time.
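The windowed query described above (maximum displacement over the latest window of each stream, with triggered stations then compared against their neighbors) can be sketched without a stream engine. The station names, window length, and threshold below are illustrative assumptions, not values from the study:

```python
from collections import defaultdict, deque

WINDOW = 5          # samples per query window (assumption)
THRESHOLD = 0.02    # displacement threshold in meters (assumption)

# One fixed-length sliding window per station; deque(maxlen=...) discards
# the oldest sample automatically as new ones arrive.
windows = defaultdict(lambda: deque(maxlen=WINDOW))

def ingest(station, displacement):
    """Append one sample to the station's sliding window."""
    windows[station].append(displacement)

def max_displacement(station):
    """Maximum displacement over the latest window for one station."""
    w = windows[station]
    return max(w) if w else 0.0

def triggered_stations():
    """Stations whose windowed maximum exceeds the threshold."""
    return {s for s in windows if max_displacement(s) > THRESHOLD}

# Feed synthetic samples: station "P595" sees a displacement spike.
for t in range(10):
    ingest("P595", 0.001 if t < 7 else 0.05)
    ingest("P594", 0.001)

print(triggered_stations())  # {'P595'}
```

A real deployment would run one such windowed aggregate per stream inside the DSE and then cluster spatially neighboring triggered stations to delineate the event's extent.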
NASA Astrophysics Data System (ADS)
Toke, N.; Johnson, A.; Nelson, K.
2010-12-01
Earthquakes are one of the most widely covered geologic processes by the media. As a result, students, even at the middle school level, arrive in the classroom with preconceptions about the importance and hazards posed by earthquakes. Earthquakes therefore represent not only an attractive topic to engage students when introducing tectonics, but also a means to help students understand the relationships between geologic processes, society, and engineering solutions. Facilitating understanding of the fundamental connections between science and society is important for the preparation of future scientists and engineers as well as informed citizens. Here, we present a week-long lesson designed to be implemented in five one-hour sessions with classes of ~30 students. It consists of two inquiry-based mapping investigations, motivational presentations, and short readings that describe fundamental models of plate tectonics, faults, and earthquakes. The readings also provide examples of engineering solutions, such as the Alaskan oil pipeline, which withstood multi-meter surface offset in the 2002 Denali Earthquake. The first inquiry-based investigation is a lesson on tectonic plates. Working in small groups, each group receives a different world map plotting both topography and one of the following data sets: GPS plate motion vectors, the locations and types of volcanoes, or the locations and types of earthquakes. Using these maps and an accompanying explanation of the data, each group’s task is to map plate boundary locations. Each group then presents a ~10 minute summary of the type of data they used and their interpretation of the tectonic plates, with a poster showing their mapping results. Finally, the instructor facilitates a class discussion about how the data types could be combined to understand more about plate boundaries. Using student interpretations of real data allows student misconceptions to become apparent. Throughout the exercise we record student preconceptions
Engineering geological aspect of Gorkha Earthquake 2015, Nepal
NASA Astrophysics Data System (ADS)
Adhikari, Basanta Raj; Andermann, Christoff; Cook, Kristen
2016-04-01
Strong earthquake shaking causes massive landsliding with severe effects on infrastructure and human lives. The distribution of landslides and other hazards depends on the combination of earthquake and local characteristics that influence the dynamic response of hillslopes. The Himalayas are one of the most active mountain belts, with several kilometers of relief, and are very prone to catastrophic mass failure. Strong and shallow earthquakes are very common and cause widespread collapse of hillslopes, increasing the background landslide rate by several orders of magnitude. The Himalaya has experienced many small and large earthquakes in the past, e.g. the Bihar-Nepal earthquake of 1934 (Ms 8.2), the large Kangra earthquake of 1905 (Ms 7.8), and the Gorkha earthquake of 2015 (Mw 7.8). The Mw 7.9 Gorkha earthquake occurred on and around the Main Himalayan Thrust at a hypocentral depth of 15 km (GEER 2015), followed by a Mw 7.3 aftershock in Kodari, causing 8700+ deaths and leaving hundreds of thousands homeless. Most of the 3000 aftershocks located by the National Seismological Center (NSC) within the first 45 days following the Gorkha earthquake are concentrated in a narrow 40 km-wide band at midcrustal to shallow depth along the strike of the southern slope of the high Himalaya (Adhikari et al. 2015), and the ground shaking was substantially lower in the short-period range than would be expected for an earthquake of this magnitude (Moss et al. 2015). The effects of this earthquake in the affected areas are notable for topographic effects, liquefaction and land subsidence. More than 5000 landslides were triggered by this earthquake (Earthquakes without Frontiers, 2015). Most of the landslides are shallow, occurred in weathered bedrock, and appear to have mobilized primarily as raveling failures, rock slides and rock falls. The majority of landslides are limited to a zone that runs east-west, approximately parallel to the Lesser and Higher Himalaya. There are numerous cracks in
Examining Science Teachers' Argumentation in a Teacher Workshop on Earthquake Engineering
NASA Astrophysics Data System (ADS)
Cavlazoglu, Baki; Stuessy, Carol
2018-02-01
The purpose of this study was to examine changes in the quality of science teachers' argumentation as a result of their engagement in a teacher workshop on earthquake engineering emphasizing distributed learning approaches, which included concept mapping, collaborative game playing, and group lesson planning. The participants were ten high school science teachers from US high schools who elected to attend the workshop. To begin and end the teacher workshop, teachers in small groups engaged in concept mapping exercises with other teachers. Researchers audio-recorded individual teachers' argumentative statements about the inclusion of earthquake engineering concepts in their concept maps, which were then analyzed to reveal the quality of teachers' argumentation. Toulmin's argumentation model formed the framework for designing a classification schema to analyze the quality of participants' argumentative statements. While the analysis of differences in pre- and post-workshop concept mapping exercises revealed that the number of argumentative statements did not change significantly, the quality of participants' argumentation did increase significantly. As these differences occurred concurrently with distributed learning approaches used throughout the workshop, these results provide evidence to support distributed learning approaches in professional development workshop activities to increase the quality of science teachers' argumentation. Additionally, these results support the use of concept mapping as a cognitive scaffold to organize participants' knowledge, facilitate the presentation of argumentation, and as a research tool for providing evidence of teachers' argumentation skills.
DOT National Transportation Integrated Search
1998-12-01
This manual was written to provide training on how to apply principles of geotechnical earthquake engineering to planning, design, and retrofit of highway facilities. Reproduced here are two chapters 4 and 8 in the settlement, respectively. These cha...
End-User Applications of Real-Time Earthquake Information in Europe
NASA Astrophysics Data System (ADS)
Cua, G. B.; Gasparini, P.; Giardini, D.; Zschau, J.; Filangieri, A. R.; Reakt Wp7 Team
2011-12-01
The primary objective of European FP7 project REAKT (Strategies and Tools for Real-Time Earthquake Risk Reduction) is to improve the efficiency of real-time earthquake risk mitigation methods and their capability of protecting structures, infrastructures, and populations. REAKT aims to address the issues of real-time earthquake hazard and response from end-to-end, with efforts directed along the full spectrum of methodology development in earthquake forecasting, earthquake early warning, and real-time vulnerability systems, through optimal decision-making, and engagement and cooperation of scientists and end users for the establishment of best practices for use of real-time information. Twelve strategic test cases/end users throughout Europe have been selected. This diverse group of applications/end users includes civil protection authorities, railway systems, hospitals, schools, industrial complexes, nuclear plants, lifeline systems, national seismic networks, and critical structures. The scale of target applications covers a wide range, from two school complexes in Naples, to individual critical structures, such as the Rion Antirion bridge in Patras, and the Fatih Sultan Mehmet bridge in Istanbul, to large complexes, such as the SINES industrial complex in Portugal and the Thessaloniki port area, to distributed lifeline and transportation networks and nuclear plants. Some end-users are interested in in-depth feasibility studies for use of real-time information and development of rapid response plans, while others intend to install real-time instrumentation and develop customized automated control systems. From the onset, REAKT scientists and end-users will work together on concept development and initial implementation efforts using the data products and decision-making methodologies developed with the goal of improving end-user risk mitigation. The aim of this scientific/end-user partnership is to ensure that scientific efforts are applicable to operational
Machine Learning Seismic Wave Discrimination: Application to Earthquake Early Warning
NASA Astrophysics Data System (ADS)
Li, Zefeng; Meier, Men-Andrin; Hauksson, Egill; Zhan, Zhongwen; Andrews, Jennifer
2018-05-01
Performance of earthquake early warning systems suffers from false alerts caused by local impulsive noise from natural or anthropogenic sources. To mitigate this problem, we train a generative adversarial network (GAN) to learn the characteristics of first-arrival earthquake P waves, using 300,000 waveforms recorded in southern California and Japan. We apply the GAN critic as an automatic feature extractor and train a Random Forest classifier with about 700,000 earthquake and noise waveforms. We show that the discriminator can recognize 99.2% of the earthquake P waves and 98.4% of the noise signals. This state-of-the-art performance is expected to reduce significantly the number of false triggers from local impulsive noise. Our study demonstrates that GANs can discover a compact and effective representation of seismic waves, which has the potential for wide applications in seismology.
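The two-stage pipeline described above (a learned critic used as a feature extractor, followed by a Random Forest classifier) can be sketched with scikit-learn. Since the GAN itself is not specified in the abstract, `critic_features` below is a hand-crafted stand-in for the critic's learned activations, and all waveforms are synthetic:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)

def critic_features(waveform):
    """Stand-in for the GAN critic: a few summary statistics per trace.
    (The real critic would output learned activations instead.)"""
    return [np.max(np.abs(waveform)), np.std(waveform),
            np.sum(waveform ** 2), float(np.argmax(np.abs(waveform)))]

def p_wave():
    """Synthetic emergent P-wave onset: oscillation with growing amplitude."""
    t = np.linspace(0.0, 1.0, 200)
    return np.sin(40.0 * t) * t * rng.normal(1.0, 0.1)

def noise_burst():
    """Synthetic impulsive local noise: low background with a single spike."""
    x = rng.normal(0.0, 0.05, 200)
    x[rng.integers(0, 200)] = 2.0
    return x

# Extract features and train the classifier on the labeled traces.
X = np.array([critic_features(p_wave()) for _ in range(200)]
             + [critic_features(noise_burst()) for _ in range(200)])
y = np.array([1] * 200 + [0] * 200)  # 1 = earthquake P wave, 0 = noise

clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)
print(f"training accuracy: {clf.score(X, y):.2f}")
```

The design point is the separation of concerns: the feature extractor is trained (or, here, specified) independently of the classifier, so the Random Forest only ever sees a compact fixed-length representation of each trace rather than the raw waveform.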
1980-01-01
standard procedure for analysis of all types of civil engineering structures. Early in its development, it became apparent that this method had... unique potentialities in the evaluation of stress in dams, and many of its earliest civil engineering applications concerned special problems associated... with such structures [3,4]. The earliest dynamic finite element analyses of civil engineering structures involved the earthquake response analysis of
NASA Astrophysics Data System (ADS)
Irikura, Kojiro; Miyakoshi, Ken; Kamae, Katsuhiro; Yoshida, Kunikazu; Somei, Kazuhiro; Kurahashi, Susumu; Miyake, Hiroe
2017-01-01
A two-stage scaling relationship of the source parameters for crustal earthquakes in Japan has previously been constructed, in which source parameters obtained from the results of waveform inversion of strong motion data are combined with parameters estimated based on geological and geomorphological surveys. A three-stage scaling relationship was subsequently developed to extend scaling to crustal earthquakes with magnitudes greater than M w 7.4. The effectiveness of these scaling relationships was then examined based on the results of waveform inversion of 18 recent crustal earthquakes ( M w 5.4-6.9) that occurred in Japan since the 1995 Hyogo-ken Nanbu earthquake. The 2016 Kumamoto earthquake, with M w 7.0, was one of the largest earthquakes to occur since dense and accurate strong motion observation networks, such as K-NET and KiK-net, were deployed after the 1995 Hyogo-ken Nanbu earthquake. We examined the applicability of the scaling relationships of the source parameters of crustal earthquakes in Japan to the 2016 Kumamoto earthquake. The rupture area and asperity area were determined based on slip distributions obtained from waveform inversion of the 2016 Kumamoto earthquake observations. We found that the relationship between the rupture area and the seismic moment for the 2016 Kumamoto earthquake follows the second-stage scaling within one standard deviation ( σ = 0.14). The ratio of the asperity area to the rupture area for the 2016 Kumamoto earthquake is nearly the same as ratios previously obtained for crustal earthquakes. Furthermore, we simulated the ground motions of this earthquake using a characterized source model consisting of strong motion generation areas (SMGAs) based on the empirical Green's function (EGF) method. The locations and areas of the SMGAs were determined through comparison between the synthetic ground motions and observed motions. The sizes of the SMGAs were nearly coincident with the asperities with large slip. The synthetic
Organizational changes at Earthquakes & Volcanoes
Gordon, David W.
1992-01-01
Primary responsibility for the preparation of Earthquakes & Volcanoes within the Geological Survey has shifted from the Office of Scientific Publications to the Office of Earthquakes, Volcanoes, and Engineering (OEVE). As a consequence of this reorganization, Henry Spall has stepped down as Science Editor for Earthquakes & Volcanoes (E&V).
Stirling engine application study
NASA Technical Reports Server (NTRS)
Teagan, W. P.; Cunningham, D.
1983-01-01
A range of potential applications for Stirling engines in the power range from 0.5 to 5000 hp is surveyed. Over one hundred such engine applications are grouped into a small number of classes (10), with the applications in each class having a high degree of commonality in technical performance and cost requirements. A review of conventional engines (usually spark ignition or Diesel) was then undertaken to determine the degree to which commercial engine practice now serves the needs of the application classes and to determine the nature of the competition faced by a new engine system. In each application class the Stirling engine was compared to the conventional engines, assuming that objectives of ongoing Stirling engine development programs are met. This ranking process indicated that Stirling engines showed potential for use in all application classes except very light duty applications (lawn mowers, etc.). However, this potential is contingent on demonstrating much greater operating life and reliability than has been demonstrated to date by developmental Stirling engine systems. This implies that future program initiatives in developing Stirling engine systems should give more emphasis to life and reliability issues than has been the case in ongoing programs.
Turkish Compulsory Earthquake Insurance (TCIP)
NASA Astrophysics Data System (ADS)
Erdik, M.; Durukal, E.; Sesetyan, K.
2009-04-01
Through a World Bank project, a government-sponsored Turkish Catastrophe Insurance Pool (TCIP) was created in 2000 with the essential aim of transferring the government's financial burden of replacing earthquake-damaged housing to international reinsurance and capital markets. Providing coverage to about 2.9 million homeowners, TCIP is the largest insurance program in the country, with about 0.5 billion USD in its own reserves and about 2.3 billion USD in total claims-paying capacity. The total payment for earthquake damage since 2000 (mostly small claims, from 226 earthquakes) amounts to about 13 million USD. The country-wide penetration rate is about 22%, highest in the Marmara region (30%) and lowest in southeast Turkey (9%). TCIP is the sole-source provider of earthquake loss coverage up to 90,000 USD per house. The annual premium, categorized on the basis of earthquake zone and type of structure, is about US$90 for a 100 square meter reinforced concrete building in the most hazardous zone, with a 2% deductible. The earthquake engineering related shortcomings of the TCIP are exemplified by the fact that the average rate of 0.13% (for reinforced concrete buildings), with only a 2% deductible, is rather low compared to countries with similar earthquake exposure. From an earthquake engineering point of view, the risk underwriting of the TCIP (typification of the housing units to be insured, earthquake intensity zonation, and the sum insured) needs to be overhauled. Especially for large cities, models can be developed in which the expected earthquake performance of a housing unit (and consequently its insurance premium) can be assessed on the basis of the location of the unit (microzoned earthquake hazard) and basic structural attributes (earthquake vulnerability relationships). With such an approach, in the future the TCIP can contribute to the control of construction through differentiation of premia on the basis of earthquake vulnerability.
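The premium arithmetic quoted above (a rate of about 0.13% and a 2% deductible) is simple enough to sketch. The sum insured of US$70,000 below is an assumption chosen to roughly reproduce the quoted US$90 premium, not a figure from the abstract:

```python
def annual_premium(sum_insured, rate):
    """Annual premium as rate (a fraction) times sum insured."""
    return sum_insured * rate

def claim_payout(loss, sum_insured, deductible_rate):
    """Payout after a percentage deductible, capped at the sum insured."""
    deductible = deductible_rate * sum_insured
    return min(max(loss - deductible, 0.0), sum_insured)

# Assumed figures: US$70,000 sum insured at the quoted 0.13% rate.
premium = annual_premium(70_000, 0.0013)
payout = claim_payout(20_000, 70_000, 0.02)  # 2% deductible
print(f"premium = US${premium:.2f}, payout = US${payout:.2f}")
```

Microzoned underwriting, as advocated in the abstract, would replace the flat zone-based `rate` with one derived from the unit's location-specific hazard and structural vulnerability.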
Housing Damage Following Earthquake
NASA Technical Reports Server (NTRS)
1989-01-01
An automobile lies crushed under the third story of this apartment building in the Marina District after the Oct. 17, 1989, Loma Prieta earthquake. The ground levels are no longer visible because of structural failure and sinking due to liquefaction. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions such as earthquakes or when powders are handled in industrial processes. Mechanics of Granular Materials (MGM) experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grain materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. Credit: J.K. Nakata, U.S. Geological Survey.
Road Damage Following Earthquake
NASA Technical Reports Server (NTRS)
1989-01-01
Ground shaking triggered liquefaction in a subsurface layer of water-saturated sand, producing differential lateral and vertical movement in an overlying carapace of unliquefied sand and silt, which moved from right to left towards the Pajaro River. This mode of ground failure, termed lateral spreading, was a principal cause of liquefaction-related damage in the Oct. 17, 1989, Loma Prieta earthquake. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions such as earthquakes or when powders are handled in industrial processes. Mechanics of Granular Materials (MGM) experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grain materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. Credit: S.D. Ellen, U.S. Geological Survey
NASA Astrophysics Data System (ADS)
Mualchin, Lalliana
2011-03-01
results at that time. CDMG eventually published the second edition map in 1992, following the Governor's Board of Inquiry on the 1989 Loma Prieta earthquake and at the demand of Caltrans. The third edition map was published by Caltrans in 1996, utilizing GIS technology to manage the data, which include a simplified three-dimensional geometry of faults, and to facilitate efficient corrections and revisions of the data and the map. The spatial relationship of fault hazards with highways, bridges, or any other attribute can now be efficiently managed and analyzed in GIS at Caltrans. There has been great confidence in using DSHA in bridge engineering and other applications in California, and it can be confidently applied in any other earthquake-prone region. Earthquake hazards defined by DSHA are: (1) transparent and stable, with robust MCE moment magnitudes; (2) flexible in their application to design considerations; (3) able to easily incorporate advances in ground motion simulations; and (4) economical. DSHA and neo-DSHA have the same approach and applicability. The accuracy of DSHA has proven quite reasonable for practical applications in engineering design, and it is always applied with professional judgment. In the final analysis, DSHA is a reality check for public safety and for PSHA results. Although PSHA has been acclaimed as the better approach for seismic hazard assessment, it is DSHA, not PSHA, that has actually been used in seismic hazard assessment for building and bridge engineering, particularly in California.
Holzer, Thomas L.
1998-01-01
This chapter contains two papers that summarize the performance of engineered earth structures, dams and stabilized excavations in soil, and two papers that characterize for engineering purposes the attenuation of ground motion with distance during the Loma Prieta earthquake. Documenting the field performance of engineered structures and confirming empirically based predictions of ground motion are critical for safe and cost effective seismic design of future structures as well as the retrofitting of existing ones.
Living on an Active Earth: Perspectives on Earthquake Science
NASA Astrophysics Data System (ADS)
Lay, Thorne
2004-02-01
The annualized long-term loss due to earthquakes in the United States is now estimated at $4.4 billion per year. A repeat of the 1923 Kanto earthquake, near Tokyo, could cause direct losses of $2-3 trillion. With such grim numbers, which are guaranteed to make you take its work seriously, the NRC Committee on the Science of Earthquakes begins its overview of the emerging multidisciplinary field of earthquake science. An up-to-date and forward-looking survey of scientific investigation of earthquake phenomena and engineering response to associated hazards is presented at a suitable level for a general educated audience. Perspectives from the fields of seismology, geodesy, neo-tectonics, paleo-seismology, rock mechanics, earthquake engineering, and computer modeling of complex dynamic systems are integrated into a balanced definition of earthquake science that has never before been adequately articulated.
The Road to Total Earthquake Safety
NASA Astrophysics Data System (ADS)
Frohlich, Cliff
Cinna Lomnitz is possibly the most distinguished earthquake seismologist in all of Central and South America. Among many other credentials, Lomnitz has personally experienced the shaking and devastation that accompanied no fewer than five major earthquakes—Chile, 1939; Kern County, California, 1952; Chile, 1960; Caracas, Venezuela, 1967; and Mexico City, 1985. Thus he clearly has much to teach someone like myself, who has never even actually felt a real earthquake. What is this slim book? The Road to Total Earthquake Safety summarizes Lomnitz's May 1999 presentation at the Seventh Mallet-Milne Lecture, sponsored by the Society for Earthquake and Civil Engineering Dynamics. His arguments are motivated by the damage that occurred in three earthquakes—Mexico City, 1985; Loma Prieta, California, 1989; and Kobe, Japan, 1995. All three quakes occurred in regions where earthquakes are common. Yet in all three, some of the worst damage occurred in structures located a significant distance from the epicenter and engineered specifically to resist earthquakes. Some of the damage also indicated that the structures failed because they had experienced considerable rotational or twisting motion. Clearly, Lomnitz argues, there must be fundamental flaws in the usually accepted models explaining how earthquakes generate strong motions, and in how we should design resistant structures.
NASA Astrophysics Data System (ADS)
Rodgers, A. J.; Pitarka, A.; Petersson, N. A.; Sjogreen, B.; McCallen, D.; Miah, M.
2016-12-01
Simulation of earthquake ground motions is becoming more widely used thanks to improvements in numerical methods, development of ever more efficient computer programs (codes), and growth in and access to High-Performance Computing (HPC). We report on how SW4 can be used for accurate and efficient simulations of earthquake strong motions. SW4 is an anelastic finite difference code based on a fourth-order summation-by-parts displacement formulation. It is parallelized and can run on one or many processors. SW4 has many desirable features for seismic strong motion simulation: incorporation of surface topography; automatic mesh generation; mesh refinement; attenuation; and supergrid boundary conditions. It also has several ways to introduce 3D models and sources (including the Standard Rupture Format for extended sources). We are using SW4 to simulate strong ground motions for several applications. We are performing parametric studies of near-fault motions from moderate earthquakes, to investigate basin-edge-generated waves, and from large earthquakes, to provide motions for engineers studying building response. We show that 3D propagation near basin edges can generate significant amplifications relative to 1D analysis. SW4 is also being used to model earthquakes in the San Francisco Bay Area. This includes modeling moderate (M3.5-5) events to evaluate the United States Geological Survey's 3D model of regional structure, as well as strong motions from the 2014 South Napa earthquake and possible large scenario events. Recently SW4 was built on a Commodity Technology Systems-1 (CTS-1) machine at LLNL, part of the new systems for capacity computing at the DOE National Labs. We find SW4 scales well and runs faster on these systems than on the previous generation of LINUX clusters.
NASA Astrophysics Data System (ADS)
Luginbuhl, Molly; Rundle, John B.; Hawkins, Angela; Turcotte, Donald L.
2018-01-01
Nowcasting is a new method of statistically classifying seismicity and seismic risk (Rundle et al. 2016). In this paper, the method is applied to the induced seismicity at the Geysers geothermal region in California and to the induced seismicity due to fluid injection in Oklahoma. Nowcasting utilizes the catalogs of seismicity in these regions. Two earthquake magnitudes are selected, one large, say M_λ ≥ 4, and one small, say M_σ ≥ 2. The method utilizes the number of small earthquakes that occur between pairs of large earthquakes, and the cumulative probability distribution of these counts is obtained. The earthquake potential score (EPS) is determined by the number of small earthquakes that have occurred since the last large earthquake; the point where this count falls on the cumulative probability distribution of interevent counts defines the EPS. A major advantage of nowcasting is that it utilizes "natural time", earthquake counts between events, rather than clock time. Thus, it is not necessary to decluster aftershocks, and the results remain applicable if the level of induced seismicity varies in time. The application of natural time to the accumulation of seismic hazard depends on the applicability of Gutenberg-Richter (GR) scaling: the increasing number of small earthquakes that occur after a large earthquake can be scaled to give the risk of a large earthquake occurring. To illustrate our approach, we utilize the number of M_σ ≥ 2.75 earthquakes in Oklahoma to nowcast the number of M_λ ≥ 4.0 earthquakes in Oklahoma. The applicability of the scaling is illustrated during the rapid build-up of injection-induced seismicity between 2012 and 2016, and the subsequent reduction in seismicity associated with a reduction in fluid injections. The same method is applied to the geothermal-induced seismicity at the Geysers, California, for comparison.
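The counting scheme in the abstract reduces to a few lines of code. The sketch below assumes the thresholds quoted there (M_λ ≥ 4 large, M_σ ≥ 2 small); the toy catalog in the usage note is invented, not the Oklahoma or Geysers data.

```python
# Minimal sketch of the nowcasting count statistic: interevent counts of
# small quakes between successive large quakes, and the EPS as the
# empirical cumulative distribution evaluated at the current count.

def interevent_counts(magnitudes, m_large=4.0, m_small=2.0):
    """Counts of small quakes between successive large quakes (time order)."""
    counts, current = [], 0
    for m in magnitudes:
        if m >= m_large:
            counts.append(current)  # close the interevent interval
            current = 0
        elif m >= m_small:
            current += 1
    return counts, current  # past intervals, count since the last large quake

def earthquake_potential_score(magnitudes, m_large=4.0, m_small=2.0):
    """EPS: fraction of past interevent counts not exceeding the current count."""
    counts, since_last = interevent_counts(magnitudes, m_large, m_small)
    if not counts:
        return None
    return sum(1 for c in counts if c <= since_last) / len(counts)
```

For a toy catalog [2.1, 2.5, 4.2, 2.0, 2.3, 2.1, 4.1, 2.2, 2.4, 2.0, 2.6] the interevent counts are [2, 3], and four small quakes have occurred since the last large one, so the EPS is 1.0. Note that no clock time appears anywhere, which is exactly the "natural time" property the abstract emphasizes.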
NASA Astrophysics Data System (ADS)
Bostenaru, M.
2009-04-01
, ductility and/or strength of the structure under different retrofit measures with their costs. To investigate the improvement in seismic characteristics, numerous simulations of earthquake impact on reinforced concrete frame buildings were conducted; in that context, conventional strengthening measures using reinforced concrete and steel were considered. The reinforced concrete frame buildings modelled were interwar buildings from Bucharest, as these had proved to be the most vulnerable in the initial investigation. To investigate economic efficiency, earthquake damage was simulated as well. Using a feature of the software, so-called performance points could be set, so that at the end of a simulation the degree of damage to the steel and to the concrete in each reinforced concrete element could be seen, and the severity of damage in the different retrofit elements could be classified. These simulations were done for the 1977, 1986 and 1990 earthquakes, for which strong motion records were available digitally. For two simple models, alternative retrofit actions and their locations were fully simulated, while for real building models customised retrofit strategies considering several retrofit elements were employed. The benefits include not only the improvement of structural behaviour, as often assumed in earthquake engineering circles, but also aesthetic and sociological aspects. To give these aspects their due weight, a decision tree was developed in which the actors are the engineer, the architect, the investor and the user. The retrofit measures were evaluated with two different decision systems. This was the part concerning applicability. Further research would examine how the developed method can be used for strategic planning, in which not only single buildings but whole urban areas form the object of study.
The research was funded by the
Post-Earthquake Assessment of Nevada Bridges Using ShakeMap/ShakeCast
DOT National Transportation Integrated Search
2016-01-01
Post-earthquake capacity of Nevada highway bridges is examined through a combination of engineering study and scenario earthquake evaluation. The study was undertaken by the University of Nevada Reno Department of Civil and Environmental Engineering ...
The 1906 earthquake and a century of progress in understanding earthquakes and their hazards
Zoback, M.L.
2006-01-01
The 18 April 1906 San Francisco earthquake killed nearly 3000 people and left 225,000 residents homeless. Three days after the earthquake, an eight-person Earthquake Investigation Commission composed of 25 geologists, seismologists, geodesists, biologists and engineers, as well as some 300 others, started work under the supervision of Andrew Lawson to collect and document physical phenomena related to the quake. On 31 May 1906, the commission published a preliminary 17-page report titled "The Report of the State Earthquake Investigation Commission". The report included the bulk of the geological and morphological descriptions of the faulting and detailed reports on shaking intensity, as well as an impressive atlas of 40 oversized maps and folios. Nearly 100 years after its publication, the Commission Report remains a model for post-earthquake investigations. Because the diverse data sets were so complete and carefully documented, researchers continue to apply modern analysis techniques to learn from the 1906 earthquake. While the earthquake marked a seminal event in the history of California, it also served as the impetus for the birth of modern earthquake science in the United States.
Developing ShakeCast statistical fragility analysis framework for rapid post-earthquake assessment
Lin, K.-W.; Wald, D.J.
2012-01-01
When an earthquake occurs, the U. S. Geological Survey (USGS) ShakeMap estimates the extent of potentially damaging shaking and provides overall information regarding the affected areas. The USGS ShakeCast system is a freely-available, post-earthquake situational awareness application that automatically retrieves earthquake shaking data from ShakeMap, compares intensity measures against users’ facilities, sends notifications of potential damage to responsible parties, and generates facility damage assessment maps and other web-based products for emergency managers and responders. We describe notable improvements of the ShakeMap and the ShakeCast applications. We present a design for comprehensive fragility implementation, integrating spatially-varying ground-motion uncertainties into fragility curves for ShakeCast operations. For each facility, an overall inspection priority (or damage assessment) is assigned on the basis of combined component-based fragility curves using pre-defined logic. While regular ShakeCast users receive overall inspection priority designations for each facility, engineers can access the full fragility analyses for further evaluation.
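The component-based fragility idea described above can be sketched compactly. The lognormal fragility form is the standard one in earthquake engineering; the priority thresholds and the "most fragile component governs" rule below are invented for illustration and are not the actual ShakeCast logic.

```python
import math

# Sketch of component-based fragility evaluation: each component has a
# lognormal fragility curve (median intensity, dispersion beta); the
# facility's inspection priority follows from the most fragile component.
# Thresholds and the governing rule are illustrative assumptions.

def lognormal_fragility(im, median, beta):
    """P(damage state reached | intensity measure im), lognormal form."""
    return 0.5 * (1.0 + math.erf(math.log(im / median) / (beta * math.sqrt(2.0))))

def inspection_priority(im, components):
    """Map component fragilities at intensity im to a facility priority.

    components: list of (median, beta) pairs, one per component.
    """
    p_max = max(lognormal_fragility(im, median, beta) for median, beta in components)
    if p_max > 0.6:
        return "high"
    if p_max > 0.3:
        return "moderate"
    return "low"
```

By construction, a component shaken at exactly its median intensity has a 50% probability of reaching its damage state, which is a convenient sanity check on the formula.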
Post-earthquake building safety inspection: Lessons from the Canterbury, New Zealand, earthquakes
Marshall, J.; Jaiswal, Kishor; Gould, N.; Turner, F.; Lizundia, B.; Barnes, J.
2013-01-01
The authors discuss some of the unique aspects and lessons of the New Zealand post-earthquake building safety inspection program that was implemented following the Canterbury earthquake sequence of 2010–2011. The post-event safety assessment program was one of the largest and longest programs undertaken in recent times anywhere in the world. The effort engaged hundreds of engineering professionals throughout the country, and also sought expertise from outside, to perform post-earthquake structural safety inspections of more than 100,000 buildings in the city of Christchurch and the surrounding suburbs. While the building safety inspection procedure implemented was analogous to the ATC 20 program in the United States, many modifications were proposed and implemented in order to assess the large number of buildings that were subjected to strong and variable shaking during a period of two years. This note discusses some of the key aspects of the post-earthquake building safety inspection program and summarizes important lessons that can improve future earthquake response.
NASA Astrophysics Data System (ADS)
Bossu, R.; Etivant, C.; Roussel, F.; Mazet-Roux, G.; Steed, R.
2014-12-01
Smartphone applications have swiftly become one of the most popular tools for rapid delivery of earthquake information to the public. Wherever they are, users can be automatically informed when an earthquake has struck, simply by setting a magnitude threshold and an area of interest. No need to browse the internet: the information reaches them automatically and instantaneously! One question remains: are the earthquake notifications provided always relevant to the public? A while after a damaging earthquake, many eyewitnesses discard the application they installed just after the mainshock. Why? Because either the magnitude threshold is set too high and many felt earthquakes are missed, or it is set too low and the majority of the notifications relate to unfelt earthquakes, merely increasing anxiety among the population with each new update. Felt and damaging earthquakes are the ones of societal importance, even when of small magnitude. The LastQuake app and Twitter feed (QuakeBot) focus on these earthquakes that matter to the public by collating different information threads covering tsunamigenic, damaging and felt earthquakes. Non-seismic detections and macroseismic questionnaires collected online are combined to identify felt earthquakes regardless of their magnitude. Non-seismic detections include Twitter earthquake detection, developed by the USGS, in which the number of tweets containing the keyword "earthquake" is monitored in real time, and flashsourcing, developed by the EMSC, which detects traffic surges on its rapid earthquake information website caused by the natural convergence of eyewitnesses who rush to the Internet to investigate the cause of the shaking they have just felt. We will present the identification process for felt earthquakes, the smartphone application, the 27 automatically generated tweets, and how, by providing better public services, we collect more data from citizens.
77 FR 64314 - Advisory Committee on Earthquake Hazards Reduction Meeting
Federal Register 2010, 2011, 2012, 2013, 2014
2012-10-19
... is to discuss engineering needs for existing buildings, to review the National Earthquake Hazards... Committee business. The final agenda will be posted on the NEHRP Web site at http://nehrp.gov/ . DATES: The... assesses: Trends and developments in the science and engineering of earthquake hazards reduction; The...
Sand Volcano Following Earthquake
NASA Technical Reports Server (NTRS)
1989-01-01
Sand boil or sand volcano measuring 2 m (6.6 ft.) in length erupted in the median of Interstate Highway 80 west of the Bay Bridge toll plaza when ground shaking transformed a loose water-saturated deposit of subsurface sand into a sand-water slurry (liquefaction) in the October 17, 1989, Loma Prieta earthquake. Vented sand contains marine-shell fragments. Sand and soil grains have faces that can cause friction as they roll and slide against each other, or even cause sticking and form small voids between grains. This complex behavior can cause soil to behave like a liquid under certain conditions such as earthquakes or when powders are handled in industrial processes. Mechanics of Granular Materials (MGM) experiments aboard the Space Shuttle use the microgravity of space to simulate this behavior under conditions that cannot be achieved in laboratory tests on Earth. MGM is shedding light on the behavior of fine-grain materials under low effective stresses. Applications include earthquake engineering, granular flow technologies (such as powder feed systems for pharmaceuticals and fertilizers), and terrestrial and planetary geology. Nine MGM specimens have flown on two Space Shuttle flights. Another three are scheduled to fly on STS-107. The principal investigator is Stein Sture of the University of Colorado at Boulder. (Credit: J.C. Tinsley, U.S. Geological Survey)
UNLV’s environmentally friendly Science and Engineering Building is monitored for earthquake shaking
Kalkan, Erol; Savage, Woody; Reza, Shahneam; Knight, Eric; Tian, Ying
2013-01-01
The University of Nevada Las Vegas’ (UNLV) Science and Engineering Building is at the cutting edge of environmentally friendly design. As the result of a recent effort by the U.S. Geological Survey’s National Strong Motion Project in cooperation with UNLV, the building is now also in the forefront of buildings installed with structural monitoring systems to measure response during earthquakes. This is particularly important because this is the first such building in Las Vegas. The seismic instrumentation will provide essential data to better understand the structural performance of buildings, especially in this seismically active region.
NASA Astrophysics Data System (ADS)
Bostenaru Dan, M.
2009-04-01
mitigation will be presented. The session includes contributions showing methodological and modelling approaches from scientists in the geophysical/seismological, hydrological, remote sensing, civil engineering, insurance, and urbanism fields, among others, as well as presentations from practitioners working on specific case studies, regarding analysis of recent events and their impact on cities as well as re-evaluation of past events from the point of view of long-term recovery. The 2005 session call stated: most strategies for both preparedness and emergency management in disaster mitigation are related to urban planning. While the natural, engineering and social sciences contribute to evaluating the impact on urban areas of earthquakes and their secondary events (including tsunamis, earthquake-triggered landslides, or fire), floods, landslides, high winds, and volcanic eruptions, it is the instruments of urban planning that are to be employed both for visualisation and for the development and implementation of strategy concepts for pre- and post-disaster intervention. The evolution of natural systems towards extreme conditions is taken into consideration insofar as it concerns the damaging impact on urban areas and infrastructure, and the impact on the natural environment of interventions to reduce such damage.
GLASS 2.0: An Operational, Multimodal, Bayesian Earthquake Data Association Engine
NASA Astrophysics Data System (ADS)
Benz, H.; Johnson, C. E.; Patton, J. M.; McMahon, N. D.; Earle, P. S.
2015-12-01
The legacy approach to automated detection and determination of hypocenters is arrival-time stacking. Examples of such algorithms are the associator Binder, which has been in continuous use in many USGS-supported regional seismic networks since the 1980s, and its spherical-earth successor, GLASS 1.0, in service at the USGS National Earthquake Information Center for over 10 years. The principal shortcomings of the legacy approach are that: (1) it can only use phase arrival times; (2) it does not adequately address the problem of extreme variations in station density worldwide; (3) it cannot incorporate multiple phase models or statistical attributes of phases with distance; and (4) it cannot incorporate noise-model attributes of individual stations. Previously we introduced the theoretical framework of a new associator using a Bayesian kernel stacking approach to approximate a joint probability density function for hypocenter localization. More recently we added station- and phase-specific Bayesian constraints to the association process. GLASS 2.0 incorporates a multiplicity of earthquake-related data including phase arrival times, back-azimuth and slowness information from array beamforming, arrival times from waveform cross-correlation processing, and geographic constraints from real-time social media reports of ground shaking. We demonstrate its application by modeling an aftershock sequence using dozens of stations that recorded tens of thousands of earthquakes over a period of one month. We also demonstrate GLASS 2.0 performance regionally and teleseismically using the globally distributed real-time monitoring system at the NEIC.
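The kernel-stacking idea behind this family of associators can be illustrated with a deliberately tiny toy: back-project each pick from a candidate hypocenter with a constant-velocity travel time and stack Gaussian kernels on the residuals; the stack peaks at the best (location, origin time) pair. The 2-D geometry, 6 km/s velocity, and 0.5 s kernel width below are invented for illustration; the real GLASS uses travel-time tables, multiple phases, and station- and phase-specific Bayesian priors.

```python
import math

# Toy arrival-time kernel stacking in 2-D with a constant velocity.
# All parameters are illustrative, not GLASS's actual configuration.

def travel_time(node, station, v=6.0):
    """Straight-line travel time (s) for coordinates in km."""
    return math.hypot(node[0] - station[0], node[1] - station[1]) / v

def stack(node, t0, picks, stations, sigma=0.5):
    """Sum of Gaussian kernels over pick residuals at candidate (node, t0)."""
    return sum(
        math.exp(-0.5 * ((t_obs - t0 - travel_time(node, stations[sta])) / sigma) ** 2)
        for sta, t_obs in picks.items()
    )

def locate(picks, stations, grid, origin_times):
    """Grid-search the (node, origin time) pair with the largest stack."""
    return max(
        ((stack(node, t0, picks, stations), node, t0)
         for node in grid for t0 in origin_times),
        key=lambda candidate: candidate[0],
    )
```

With three stations and noise-free picks from a source at (30, 30) km, the grid search recovers that node with a stack value equal to the number of picks, which is the expected behavior when every residual is zero.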
Research on response spectrum of dam based on scenario earthquake
NASA Astrophysics Data System (ADS)
Zhang, Xiaoliang; Zhang, Yushan
2017-10-01
Taking a large hydropower station as an example, the response spectrum based on a scenario earthquake is determined. First, the potential seismic source zone contributing most to the site hazard is identified from the results of probabilistic seismic hazard analysis (PSHA). Second, the magnitude and epicentral distance of the scenario earthquake are calculated from the main faults and historical earthquakes of that potential source zone. Finally, the response spectrum of the scenario earthquake is calculated using the Next Generation Attenuation (NGA) relations. The scenario-earthquake response spectrum is lower than the probability-consistent response spectrum obtained by the PSHA method. The analysis shows that the scenario-earthquake response spectrum accounts for both the probability level and structural factors, combining the advantages of the deterministic and probabilistic seismic hazard analysis methods; it is easy to accept and provides a basis for the seismic design of hydraulic engineering works.
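The final step above reduces to evaluating a ground-motion model at the deaggregated scenario (M, R) for each spectral period. The functional form and every coefficient in the sketch below are placeholders chosen for illustration; they are not actual NGA relations, which are far more elaborate (site terms, style of faulting, nonlinear amplification).

```python
import math

# Placeholder ground-motion model: ln SA(g) = a + b*M - c*ln(R_km + 10).
# Coefficients per spectral period are invented for illustration only.
COEFFS = {
    0.2: (-2.0, 0.9, 1.3),
    0.5: (-3.5, 1.0, 1.2),
    1.0: (-5.0, 1.1, 1.1),
}

def scenario_spectrum(magnitude, distance_km):
    """Pseudo-spectral acceleration (g) per period for one scenario (M, R)."""
    return {
        period: math.exp(a + b * magnitude - c * math.log(distance_km + 10.0))
        for period, (a, b, c) in COEFFS.items()
    }
```

The coefficients are tuned so the sketch behaves qualitatively like a real relation: spectral ordinates decay with distance and, for a moderate crustal scenario, decrease from short to long periods.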
National Earthquake Hazards Reduction Program; time to expand
Steinbrugge, K.V.
1990-01-01
All of us in earthquake engineering, seismology, and many related disciplines have been directly or indirectly affected by the National Earthquake Hazards Reduction Program (NEHRP). This program was the result of the Earthquake Hazards Reduction Act of 1977 (Public Law 95-124). With well over a decade of experience, should this expression of public policy now take a different or expanded role?
Initiatives to Reduce Earthquake Risk of Developing Countries
NASA Astrophysics Data System (ADS)
Tucker, B. E.
2008-12-01
The seventeen-year-and-counting history of the Palo Alto-based nonprofit organization GeoHazards International (GHI) is the story of many initiatives within a larger initiative to increase the societal impact of geophysics and civil engineering. GHI's mission is to reduce death and suffering due to earthquakes and other natural hazards in the world's most vulnerable communities through preparedness, mitigation and advocacy. GHI works by raising awareness in these communities about their risk and about affordable methods to manage it, identifying and strengthening institutions in these communities to manage their risk, and advocating improvement in natural disaster management. Some of GHI's successful initiatives include: (1) creating an earthquake scenario for Quito, Ecuador that describes in lay terms the consequences for that city of a probable earthquake; (2) improving the curricula of Pakistani university courses about seismic retrofitting; (3) training employees of the Public Works Department of Delhi, India on assessing the seismic vulnerability of critical facilities such as a school, a hospital, a police headquarters, and city hall; (4) assessing the vulnerability of the Library of Tibetan Works and Archives in Dharamsala, India; (5) developing a seismic hazard reduction plan for a nonprofit organization in Kathmandu, Nepal that works to manage Nepal's seismic risk; and (6) assisting in the formulation of a resolution by the Council of the Organization for Economic Cooperation and Development (OECD) to promote school earthquake safety among OECD member countries. GHI's most important resource, in addition to its staff and Board of Trustees, is its members and volunteer advisors, who include some of the world's leading earth scientists, earthquake engineers, urban planners and architects, from the academic, public, private and nonprofit sectors. GHI is planning several exciting initiatives in the near future. One would oversee the design and construction of
Earthquakes in Arkansas and vicinity 1699-2010
Dart, Richard L.; Ausbrooks, Scott M.
2011-01-01
This map summarizes approximately 300 years of earthquake activity in Arkansas. It is one in a series of similar State earthquake history maps. Work on the Arkansas map was done in collaboration with the Arkansas Geological Survey. The earthquake data plotted on the map are from several sources: the Arkansas Geological Survey, the Center for Earthquake Research and Information, the National Center for Earthquake Engineering Research, and the Mississippi Department of Environmental Quality. In addition to earthquake locations, other materials presented include seismic hazard and isoseismal maps and related text. Earthquakes are a legitimate concern in Arkansas and parts of adjacent states. Arkansas has undergone a number of significant felt earthquakes since 1811. At least two of these events caused property damage: a magnitude 4.7 earthquake in 1931, and a magnitude 4.3 earthquake in 1967. The map shows all historical and instrumentally located earthquakes in Arkansas and vicinity between 1811 and 2010. The largest historic earthquake in the vicinity of the State was an intensity XI event, on December 16, 1811; the first earthquake in the New Madrid sequence. This violent event and the earthquakes that followed caused considerable damage to the then sparsely settled region.
Rapid tsunami models and earthquake source parameters: Far-field and local applications
Geist, E.L.
2005-01-01
Rapid tsunami models have recently been developed to forecast far-field tsunami amplitudes from initial earthquake information (magnitude and hypocenter). Earthquake source parameters that directly affect tsunami generation as used in rapid tsunami models are examined, with particular attention to local versus far-field application of those models. First, validity of the assumption that the focal mechanism and type of faulting for tsunamigenic earthquakes is similar in a given region can be evaluated by measuring the seismic consistency of past events. Second, the assumption that slip occurs uniformly over an area of rupture will most often underestimate the amplitude and leading-wave steepness of the local tsunami. Third, sometimes large magnitude earthquakes will exhibit a high degree of spatial heterogeneity such that tsunami sources will be composed of distinct sub-events that can cause constructive and destructive interference in the wavefield away from the source. Using a stochastic source model, it is demonstrated that local tsunami amplitudes vary by as much as a factor of two or more, depending on the local bathymetry. If other earthquake source parameters such as focal depth or shear modulus are varied in addition to the slip distribution patterns, even greater uncertainty in local tsunami amplitude is expected for earthquakes of similar magnitude. Because of the short amount of time available to issue local warnings and because of the high degree of uncertainty associated with local, model-based forecasts as suggested by this study, direct wave height observations and a strong public education and preparedness program are critical for those regions near suspected tsunami sources.
Earthquakes in Mississippi and vicinity 1811-2010
Dart, Richard L.; Bograd, Michael B.E.
2011-01-01
This map summarizes two centuries of earthquake activity in Mississippi. Work on the Mississippi map was done in collaboration with the Mississippi Department of Environmental Quality, Office of Geology. The earthquake data plotted on the map are from several sources: the Mississippi Department of Environmental Quality, the Center for Earthquake Research and Information, the National Center for Earthquake Engineering Research, and the Arkansas Geological Survey. In addition to earthquake locations, other materials include seismic hazard and isoseismal maps and related text. Earthquakes are a legitimate concern in Mississippi and parts of adjacent States. Mississippi has undergone a number of felt earthquakes since 1811. At least two of these events caused property damage: a magnitude 4.7 earthquake in 1931, and a magnitude 4.3 earthquake in 1967. The map shows all historical and instrumentally located earthquakes in Mississippi and vicinity between 1811 and 2010. The largest historic earthquake in the vicinity of the State was an intensity XI event, on December 16, 1811; the first earthquake in the New Madrid sequence. This violent event and the earthquakes that followed caused considerable damage to the then sparsely settled region.
Historical earthquake research in Austria
NASA Astrophysics Data System (ADS)
Hammerl, Christa
2017-12-01
Austria has moderate seismicity; on average the population feels 40 earthquakes per year, or approximately three per month. A severe earthquake with light building damage is expected roughly every 2 to 3 years in Austria. Severe damage to buildings (I0 > 8° EMS) occurs significantly less frequently, with an average recurrence period of about 75 years. For this reason, historical earthquake research is of special importance in Austria. The interest in historical earthquakes in the former Austro-Hungarian Empire is outlined, beginning with an initiative of the Austrian Academy of Sciences and continuing with the development of historical earthquake research as an independent research field after the 1978 "Zwentendorf plebiscite" on whether the completed nuclear power plant would start up. The applied methods are introduced briefly along with the most important studies, and, as an example of a recently completed case study, one of the strongest past earthquakes in Austria, the earthquake of 17 July 1670, is presented. Research into historical earthquakes in Austria concentrates on seismic events of the pre-instrumental period. The investigations are not only of historical interest, but also contribute to the completeness and correctness of the Austrian earthquake catalogue, which is the basis for seismic hazard analysis and as such benefits the public, communities, civil engineers, architects, civil protection, and many others.
Application of τc*Pd for identifying damaging earthquakes for earthquake early warning
NASA Astrophysics Data System (ADS)
Huang, P. L.; Lin, T. L.; Wu, Y. M.
2014-12-01
Earthquake Early Warning Systems (EEWS) are an effective approach to mitigating earthquake damage. In this study, we used seismic records from the Kiban Kyoshin network (KiK-net), because it has dense station coverage and co-located borehole strong-motion seismometers alongside the free-surface strong-motion seismometers. We used inland earthquakes with moment magnitude (Mw) from 5.0 to 7.3 between 1998 and 2012, choosing 135 events and 10,950 strong-motion accelerograms recorded by 696 accelerographs. Both the free-surface and the borehole data are used to calculate τc and Pd. The results show that τc*Pd correlates well with PGV and is a robust parameter for assessing the damage potential of an earthquake. We propose that the value of τc*Pd determined from the first seconds after the P-wave arrival could serve as a threshold for the on-site type of EEW.
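The two parameters can be computed from the opening seconds of a record after the P-wave pick. Below is a minimal numpy sketch assuming displacement and velocity time series (m and m/s) that start at the P arrival; the three-second window and the definition τc = 2π/√r, with r the ratio of integrated squared velocity to integrated squared displacement, follow the standard formulation rather than the exact processing of this study.

```python
import numpy as np

def tau_c_pd(disp, vel, dt, window=3.0):
    """Period parameter tau_c and peak displacement Pd from the first
    `window` seconds of displacement (m) and velocity (m/s) records
    following the P-wave arrival."""
    n = int(window / dt)
    u, v = disp[:n], vel[:n]
    # r is the ratio of integrated squared velocity to integrated
    # squared displacement; dt cancels in the ratio
    r = np.sum(v**2) / np.sum(u**2)
    tau_c = 2.0 * np.pi / np.sqrt(r)
    pd = np.max(np.abs(u))
    return tau_c, pd

# Synthetic check: a pure sinusoid of period T should give tau_c ~ T
T, dt = 1.0, 0.01
t = np.arange(0.0, 3.0, dt)
u = 0.01 * np.sin(2 * np.pi * t / T)   # displacement, 1 cm amplitude
v = np.gradient(u, dt)                 # velocity by finite differences
tc, pd = tau_c_pd(u, v, dt)
```

For the sinusoid, r equals (2π/T)², so τc recovers the signal period and Pd the 1 cm amplitude.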
Application of Earthquake Subspace Detectors at Kilauea and Mauna Loa Volcanoes, Hawai`i
NASA Astrophysics Data System (ADS)
Okubo, P.; Benz, H.; Yeck, W.
2016-12-01
Recent studies have demonstrated the capabilities of earthquake subspace detectors for detailed cataloging and tracking of seismicity in a number of regions and settings. We are exploring the application of subspace detectors at the United States Geological Survey's Hawaiian Volcano Observatory (HVO) to analyze seismicity at Kilauea and Mauna Loa volcanoes. Elevated levels of microseismicity and occasional swarms of earthquakes associated with active volcanism here present cataloging challenges due to the sheer numbers of earthquakes and an intrinsically low signal-to-noise environment featuring oceanic microseism and volcanic tremor in the ambient seismic background. With high-quality continuous recording of seismic data at HVO, we apply subspace detectors (Harris and Dodge, 2011, Bull. Seismol. Soc. Am., doi: 10.1785/0120100103) during intervals of noteworthy seismicity. Waveform templates are drawn from magnitude 2 and larger earthquakes within clusters of earthquakes cataloged in the HVO seismic database. At Kilauea, we focus on seismic swarms in the summit caldera region where, despite continuing eruptions from vents in the summit region and in the east rift zone, geodetic measurements reflect a relatively inflated volcanic state. We also focus on seismicity beneath and adjacent to Mauna Loa's summit caldera that appears to be associated with geodetic expressions of gradual volcanic inflation, and where precursory seismicity clustered prior to both of Mauna Loa's most recent eruptions, in 1975 and 1984. We recover several times more earthquakes with the subspace detectors - down to roughly 2 magnitude units below the templates, based on relative amplitudes - compared to the numbers of cataloged earthquakes. The increased numbers of detected earthquakes in these clusters, and the ability to associate and locate them, allow us to infer details of the spatial and temporal distributions and possible variations in stresses within these key regions of the volcanoes.
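Subspace detection generalizes single-template matched filtering, and the single-template case conveys the core idea. The sketch below slides a normalized template along a continuous trace and flags lags where the correlation coefficient exceeds a threshold; it illustrates the concept only and is not the Harris and Dodge implementation used at HVO.

```python
import numpy as np

def matched_filter(data, template):
    """Slide a waveform template along continuous data and return the
    normalized cross-correlation (Pearson r, in [-1, 1]) at each lag."""
    n = len(template)
    t = (template - template.mean()) / (template.std() * n)
    cc = np.empty(len(data) - n + 1)
    for i in range(len(cc)):
        w = data[i:i + n]
        s = w.std()
        cc[i] = 0.0 if s == 0 else np.dot(t, (w - w.mean()) / s)
    return cc

# Bury a repeat of a 5 Hz wavelet in noise and recover it
rng = np.random.default_rng(0)
template = np.sin(2 * np.pi * 5 * np.arange(0, 1, 0.01))  # 100 samples
data = 0.1 * rng.standard_normal(1000)
data[400:500] += template               # hidden repeat at lag 400
cc = matched_filter(data, template)
detections = np.flatnonzero(cc > 0.8)   # lags above the threshold
```

Lowering the threshold trades false alarms for sensitivity; in practice that is how detectors reach events well below the template magnitudes.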
Sizing up earthquake damage: Differing points of view
Hough, S.; Bolen, A.
2007-01-01
When a catastrophic event strikes an urban area, many different professionals hit the ground running. Emergency responders respond, reporters report, and scientists and engineers collect and analyze data. Journalists and scientists may share interest in these events, but they have very different missions. To a journalist, earthquake damage is news. To a scientist or engineer, earthquake damage represents a valuable source of data that can help us understand how strongly the ground shook as well as how particular structures responded to the shaking.
Rezaeian, Sanaz; Zhong, Peng; Hartzell, Stephen; Zareian, Farzin
2015-01-01
Simulated earthquake ground motions can be used in many recent engineering applications that require time series as input excitations. However, applicability and validation of simulations are subjects of debate in the seismological and engineering communities. We propose a validation methodology at the waveform level and directly based on characteristics that are expected to influence most structural and geotechnical response parameters. In particular, three time-dependent validation metrics are used to evaluate the evolving intensity, frequency, and bandwidth of a waveform. These validation metrics capture nonstationarities in intensity and frequency content of waveforms, making them ideal to address nonlinear response of structural systems. A two-component error vector is proposed to quantify the average and shape differences between these validation metrics for a simulated and recorded ground-motion pair. Because these metrics are directly related to the waveform characteristics, they provide easily interpretable feedback to seismologists for modifying their ground-motion simulation models. To further simplify the use and interpretation of these metrics for engineers, it is shown how six scalar key parameters, including duration, intensity, and predominant frequency, can be extracted from the validation metrics. The proposed validation methodology is a step forward in paving the road for utilization of simulated ground motions in engineering practice and is demonstrated using examples of recorded and simulated ground motions from the 1994 Northridge, California, earthquake.
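A widely used evolving-intensity measure of a waveform is the cumulative Arias intensity, from which significant-duration parameters such as D5-75 can be read off. The sketch below is a generic example of such a time-dependent metric, not the three specific validation metrics of this paper.

```python
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def arias_intensity(acc, dt):
    """Cumulative Arias intensity Ia(t) = (pi / 2g) * integral of a(t)^2,
    a standard measure of how a record's intensity evolves in time."""
    return np.pi / (2 * G) * np.cumsum(acc**2) * dt

def significant_duration(acc, dt, lo=0.05, hi=0.75):
    """D5-75: time for Ia to grow from 5% to 75% of its final value."""
    ia = arias_intensity(acc, dt)
    frac = ia / ia[-1]          # nondecreasing, so searchsorted is valid
    return (np.searchsorted(frac, hi) - np.searchsorted(frac, lo)) * dt

# Example: a 10 s unit-amplitude record accumulates Ia linearly,
# so D5-75 is 70% of the record length
d575 = significant_duration(np.ones(1000), 0.01)
```

Comparing such curves for a recorded and a simulated pair is one simple way to quantify the intensity-evolution mismatch described above.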
Earthquake: Game-based learning for 21st century STEM education
NASA Astrophysics Data System (ADS)
Perkins, Abigail Christine
To play is to learn. A lack of empirical research within the game-based learning literature, however, has hindered educational stakeholders from making informed decisions about game-based learning for 21st century STEM education. In this study, I modified a research and development (R&D) process to create a collaborative-competitive educational board game illuminating elements of earthquake engineering. I oriented instruction- and game-design principles around 21st century science education to adapt the R&D process to develop the educational game, Earthquake. As part of the R&D, I evaluated Earthquake for empirical evidence to support the claim that game-play results in student gains in critical thinking, scientific argumentation, metacognitive abilities, and earthquake engineering content knowledge. I developed Earthquake with the aid of eight focus groups with varying levels of expertise in science education research, teaching, administration, and game design. After developing a functional prototype, I pilot-tested Earthquake with teacher-participants (n=14) who engaged in semi-structured interviews after their game-play. I analyzed teacher interviews with constant comparison methodology. I used teachers' comments and feedback from content knowledge experts to integrate game modifications, implementing results to improve Earthquake. I added player roles, simplified phrasing on cards, and produced an introductory video. I then administered the modified Earthquake game to two groups of high school student-participants (n=6), who played twice. To seek evidence documenting support for my knowledge claim, I analyzed videotapes of students' game-play using a game-based learning checklist. My assessment of learning gains revealed increases in all categories of students' performance: critical thinking, metacognition, scientific argumentation, and earthquake engineering content knowledge acquisition. Players in both student-groups improved mostly in critical thinking, having
ERTS Applications in earthquake research and mineral exploration in California
NASA Technical Reports Server (NTRS)
Abdel-Gawad, M.; Silverstein, J.
1973-01-01
Examples are presented showing that ERTS imagery can be effectively utilized to identify, locate, and map faults which show geomorphic evidence of geologically recent breakage. Several important faults not previously known have been identified. By plotting epicenters of historic earthquakes in parts of California, Sonora, Mexico, Arizona, and Nevada, we found that areas known for historic seismicity are often characterized by abundant evidence of recent fault and crustal movements. There are also many examples of seismically quiet areas where outstanding evidence of recent fault movements is observed. One application is clear: ERTS-1 imagery could be effectively utilized to delineate areas susceptible to earthquake recurrence which, on the basis of seismic data alone, might be misleadingly considered safe. ERTS data can also be utilized in planning new sites in the geophysical network of fault-movement monitoring and strain and tilt measurements.
NASA Astrophysics Data System (ADS)
Xu, Peibin; Wen, Ruizhi; Wang, Hongwei; Ji, Kun; Ren, Yefei
2015-02-01
Ludian County of Yunnan Province in southwestern China was struck by an Ms 6.5 earthquake on August 3, 2014, another destructive event following the Ms 8.0 Wenchuan earthquake in 2008, the Ms 7.1 Yushu earthquake in 2010, and the Ms 7.0 Lushan earthquake in 2013. The National Strong-Motion Observation Network System of China collected 74 strong-motion recordings; the maximum peak ground acceleration, 949 cm/s2 in the E-W component, was recorded by station 053LLT in Longtoushan Town. The observed PGAs and spectral ordinates were compared with a ground-motion prediction equation for China and with the NGA-West2 models developed by the Pacific Earthquake Engineering Research Center. This earthquake is considered the first case for testing the applicability of NGA-West2 in China. Results indicate that the observed PGAs and the 5% damped pseudo-response spectral accelerations are significantly lower than the predicted ones. A field survey around some typical strong-motion stations verified that the earthquake damage was consistent with the official isoseismal map issued by the China Earthquake Administration.
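Such a comparison reduces to evaluating residuals of observed motions against a prediction equation. The sketch below uses a schematic attenuation form with placeholder coefficients, purely to illustrate the residual computation; it is not the Chinese GMPE or any NGA-West2 model, whose functional forms and coefficients are far more elaborate.

```python
import numpy as np

def ln_pga_gmpe(mag, r_km, a=-1.0, b=1.0, c=1.5, h=10.0):
    """Schematic ground-motion prediction equation (GMPE):
        ln PGA[g] = a + b*M - c*ln(sqrt(R^2 + h^2))
    Coefficients here are illustrative placeholders, not a published model."""
    return a + b * mag - c * np.log(np.sqrt(r_km**2 + h**2))

def residual(obs_pga_g, mag, r_km):
    """ln(observed) - ln(predicted); negative values mean the observation
    falls below the prediction, as reported for the Ludian records."""
    return np.log(obs_pga_g) - ln_pga_gmpe(mag, r_km)
```

A systematically negative mean residual across stations is exactly the "observed lower than predicted" finding stated in the abstract.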
Application of GPS Technologies to study Pre-earthquake processes. A review and future prospects
NASA Astrophysics Data System (ADS)
Pulinets, S. A.; Liu, J. Y. G.; Ouzounov, D.; Hernandez-Pajares, M.; Hattori, K.; Krankowski, A.; Zakharenkova, I.; Cherniak, I.
2016-12-01
We present the progress made by GPS TEC technologies in the study of pre-seismic ionospheric anomalies appearing a few days before strong earthquakes. Starting from the first case studies, such as the 17 August 1999 M7.6 Izmit earthquake in Turkey, the technology has been developed and converted into global near-real-time monitoring of seismo-ionospheric effects, which is now used in multiparameter nowcasts and forecasts of strong earthquakes. Techniques for identifying seismo-ionospheric anomalies were developed in parallel with a physical mechanism explaining the generation of these anomalies. It was established that seismo-ionospheric anomalies have a self-similarity property, depend on local time, and persist for at least 4 hours; the deviation from the undisturbed level can be either positive or negative, depending on the lead time (in days) before the impending earthquake and on the longitude of the anomaly relative to the epicenter. Low-latitude and near-equatorial earthquakes demonstrate a magnetically conjugated effect, while middle- and high-latitude earthquakes demonstrate a single anomaly over the earthquake preparation zone. From the morphology of the anomalies, a physical mechanism was derived within the framework of the more complex Lithosphere-Atmosphere-Ionosphere-Magnetosphere Coupling concept. In addition to multifactor analysis of GPS TEC time series, the GIM MAP technology was applied, clearly showing the locality of seismo-ionospheric anomalies and the correspondence of their spatial size to the Dobrovolsky radius of the earthquake preparation zone. Application of ionospheric tomography techniques permitted study not only of total electron content variations but also of the modification of the vertical distribution of electron concentration in the ionosphere before earthquakes. The statistical check of the ionospheric precursors passed the
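A common way to flag TEC anomalies is a sliding interquartile band: each sample is compared against a median ± k·IQR envelope built from the preceding days at the same local time. A simplified numpy sketch follows, in which the window length and the factor k are illustrative choices, not the authors' exact settings.

```python
import numpy as np

def tec_anomaly(tec, window=15, k=1.5):
    """Flag TEC anomalies against a trailing median +/- k*IQR band.
    `tec` is one value per day (e.g. at a fixed local time, in TECU)."""
    flags = np.zeros(len(tec), dtype=bool)
    for i in range(window, len(tec)):
        past = tec[i - window:i]
        q1, med, q3 = np.percentile(past, [25, 50, 75])
        flags[i] = abs(tec[i] - med) > k * (q3 - q1)
    return flags

# Quiet background with one injected positive anomaly on day 45
rng = np.random.default_rng(1)
series = 20.0 + rng.normal(0.0, 0.5, 60)   # background ~20 TECU
series[45] += 8.0                           # pre-seismic-like excursion
flags = tec_anomaly(series)
```

The median/IQR pair is used rather than mean/standard deviation so that a single anomalous day entering the trailing window does not inflate the band.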
An Investigation on the Crustal Deformations in Istanbul after Eastern Marmara Earthquakes in 1999
NASA Astrophysics Data System (ADS)
Ozludemir, M.; Ozyasar, M.
2008-12-01
Since the introduction of the GPS technique in the mid-1970s, there have been great advances in positioning activities. Today, such Global Navigation Satellite System (GNSS) based positioning techniques are widely used in daily geodetic applications. High-order geodetic network measurements are one such application: these networks are established to provide reliable infrastructure for all kinds of geodetic work, from the production of cadastral plans to surveying during the construction of engineering structures. The positional information obtained in such engineering surveys can be useful for other studies as well. One such field is geodynamics, where this information is valuable for understanding the characteristics of tectonic movements. In Turkey, which is located in a tectonically active zone and experiences major earthquakes quite frequently, the positional information obtained in engineering surveys can be very useful for earthquake-related studies. In this paper an example of such engineering surveys is discussed: the Istanbul GPS (Global Positioning System) Network, first established in 1997 and remeasured in 2005. Between these two measurement campaigns, two major earthquakes took place, on August 17 and November 12, 1999, with magnitudes of 7.4 and 7.2, respectively. In the first campaign in 1997, a network of about 700 points was measured, while in the second campaign in 2005 more than 1800 points were positioned; the two campaigns share common points. The network covers the whole Istanbul area of about 6000 km2. All network points are located on the Eurasian plate, north of the North Anatolian Fault Zone. In this study, the horizontal and vertical movements are presented and compared with the results obtained in geodynamic studies.
USGS Earthquake Program GPS Use Case : Earthquake Early Warning
DOT National Transportation Integrated Search
2015-03-12
USGS GPS receiver use case. Item 1 - High Precision User (federal agency with Stafford Act hazard alert responsibilities for earthquakes, volcanoes and landslides nationwide). Item 2 - Description of Associated GPS Application(s): The USGS Eart...
Celsi, R.; Wolfinbarger, M.; Wald, D.
2005-01-01
The purpose of this research is to explore earthquake risk perceptions in California. Specifically, we examine the risk beliefs, feelings, and experiences of lay, professional, and expert individuals to explore how risk is perceived and how risk perceptions are formed relative to earthquakes. Our results indicate that individuals tend to perceptually underestimate the degree to which earthquake (EQ) events may affect them. This occurs in large part because individuals' personal felt experience of EQ events is generally overestimated relative to experienced magnitudes. An important finding is that individuals engage in a process of "cognitive anchoring" of their felt EQ experience towards the reported earthquake magnitude. The anchoring effect is moderated by the degree to which individuals comprehend EQ magnitude measurement and EQ attenuation. Overall, the results of this research provide us with a deeper understanding of EQ risk perceptions, especially as they relate to individuals' understanding of EQ measurement and attenuation concepts. © 2005, Earthquake Engineering Research Institute.
Ellsworth, William L.
2013-01-01
Earthquakes in unusual locations have become an important topic of discussion in both North America and Europe, owing to the concern that industrial activity could cause damaging earthquakes. It has long been understood that earthquakes can be induced by impoundment of reservoirs, surface and underground mining, withdrawal of fluids and gas from the subsurface, and injection of fluids into underground formations. Injection-induced earthquakes have, in particular, become a focus of discussion as the application of hydraulic fracturing to tight shale formations is enabling the production of oil and gas from previously unproductive formations. Earthquakes can be induced as part of the process to stimulate the production from tight shale formations, or by disposal of wastewater associated with stimulation and production. Here, I review recent seismic activity that may be associated with industrial activity, with a focus on the disposal of wastewater by injection in deep wells; assess the scientific understanding of induced earthquakes; and discuss the key scientific challenges to be met for assessing this hazard.
Stability assessment of structures under earthquake hazard through GRID technology
NASA Astrophysics Data System (ADS)
Prieto Castrillo, F.; Boton Fernandez, M.
2009-04-01
This work presents a GRID framework to estimate the vulnerability of structures under earthquake hazard. The tool has been designed to cover the needs of a typical earthquake engineering stability analysis: preparation of input data (pre-processing), response computation, and stability analysis (post-processing). In order to validate the application over GRID, a simplified model of a structure under artificially generated earthquake records has been implemented. To achieve this goal, the proposed scheme exploits the GRID technology and its main advantages (parallel intensive computing, huge storage capacity, and collaborative analysis among institutions) through intensive interaction among the GRID elements (Computing Element, Storage Element, LHC File Catalogue, federated database, etc.). The dynamical model is described by a set of ordinary differential equations (ODEs) and by a set of parameters. Both elements, along with the integration engine, are encapsulated into Java classes. With this high-level design, subsequent improvements/changes of the model can be addressed with little effort. In the procedure, an earthquake record database is prepared and stored (pre-processing) in the GRID Storage Element (SE). The metadata of these records is also stored in the GRID federated database. This metadata contains both relevant information about the earthquake (as is usual in a seismic repository) and the Logical File Name (LFN) of the record for its later retrieval. Then, from the available set of accelerograms in the SE, the user can specify a range of earthquake parameters to carry out a dynamic analysis. This way, a GRID job is created for each selected accelerogram in the database. At the GRID Computing Element (CE), displacements are then obtained by numerical integration of the ODEs over time. The resulting response for that configuration is stored in the GRID Storage Element (SE) and the maximum structure displacement is computed. Then, the corresponding
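The core of each GRID job, numerical integration of the structural ODEs over an accelerogram to obtain the maximum displacement, can be sketched for a single-degree-of-freedom oscillator. The scheme, period, and damping below are illustrative assumptions, not the encapsulated Java model of the paper.

```python
import numpy as np

def sdof_max_disp(acc_g, dt, period=0.5, damping=0.05):
    """Maximum displacement of a damped single-degree-of-freedom
    oscillator driven by a ground acceleration record acc_g (m/s^2):
        u'' + 2*zeta*wn*u' + wn^2*u = -a_g(t)
    integrated with an explicit central-difference scheme (stable for
    dt well below period/pi)."""
    wn = 2.0 * np.pi / period
    u = np.zeros(len(acc_g))   # zero initial displacement and velocity
    for i in range(1, len(acc_g) - 1):
        vel = (u[i] - u[i - 1]) / dt      # velocity lagged half a step
        acc = -acc_g[i] - 2.0 * damping * wn * vel - wn**2 * u[i]
        u[i + 1] = 2.0 * u[i] - u[i - 1] + acc * dt**2
    return np.max(np.abs(u))

# Example: peak response of a 0.5 s oscillator to a 1 g step in
# ground acceleration (overshoots the static offset, then settles)
peak = sdof_max_disp(np.full(3000, 9.81), dt=0.01)
```

Running one such integration per accelerogram, and keeping only the peak, mirrors the per-job computation dispatched to each Computing Element.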
Nonlinear waves in earth crust faults: application to regular and slow earthquakes
NASA Astrophysics Data System (ADS)
Gershenzon, Naum; Bambakidis, Gust
2015-04-01
The genesis, development and cessation of regular earthquakes continue to be major problems of modern geophysics. How are earthquakes initiated? What factors determine the rupture velocity, slip velocity, rise time and geometry of rupture? How do accumulated stresses relax after the main shock? These and other questions still need to be answered. In addition, slow slip events have attracted much attention as an additional source for monitoring fault dynamics. Recently discovered phenomena such as deep non-volcanic tremor (NVT), low-frequency earthquakes (LFE), very low frequency earthquakes (VLF), and episodic tremor and slip (ETS) have enhanced and complemented our knowledge of fault dynamics. At the same time, these phenomena give rise to new questions about their genesis, properties and relation to regular earthquakes. We have developed a model of macroscopic dry friction which efficiently describes laboratory frictional experiments [1], basic properties of regular earthquakes including post-seismic stress relaxation [3], the occurrence of ambient and triggered NVT [4], and ETS events [5, 6]. Here we will discuss the basics of the model and its geophysical applications. References [1] Gershenzon N.I. & G. Bambakidis (2013) Tribology International, 61, 11-18, http://dx.doi.org/10.1016/j.triboint.2012.11.025 [2] Gershenzon, N.I., G. Bambakidis and T. Skinner (2014) Lubricants 2014, 2, 1-x manuscripts; doi:10.3390/lubricants20x000x; arXiv:1411.1030v2 [3] Gershenzon N.I., Bykov V. G. and Bambakidis G., (2009) Physical Review E 79, 056601 [4] Gershenzon, N. I, G. Bambakidis, (2014a), Bull. Seismol. Soc. Am., 104, 4, doi: 10.1785/0120130234 [5] Gershenzon, N. I., G. Bambakidis, E. Hauser, A. Ghosh, and K. C. Creager (2011), Geophys. Res. Lett., 38, L01309, doi:10.1029/2010GL045225. [6] Gershenzon, N.I. and G. Bambakidis (2014) Bull. Seismol. Soc. Am., (in press); arXiv:1411.1020
Ma, Jiaqi; Zhou, Maigeng; Li, Yanfei; Guo, Yan; Su, Xuemei; Qi, Xiaopeng; Ge, Hui
2009-05-01
The aim was to describe the design and application of an emergency response mobile phone-based information system for infectious disease reporting. Software engineering and business modeling were used to design and develop the system. Seven days after the initiation of the reporting system, the reporting rate in the earthquake zone reached the level of the same period in 2007. Surveillance of the weekly morbidity report in the earthquake zone after the initiation of the mobile phone reporting system showed the same trend as the previous three years. The emergency response mobile phone-based information system was an effective solution for transmitting urgently needed reports and managing communicable disease surveillance information. This assured the consistency of disease surveillance and facilitated sensitive, accurate, and timely reporting. It is an important backup for the internet-based direct reporting system for communicable diseases. © 2009 Blackwell Publishing Asia Pty Ltd and Chinese Cochrane Center, West China Hospital of Sichuan University.
Jaiswal, Kishor; Wald, David J.; Earle, Paul S.; Porter, Keith A.; Hearne, Mike
2011-01-01
Since the launch of the USGS’s Prompt Assessment of Global Earthquakes for Response (PAGER) system in fall of 2007, the time needed for the U.S. Geological Survey (USGS) to determine and comprehend the scope of any major earthquake disaster anywhere in the world has been dramatically reduced to less than 30 min. PAGER alerts consist of estimated shaking hazard from the ShakeMap system, estimates of population exposure at various shaking intensities, and a list of the most severely shaken cities in the epicentral area. These estimates help government, scientific, and relief agencies to guide their responses in the immediate aftermath of a significant earthquake. To account for wide variability and uncertainty associated with inventory, structural vulnerability and casualty data, PAGER employs three different global earthquake fatality/loss computation models. This article describes the development of the models and demonstrates the loss estimation capability for earthquakes that have occurred since 2007. The empirical model relies on country-specific earthquake loss data from past earthquakes and makes use of calibrated casualty rates for future prediction. The semi-empirical and analytical models are engineering-based and rely on complex datasets including building inventories, time-dependent population distributions within different occupancies, the vulnerability of regional building stocks, and casualty rates given structural collapse.
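The structure of the empirical model is a fatality rate expressed as a lognormal function of shaking intensity, summed over the exposed population in each intensity bin. The sketch below uses placeholder values for the country-specific constants theta and beta (the published PAGER coefficients differ by country) and a hypothetical exposure table.

```python
import math

def fatality_rate(mmi, theta=12.0, beta=0.25):
    """Empirical fatality rate as a lognormal CDF of shaking intensity:
        rate = Phi(ln(MMI / theta) / beta)
    theta and beta here are illustrative placeholders, not fitted values."""
    return 0.5 * (1.0 + math.erf(math.log(mmi / theta) / (beta * math.sqrt(2.0))))

def expected_fatalities(exposure):
    """Sum population exposure per intensity bin times that bin's rate.
    `exposure` maps MMI level -> exposed population."""
    return sum(pop * fatality_rate(mmi) for mmi, pop in exposure.items())

# Hypothetical exposure table from a ShakeMap: {MMI: population}
est = expected_fatalities({6: 500_000, 7: 100_000, 8: 10_000, 9: 1_000})
```

Because the rate curve rises steeply with intensity, the estimate is dominated by the comparatively small population at the highest shaking levels, which is why exposure estimates near the epicenter matter most.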
Assessing the Utility of and Improving USGS Earthquake Hazards Program Products
NASA Astrophysics Data System (ADS)
Gomberg, J. S.; Scott, M.; Weaver, C. S.; Sherrod, B. L.; Bailey, D.; Gibbons, D.
2010-12-01
A major focus of the USGS Earthquake Hazards Program (EHP) has been the development and implementation of products and information meant to improve earthquake hazard assessment, mitigation and response for a myriad of users. Many of these products rely on the data and efforts of the EHP and its partner scientists who are building the Advanced National Seismic System (ANSS). We report on a project meant to assess the utility of many of these products and information, conducted collaboratively by EHP scientists and Pierce County Department of Emergency Management staff. We have conducted focus group listening sessions with members of the engineering, business, medical, media, risk management, and emergency response communities as well as participated in the planning and implementation of earthquake exercises in the Pacific Northwest. Thus far we have learned that EHP and ANSS products satisfy many of the needs of engineers and some planners, and information is widely used by media and the general public. However, some important communities do not use these products despite their intended application for their purposes, particularly county and local emergency management and business communities. We have learned that products need to convey more clearly the impact of earthquakes, in everyday terms. Users also want products (e.g. maps, forecasts, etc.) that can be incorporated into tools and systems they use regularly. Rather than simply building products and posting them on websites, products need to be actively marketed and training provided. We suggest that engaging users prior to and during product development will enhance their usage and effectiveness.
Earthquake Education in Prime Time
NASA Astrophysics Data System (ADS)
de Groot, R.; Abbott, P.; Benthien, M.
2004-12-01
Since 2001, the Southern California Earthquake Center (SCEC) has collaborated on several video production projects that feature important topics related to earthquake science, engineering, and preparedness. These projects have also fostered many fruitful and sustained partnerships with a variety of organizations that have a stake in hazard education and preparedness. The Seismic Sleuths educational video first appeared in spring 2001 on Discovery Channel's Assignment Discovery. Seismic Sleuths is based on a highly successful curriculum package developed jointly by the American Geophysical Union and the Department of Homeland Security's Federal Emergency Management Agency. The California Earthquake Authority (CEA) and the Institute for Business and Home Safety supported the video project. Summer Productions, a company with a reputation for quality science programming, produced the Seismic Sleuths program in close partnership with scientists, engineers, and preparedness experts. The program has aired on the National Geographic Channel as recently as fall 2004. Currently, SCEC is collaborating with Pat Abbott, a geology professor at San Diego State University (SDSU), on the video project Written In Stone: Earthquake Country - Los Angeles. Partners on this project include the California Seismic Safety Commission, SDSU, SCEC, CEA, and the Insurance Information Network of California. This video incorporates live-action demonstrations, vivid animations, and a compelling host (Abbott) to tell the story about earthquakes in the Los Angeles region. The Written in Stone team has also developed a comprehensive educator package that includes the video, maps, lesson plans, and other supporting materials. We will present the process that facilitates the creation of visually effective, factually accurate, and entertaining video programs. We acknowledge the need to have a broad understanding of the literature related to communication, media studies, science education, and
NASA Astrophysics Data System (ADS)
Abaimov, Sergey G.
The concept of self-organized criticality is associated with scale-invariant, fractal behavior; this concept is also applicable to earthquake systems. It is known that the interoccurrent frequency-size distribution of earthquakes in a region is scale-invariant and obeys the Gutenberg-Richter power-law dependence. The interoccurrent time-interval distribution is also known to obey Poissonian statistics when aftershocks are excluded. However, to estimate the hazard risk for a region it is also necessary to know the recurrent behavior of earthquakes at a given point on a fault. This behavior has been investigated in the literature; however, major questions remain unresolved because of the small number of earthquakes in observed sequences. To overcome this difficulty, this research utilizes numerical simulations of a slider-block model and a sand-pile model. In addition, experimental observations of creep events on the creeping section of the San Andreas fault are processed, and sequences of up to 100 events are studied. The recurrent behavior of earthquakes at a given point on a fault, or at a given fault, is then investigated. It is shown that both the recurrent frequency-size and time-interval behaviors of earthquakes obey the Weibull distribution.
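Fitting a two-parameter Weibull distribution to recurrence intervals is a standard maximum-likelihood exercise. The numpy-only sketch below solves the MLE equation for the shape parameter by bisection and recovers the scale in closed form; the synthetic intervals are illustrative, not the San Andreas creep data.

```python
import numpy as np

def fit_weibull(x, tol=1e-8):
    """Maximum-likelihood fit of a two-parameter Weibull distribution to
    positive samples x (e.g. recurrence intervals). Solves the standard
    MLE equation for the shape k by bisection, then recovers the scale
    lam = (mean(x^k))^(1/k)."""
    x = np.asarray(x, dtype=float)
    logx = np.log(x)

    def g(k):
        # MLE condition: g(k) = 0 at the fitted shape
        xk = x**k
        return xk @ logx / xk.sum() - 1.0 / k - logx.mean()

    lo, hi = 1e-3, 1.0
    while g(hi) < 0:            # expand until the root is bracketed
        hi *= 2.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if g(mid) < 0:
            lo = mid
        else:
            hi = mid
    k = 0.5 * (lo + hi)
    lam = np.mean(x**k) ** (1.0 / k)
    return k, lam

# Hypothetical recurrence intervals drawn from a known Weibull
# (shape 2, scale 3) so the fit can be checked against the truth
rng = np.random.default_rng(2)
intervals = 3.0 * rng.weibull(2.0, size=20000)
shape, scale = fit_weibull(intervals)
```

A fitted shape parameter above 1 indicates quasi-periodic recurrence (hazard increasing with elapsed time), which is the property that distinguishes recurrent behavior at a point on a fault from the Poissonian interoccurrent statistics of a whole region.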
Engineering Lessons Learned and Systems Engineering Applications
NASA Technical Reports Server (NTRS)
Gill, Paul S.; Garcia, Danny; Vaughan, William W.
2005-01-01
Systems Engineering is fundamental to good engineering, which in turn depends on the integration and application of engineering lessons learned. Thus, good Systems Engineering also depends on systems engineering lessons learned from within the aerospace industry being documented and applied. About ten percent of the engineering lessons learned documented in the NASA Lessons Learned Information System are directly related to Systems Engineering. A key issue associated with lessons learned datasets is the communication and incorporation of this information into engineering processes. As part of the NASA Technical Standards Program activities, engineering lessons learned datasets have been identified from a number of sources. These are being searched and screened for those having a relation to Technical Standards. This paper will address some of these Systems Engineering Lessons Learned and how they are being related to Technical Standards within the NASA Technical Standards Program, including linking to the Agency's Interactive Engineering Discipline Training Courses and the life cycle for a flight vehicle development program.
Celebi, M.
2004-01-01
The recorded responses of an Anchorage, Alaska, building during four significant earthquakes that occurred in 2002 are studied. Two earthquakes, including the 3 November 2002 M7.9 Denali fault earthquake, with epicenters approximately 275 km from the building, generated long trains of long-period (>1 s) surface waves. The other two smaller earthquakes occurred at subcrustal depths practically beneath Anchorage and produced higher frequency motions. These two pairs of earthquakes have different impacts on the response of the building. Higher modes are more pronounced in the building response during the smaller nearby events. The building responses indicate that the close-coupling of translational and torsional modes causes a significant beating effect. It is also possible that there is some resonance occurring due to the site frequency being close to the structural frequency. Identification of dynamic characteristics and behavior of buildings can provide important lessons for future earthquake-resistant designs and retrofit of existing buildings. ?? 2004, Earthquake Engineering Research Institute.
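The beating described above is simply the superposition of two modes whose frequencies nearly coincide. A minimal sketch, with frequencies invented for illustration rather than taken from the Anchorage building's records:

```python
import numpy as np

# Two hypothetical modal frequencies (Hz), close enough to beat.
f1, f2 = 0.58, 0.62
t = np.linspace(0.0, 120.0, 12001)

# Equal-amplitude superposition of the two modal responses
x = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)

# The summed response waxes and wanes with beat period 1/|f1 - f2|:
# the closer the two frequencies, the slower the beat.
beat_period = 1.0 / abs(f1 - f2)
peak = np.max(np.abs(x))
print(f"beat period = {beat_period:.0f} s, peak amplitude = {peak:.2f}")
```

The peak amplitude approaches twice that of either mode alone, which is why close coupling of translational and torsional modes can noticeably amplify a building's response.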
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-03
...: Survey of Principal Investigators on Earthquake Engineering Research Awards Made by the National Science... survey of Principal Investigators on NSF earthquake engineering research awards, including but not... NATIONAL SCIENCE FOUNDATION Submission for OMB Review; Comment Request Survey of Principal...
NASA Astrophysics Data System (ADS)
Shanker, D.; Paudyal; Singh, H.
2010-12-01
Effective mitigation requires not only a basic understanding of the earthquake phenomenon and of the resistance offered by designed structures, but also an understanding of socio-economic factors, the engineering properties of indigenous materials, local skills, and technology-transfer models. It is important that the engineering aspects of mitigation be made part of public policy documents. Earthquakes are, and have long been, regarded as among the worst enemies of mankind. Given the nature of the energy release, damage is evident; it will not, however, culminate in a disaster unless it strikes a populated area. Mitigation may be defined as the reduction in severity of something; earthquake disaster mitigation therefore implies measures that help reduce the severity of damage caused by earthquakes to life, property, and the environment. “Earthquake disaster mitigation” usually refers primarily to interventions that strengthen the built environment, whereas “earthquake protection” is now considered to include the human, social, and administrative aspects of reducing earthquake effects. It should be noted that reduction of earthquake hazards through prediction is considered one of the effective measures, and much effort is spent on prediction strategies. Prediction, however, does not guarantee safety: even a correctly predicted earthquake causes damage to life and property on a scale that warrants the other aspects of mitigation. While earthquake prediction may be of some help, mitigation remains the main focus of attention of civil society. The present study suggests that anomalous seismic activity/earthquake swarms existed prior to medium-size earthquakes in the Nepal Himalaya. The mainshocks were preceded by a quiescence period, which is an indication of future seismic activity. In all cases, the identified episodes of anomalous seismic activity were
NASA Astrophysics Data System (ADS)
2002-09-01
Contents include the following: Deep Electromagnetic Images of Seismogenic Zone of the Chi-Chi (Taiwan) Earthquake; New Techniques for Stress-Forecasting Earthquakes; Aspects of Characteristics of Near-Fault Ground Motions of the 1999 Chi-Chi (Taiwan) Earthquake; Liquefaction Damage and Related Remediation in Wufeng after the Chi-Chi Earthquake; Fines Content Effects on Liquefaction Potential Evaluation for Sites Liquefied during Chi-Chi Earthquake 1999; Damage Investigation and Liquefaction Potential Analysis of Gravelly Soil; Dynamic Characteristics of Soils in Yuan-Lin Liquefaction Area; A Preliminary Study of Earthquake Building Damage and Life Loss Due to the Chi-Chi Earthquake; Statistical Analyses of Relation between Mortality and Building Type in the 1999 Chi-Chi Earthquake; Development of an After Earthquake Disaster Shelter Evaluation Model; Posttraumatic Stress Reactions in Children and Adolescents One Year after the 1999 Taiwan Chi-Chi Earthquake; Changes or Not is the Question: the Meaning of Posttraumatic Stress Reactions One Year after the Taiwan Chi-Chi Earthquake.
The Application of Speaker Recognition Techniques in the Detection of Tsunamigenic Earthquakes
NASA Astrophysics Data System (ADS)
Gorbatov, A.; O'Connell, J.; Paliwal, K.
2015-12-01
Tsunami warning procedures adopted by national tsunami warning centres largely rely on the classical approach of earthquake location, magnitude determination, and the consequent modelling of tsunami waves. Although this approach is based on known physical theories of earthquake and tsunami generation processes, it has a main shortcoming: the need to satisfy minimum seismic data requirements to estimate those physical parameters. At least four seismic stations are necessary to locate the earthquake, and a minimum of approximately 10 minutes of seismic waveform observation is needed to reliably estimate the magnitude of a large earthquake similar to the 2004 Indian Ocean Tsunami Earthquake of M9.2. Consequently the total time to tsunami warning could be more than half an hour. In an attempt to reduce the time to tsunami alert, a new approach is proposed based on the classification of tsunamigenic and non-tsunamigenic earthquakes using speaker recognition techniques. A Tsunamigenic Dataset (TGDS) was compiled to promote the development of machine learning techniques for application to seismic trace analysis and, in particular, tsunamigenic event detection, and to compare them to existing seismological methods. The TGDS contains 227 offshore events (87 tsunamigenic and 140 non-tsunamigenic earthquakes with M≥6) from Jan 2000 to Dec 2011, inclusive. A Support Vector Machine classifier using a radial-basis-function kernel was applied to spectral features derived from 400 s frames of 3-component, 1-Hz broadband seismometer data. Ten-fold cross-validation was used during training to choose classifier parameters. Voting was applied to the classifier predictions provided from each station to form an overall prediction for an event. The F1 score (harmonic mean of precision and recall) was chosen to rate each classifier as it provides a compromise between type-I and type-II errors, and due to the imbalance between the representative number of events in the tsunamigenic and non
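The classification pipeline the abstract describes (RBF-kernel SVM, ten-fold cross-validation, F1 scoring) can be sketched with scikit-learn. The feature matrix below is a synthetic stand-in for the TGDS spectral features, matching only the event counts and class imbalance:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Stand-in for the TGDS: 227 events, imbalanced two-class problem
# (~140 "non-tsunamigenic" vs ~87 "tsunamigenic"), synthetic features.
X, y = make_classification(n_samples=227, n_features=20,
                           weights=[140 / 227], random_state=0)

# RBF-kernel SVM; scaling the features first is standard practice.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))

# Ten-fold cross-validation scored with F1, as in the abstract.
scores = cross_val_score(clf, X, y, cv=10, scoring="f1")
print(f"mean F1 = {scores.mean():.2f}")
```

The F1 metric is a sensible choice here precisely because of the class imbalance: plain accuracy would reward always predicting "non-tsunamigenic".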
The Alaska earthquake, March 27, 1964: lessons and conclusions
Eckel, Edwin B.
1970-01-01
One of the greatest earthquakes of all time struck south-central Alaska on March 27, 1964. Strong motion lasted longer than for most recorded earthquakes, and more land surface was dislocated, vertically and horizontally, than by any known previous temblor. Never before were so many effects on earth processes and on the works of man available for study by scientists and engineers over so great an area. The seismic vibrations, which directly or indirectly caused most of the damage, were but surface manifestations of a great geologic event: the dislocation of a huge segment of the crust along a deeply buried fault whose nature and even exact location are still subjects for speculation. Not only was the land surface tilted by the great tectonic event beneath it, with resultant seismic sea waves that traversed the entire Pacific, but an enormous mass of land and sea floor moved several tens of feet horizontally toward the Gulf of Alaska. Downslope mass movements of rock, earth, and snow were initiated. Subaqueous slides along lake shores and seacoasts, near-horizontal movements of mobilized soil (“landspreading”), and giant translatory slides in sensitive clay did the most damage and provided the most new knowledge as to the origin, mechanics, and possible means of control or avoidance of such movements. The slopes of most of the deltas that slid in 1964, and that produced destructive local waves, are still as steep or steeper than they were before the earthquake and hence would be unstable or metastable in the event of another great earthquake. Rockslide avalanches provided new evidence that such masses may travel on cushions of compressed air, but a widely held theory that glaciers surge after an earthquake has not been substantiated. Innumerable ground fissures, many of them marked by copious emissions of water, caused much damage in towns and along transportation routes. Vibration also consolidated loose granular materials. In some coastal areas, local
Seismicity map tools for earthquake studies
NASA Astrophysics Data System (ADS)
Boucouvalas, Anthony; Kaskebes, Athanasios; Tselikas, Nikos
2014-05-01
We report on the development of a new online set of tools for use within Google Maps for earthquake research. We demonstrate this server-based online platform (developed with PHP, JavaScript, MySQL) and its new tools using a database system with earthquake data. The platform allows us to carry out statistical and deterministic analysis of earthquake data using Google Maps and to plot various seismicity graphs. The toolbox has been extended to draw on the map line segments, multiple straight lines horizontally and vertically, as well as multiple circles, including geodesic lines. The application is demonstrated using localized seismic data from the geographic region of Greece as well as other global earthquake data. The application also offers regional segmentation (NxN), which allows the study of earthquake clustering and of earthquake cluster shift between segments in space. The platform offers many filters, such as for plotting selected magnitude ranges or time periods. The plotting facility allows statistically based plots such as cumulative earthquake magnitude plots and earthquake magnitude histograms, calculation of the Gutenberg-Richter b-value, etc. What is novel about the platform is the additional deterministic tools. Using the newly developed horizontal and vertical line and circle tools we have studied the spatial distribution trends of many earthquakes, and we show here for the first time a link between Fibonacci numbers and the spatiotemporal location of some earthquakes. The new tools are valuable for examining and visualizing trends in earthquake research, as they allow calculation of statistics as well as deterministic precursors. We plan to show many new results based on our newly developed platform.
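Among the statistics such a platform computes, the Gutenberg-Richter b-value has a standard closed form: the Aki maximum-likelihood estimate b = log10(e) / (mean(M) - Mc). A sketch on synthetic magnitudes (the completeness magnitude and catalogue are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
mc = 2.5       # assumed completeness magnitude of the catalogue
true_b = 1.0   # b-value used to generate the synthetic magnitudes

# Gutenberg-Richter magnitudes above Mc are exponentially distributed
# with rate b*ln(10); draw a synthetic catalogue accordingly.
mags = mc + rng.exponential(1.0 / (true_b * np.log(10)), size=5000)

# Aki (1965) maximum-likelihood estimate of the b-value
b = np.log10(np.e) / (mags.mean() - mc)
print(f"b ≈ {b:.2f}")
```

A refinement used in practice subtracts half the magnitude binning width from Mc; the unbinned form above keeps the sketch minimal.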
Practical Applications for Earthquake Scenarios Using ShakeMap
NASA Astrophysics Data System (ADS)
Wald, D. J.; Worden, B.; Quitoriano, V.; Goltz, J.
2001-12-01
In planning and coordinating emergency response, utilities, local government, and other organizations are best served by conducting training exercises based on realistic earthquake situations: ones that they are most likely to face. Scenario earthquakes can fill this role; they can be generated for any geologically plausible earthquake or for actual historic earthquakes. ShakeMap Web pages now display selected earthquake scenarios (www.trinet.org/shake/archive/scenario/html) and more events will be added as they are requested and produced. We will discuss the methodology and provide practical examples where these scenarios are used directly for risk reduction. Given a selected event, we have developed tools to make it relatively easy to generate a ShakeMap earthquake scenario using the following steps: 1) Assume a particular fault or fault segment will (or did) rupture over a certain length; 2) Determine the magnitude of the earthquake based on assumed rupture dimensions; 3) Estimate the ground shaking at all locations in the chosen area around the fault; and 4) Represent these motions visually by producing ShakeMaps and generating ground motion input for loss estimation modeling (e.g., FEMA's HAZUS). At present, ground motions are estimated using empirical attenuation relationships to estimate peak ground motions on rock conditions. We then correct the amplitude at each location based on the local site soil (NEHRP) conditions, as we do in the general ShakeMap interpolation scheme. Finiteness is included explicitly, but directivity enters only through the empirical relations. Although current ShakeMap earthquake scenarios are empirically based, substantial improvements in numerical ground motion modeling have been made in recent years. However, loss estimation tools, HAZUS for example, typically require relatively high frequency (3 Hz) input for predicting losses, above the range of frequencies successfully modeled to date. Achieving full-synthetic ground motion
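Steps 3 and 4 above (rock-motion estimate from an empirical attenuation relationship, then a site correction) can be sketched as a toy calculation. The attenuation coefficients and NEHRP amplification factors below are invented placeholders, not any published GMPE or ShakeMap values:

```python
import math

def pga_rock(magnitude, r_km):
    # Toy attenuation of the form log10(PGA) = a + b*M - c*log10(R + R0).
    # Coefficients are placeholders chosen only to give plausible numbers.
    a, b, c = -1.0, 0.3, 1.3
    return 10 ** (a + b * magnitude - c * math.log10(r_km + 10.0))

def site_amplified(pga, nehrp_class):
    # Crude NEHRP site amplification factors (illustrative values only)
    amp = {"B": 1.0, "C": 1.3, "D": 1.6}
    return pga * amp[nehrp_class]

# Rock-condition estimate at 20 km from a hypothetical M7 rupture,
# then corrected for soft-soil (class D) site conditions.
pga = pga_rock(7.0, 20.0)
print(f"rock PGA ~ {pga:.3f} g, class-D site ~ {site_amplified(pga, 'D'):.3f} g")
```

A real scenario would evaluate this on a grid around the fault trace, with distance measured to the rupture plane rather than a point source.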
Gori, Paula L.
1993-01-01
INTERACTIVE WORKSHOPS: ESSENTIAL ELEMENTS OF THE EARTHQUAKE HAZARDS RESEARCH AND REDUCTION PROGRAM IN THE WASATCH FRONT, UTAH: Interactive workshops provided the forum and stimulus necessary to foster collaboration among the participants in the multidisciplinary, 5-yr program of earthquake hazards reduction in the Wasatch Front, Utah. The workshop process validated well-documented social science theories on the importance of interpersonal interaction, including interaction between researchers and users of research to increase the probability that research will be relevant to the user's needs and, therefore, more readily used. REDUCING EARTHQUAKE HAZARDS IN UTAH: THE CRUCIAL CONNECTION BETWEEN RESEARCHERS AND PRACTITIONERS: Complex scientific and engineering studies must be translated for and transferred to nontechnical personnel for use in reducing earthquake hazards in Utah. The three elements needed for effective translation, likelihood of occurrence, location, and severity of potential hazards, and the three elements needed for effective transfer, delivery, assistance, and encouragement, are described and illustrated for Utah. The importance of evaluating and revising earthquake hazard reduction programs and their components is emphasized. More than 30 evaluations of various natural hazard reduction programs and techniques are introduced. This report was prepared for research managers, funding sources, and evaluators of the Utah earthquake hazard reduction program who are concerned about effectiveness. An overview of the Utah program is provided for those researchers, engineers, planners, and decisionmakers, both public and private, who are committed to reducing human casualties, property damage, and interruptions of socioeconomic systems. PUBLIC PERCEPTIONS OF THE IMPLEMENTATION OF EARTHQUAKE MITIGATION POLICIES ALONG THE WASATCH FRONT IN UTAH: The earthquake hazard potential along the Wasatch Front in Utah has been well defined by a number of scientific and
Thermal IR satellite data application for earthquake research in Pakistan
NASA Astrophysics Data System (ADS)
Barkat, Adnan; Ali, Aamir; Rehman, Khaista; Awais, Muhammad; Riaz, Muhammad Shahid; Iqbal, Talat
2018-05-01
The scientific progress in space research indicates earthquake-related processes of surface temperature growth, gas/aerosol exhalation, and electromagnetic disturbances in the ionosphere prior to seismic activity. Among them, surface temperature growth calculated using satellite thermal infrared images carries valuable earthquake precursory information for near/distant earthquakes. Previous studies have concluded that such information can appear a few days before the occurrence of an earthquake. The objective of this study is to use MODIS thermal imagery data for precursory analysis of the Kashmir (Oct 8, 2005; Mw 7.6; 26 km), Ziarat (Oct 28, 2008; Mw 6.4; 13 km) and Dalbandin (Jan 18, 2011; Mw 7.2; 69 km) earthquakes. Our results suggest that there exists an evident correlation of Land Surface Temperature (LST) anomalies with seismic activity. In particular, a rise of 3-10 °C in LST is observed 6, 4 and 14 days prior to the Kashmir, Ziarat and Dalbandin earthquakes, respectively. In order to further elaborate our findings, we have presented a comparative and percentile analysis of daily and five-year-averaged LST for a selected time window with respect to the month of earthquake occurrence. Our comparative analyses of daily and five-year-averaged LST show a significant change of 6.5-7.9 °C for the Kashmir, 8.0-8.1 °C for the Ziarat and 2.7-5.4 °C for the Dalbandin earthquakes. This significant change has high percentile values for the selected events, i.e. 70-100% for Kashmir, 87-100% for Ziarat and 84-100% for Dalbandin. We expect that such consistent results may help in devising an optimal earthquake forecasting strategy and in mitigating the effect of associated seismic hazards.
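The comparative and percentile analysis described above (daily LST versus a multi-year average for the same calendar window) reduces to a simple array computation. The values below are synthetic, with an anomaly injected by hand purely to illustrate the method:

```python
import numpy as np

rng = np.random.default_rng(7)
# Hypothetical daily LST (deg C) for the same 30-day calendar window in
# five background years and in the earthquake year -- synthetic values.
background = 20 + rng.standard_normal((5, 30))   # 5 years x 30 days
event_year = 20 + rng.standard_normal(30)
event_year[16] += 8.0                            # injected pre-event anomaly

baseline = background.mean(axis=0)   # five-year average per calendar day
anomaly = event_year - baseline      # departure from that average

# Percentile of each event-year day against the background years
pct = (event_year[None, :] > background).mean(axis=0) * 100
print(f"largest anomaly: {anomaly.max():.1f} deg C on day {anomaly.argmax()}")
```

A day standing at the 100th percentile of its multi-year background is the kind of signal the study flags; attributing it to an impending earthquake rather than weather is the hard part, which the synthetic example deliberately sidesteps.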
Rapid Earthquake Magnitude Estimation for Early Warning Applications
NASA Astrophysics Data System (ADS)
Goldberg, Dara; Bock, Yehuda; Melgar, Diego
2017-04-01
Earthquake magnitude is a concise metric that provides invaluable information about the destructive potential of a seismic event. Rapid estimation of magnitude for earthquake and tsunami early warning purposes requires reliance on near-field instrumentation. For large magnitude events, ground motions can exceed the dynamic range of near-field broadband seismic instrumentation (clipping). Strong motion accelerometers are designed with low gains to better capture strong shaking. Estimating earthquake magnitude rapidly from near-source strong-motion data requires integration of acceleration waveforms to displacement. However, integration amplifies small errors, creating unphysical drift that must be eliminated with a high pass filter. The loss of the long period information due to filtering is an impediment to magnitude estimation in real-time; the relation between ground motion measured with strong-motion instrumentation and magnitude saturates, leading to underestimation of earthquake magnitude. Using station displacements from Global Navigation Satellite System (GNSS) observations, we can supplement the high frequency information recorded by traditional seismic systems with long-period observations to better inform rapid response. Unlike seismic-only instrumentation, ground motions measured with GNSS scale with magnitude without saturation [Crowell et al., 2013; Melgar et al., 2015]. We refine the current magnitude scaling relations using peak ground displacement (PGD) by adding a large GNSS dataset of earthquakes in Japan. Because it does not suffer from saturation, GNSS alone has significant advantages over seismic-only instrumentation for rapid magnitude estimation of large events. The earthquake's magnitude can be estimated within 2-3 minutes of earthquake onset time [Melgar et al., 2013]. We demonstrate that seismogeodesy, the optimal combination of GNSS and seismic data at collocated stations, provides the added benefit of improving the sensitivity of
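The PGD scaling relation referred to above (Crowell et al., 2013) has the functional form log10(PGD) = A + B·Mw + C·Mw·log10(R), which can be inverted for magnitude once PGD and hypocentral distance are known. The coefficients below are illustrative placeholders of roughly the right order, not the published values:

```python
import math

# Placeholder coefficients for log10(PGD_cm) = A + B*Mw + C*Mw*log10(R_km);
# the functional form follows Crowell et al. (2013), the numbers do not.
A, B, C = -4.434, 1.047, -0.138

def magnitude_from_pgd(pgd_cm, r_km):
    """Invert the PGD scaling law for moment magnitude."""
    return (math.log10(pgd_cm) - A) / (B + C * math.log10(r_km))

# Hypothetical GNSS observation: 50 cm peak displacement at 100 km
mw = magnitude_from_pgd(pgd_cm=50.0, r_km=100.0)
print(f"Mw ≈ {mw:.1f}")
```

Because GNSS displacements do not saturate, this single-station estimate already distinguishes a great earthquake from a large one, which is exactly what clipped or filtered seismic data struggle to do in real time.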
Imaging Strategies for Tissue Engineering Applications
Nam, Seung Yun; Ricles, Laura M.; Suggs, Laura J.
2015-01-01
Tissue engineering has evolved with multifaceted research being conducted using advanced technologies, and it is progressing toward clinical applications. As tissue engineering technology significantly advances, it proceeds toward increasing sophistication, including nanoscale strategies for material construction and synergetic methods for combining with cells, growth factors, or other macromolecules. Therefore, to assess advanced tissue-engineered constructs, tissue engineers need versatile imaging methods capable of monitoring not only morphological but also functional and molecular information. However, there is no single imaging modality that is suitable for all tissue-engineered constructs. Each imaging method has its own range of applications and provides information based on the specific properties of the imaging technique. Therefore, according to the requirements of the tissue engineering studies, the most appropriate tool should be selected among a variety of imaging modalities. The goal of this review article is to describe available biomedical imaging methods to assess tissue engineering applications and to provide tissue engineers with criteria and insights for determining the best imaging strategies. Commonly used biomedical imaging modalities, including X-ray and computed tomography, positron emission tomography and single photon emission computed tomography, magnetic resonance imaging, ultrasound imaging, optical imaging, and emerging techniques and multimodal imaging, will be discussed, focusing on the latest trends of their applications in recent tissue engineering studies. PMID:25012069
NASA Astrophysics Data System (ADS)
Weatherill, G. A.; Pagani, M.; Garcia, J.
2016-09-01
The creation of a magnitude-homogenized catalogue is often one of the most fundamental steps in seismic hazard analysis. The process of homogenizing multiple catalogues of earthquakes into a single unified catalogue typically requires careful appraisal of available bulletins, identification of common events within multiple bulletins and the development and application of empirical models to convert from each catalogue's native scale into the required target. The database of the International Seismological Center (ISC) provides the most exhaustive compilation of records from local bulletins, in addition to its reviewed global bulletin. New open-source tools are developed that can utilize this, or any other compiled database, to explore the relations between earthquake solutions provided by different recording networks, and to build and apply empirical models in order to harmonize magnitude scales for the purpose of creating magnitude-homogeneous earthquake catalogues. These tools are described and their application illustrated in two different contexts. The first is a simple application in the Sub-Saharan Africa region where the spatial coverage and magnitude scales for different local recording networks are compared, and their relation to global magnitude scales explored. In the second application the tools are used on a global scale for the purpose of creating an extended magnitude-homogeneous global earthquake catalogue. Several existing high-quality earthquake databases, such as the ISC-GEM and the ISC Reviewed Bulletins, are harmonized into moment magnitude to form a catalogue of more than 562 840 events. This extended catalogue, while not an appropriate substitute for a locally calibrated analysis, can help in studying global patterns in seismicity and hazard, and is therefore released with the accompanying software.
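The empirical conversion step (regressing one magnitude scale against another for common events, then mapping the whole catalogue into the target scale) might look like the sketch below. The Ms-Mw pairs are synthetic, and a plain least-squares fit stands in for the more careful regression an actual homogenization would use:

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic pairs of surface-wave magnitude Ms and moment magnitude Mw
# for events common to two bulletins (illustrative data, assumed relation).
ms = rng.uniform(4.0, 8.0, size=300)
mw = 0.67 * ms + 2.07 + 0.2 * rng.standard_normal(300)

# Fit an empirical conversion Mw = a*Ms + b, then apply it to every
# Ms-only event to place the catalogue on a single target scale.
a, b = np.polyfit(ms, mw, 1)
homogenized = a * ms + b
print(f"Mw ≈ {a:.2f}·Ms + {b:.2f}")
```

Ordinary least squares assumes all the scatter sits in Mw; since both scales carry measurement error, general orthogonal regression is often preferred for real bulletin-to-bulletin conversions.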
Earthquake alarm; operating the seismograph station at the University of California, Berkeley.
Stump, B.
1980-01-01
At the University of California seismographic stations, the task of locating and determining magnitudes for both local and distant earthquakes is a continuous one. Teleseisms must be located rapidly so that events that occur in the Pacific can be identified and the Pacific Tsunami Warning System alerted. For great earthquakes anywhere, there is a responsibility to notify public agencies such as the California Office of Emergency Services, the Federal Disaster Assistance Administration, the Earthquake Engineering Research Institute, the California Seismic Safety Commission, and the American Red Cross. In the case of damaging local earthquakes, it is necessary to alert also the California Department of Water Resources, California Division of Mines and Geology, U.S Army Corps of Engineers, Federal Bureau of Reclamation, and the Bay Area Rapid Transit. These days, any earthquakes that are felt in northern California cause immediate inquiries from the news media and an interested public. The series of earthquakes that jolted the Livermore area from January 24 to 26 1980, is a good case in point.
NASA Astrophysics Data System (ADS)
Bejar, M.; Alvarez Gomez, J. A.; Staller, A.; Luna, M. P.; Perez Lopez, R.; Monserrat, O.; Chunga, K.; Herrera, G.; Jordá, L.; Lima, A.; Martínez-Díaz, J. J.
2017-12-01
It has long been recognized that earthquakes change the stress in the upper crust around the fault rupture and can influence the short-term behaviour of neighbouring faults and volcanoes. Rapid estimates of these stress changes can provide the authorities managing the post-disaster situation with a useful tool to identify and monitor potential threats and to update the estimates of seismic and volcanic hazard in a region. Space geodesy is now routinely used following an earthquake to image the displacement of the ground and estimate the rupture geometry and the distribution of slip. Using the obtained source model, it is possible to evaluate the remaining moment deficit and to infer the stress changes on nearby faults and volcanoes produced by the earthquake, which can be used to identify which faults and volcanoes are brought closer to failure or activation. Although these procedures are commonly used today, the transfer of these results to the authorities managing the post-disaster situation is not straightforward, and thus their usefulness is reduced in practice. Here we propose a methodology to evaluate the potential influence of an earthquake on nearby faults and volcanoes and to create easy-to-understand maps for decision-making support after an earthquake. We apply this methodology to the Mw 7.8, 2016 Ecuador earthquake. Using Sentinel-1 SAR and continuous GPS data, we measure the coseismic ground deformation and estimate the distribution of slip. Then we use this model to evaluate the moment deficit on the subduction interface and the changes of stress on the surrounding faults and volcanoes. The results are compared with the seismic and volcanic events that have occurred after the earthquake. We discuss the potential and limits of the methodology and the lessons learnt from discussion with local authorities.
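The static stress-transfer reasoning above is usually summarized by the Coulomb failure stress change, ΔCFS = Δτ + μ′·Δσn (shear-stress change resolved in the slip direction, plus effective friction times the normal-stress change, with unclamping taken positive). A minimal sketch with hypothetical numbers:

```python
def coulomb_stress_change(d_shear, d_normal, friction=0.4):
    """Static Coulomb failure stress change on a receiver fault (MPa).

    d_shear  : shear-stress change resolved in the slip direction
    d_normal : normal-stress change (unclamping positive)
    friction : effective friction coefficient (0.4 is a common assumption)
    """
    return d_shear + friction * d_normal

# Hypothetical stress changes resolved on a neighbouring fault plane
dcfs = coulomb_stress_change(d_shear=0.15, d_normal=-0.05)
print(f"dCFS = {dcfs:+.2f} MPa")  # positive => brought closer to failure
```

The hard work in practice is computing d_shear and d_normal from the slip model and resolving them onto each receiver fault's geometry; the sign convention and friction value here are common assumptions, not values from the Ecuador study.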
Yehle, Lynn A.
1974-01-01
A program to study the engineering geology of most of the larger Alaska coastal communities and to evaluate their earthquake and other geologic hazards was started following the 1964 Alaska earthquake; this report about Sitka and vicinity is a product of that program. Field-study methods were of a reconnaissance nature, and thus the interpretations in the report are subject to revision as further information becomes available. This report can provide broad geologic guidelines for planners and engineers during preparation of land-use plans. The use of this information should lead to minimizing future loss of life and property due to geologic hazards, especially during very large earthquakes. The landscape of Sitka and the surrounding area is characterized by numerous islands and a narrow strip of gently rolling ground adjacent to rugged mountains; steep valleys and some fiords cut sharply into the mountains. A few valley floors are wide and flat and grade into moderate-sized deltas. Glaciers throughout southeastern Alaska and elsewhere became vastly enlarged during the Pleistocene Epoch. The Sitka area presumably was covered by ice several times; glaciers deeply eroded some valleys and removed fractured bedrock along some faults. The last major deglaciation occurred sometime before 10,000 years ago. Crustal rebound believed to be related to glacial melting caused land emergence at Sitka of at least 35 feet (10.7 m) relative to present sea level. Bedrock at Sitka and vicinity is composed mostly of bedded, hard, dense graywacke and some argillite. Beds strike predominantly northwest and are vertical or steeply dipping. Locally, bedded rocks are cut by dikes of fine-grained igneous rock. Most bedrock is of Jurassic and Cretaceous age. Eight types of surficial deposits of Quaternary age were recognized. Below altitudes of 35 feet (10.7 m), the dominant deposits are those of modern and elevated shores and deltas; at higher altitudes, widespread muskeg overlies a mantle of
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hafen, D.; Kintzer, F.C.
1977-11-01
The correlation between ground motion and building damage was investigated for the San Fernando earthquake of 1971. A series of iso-intensity maps was compiled to summarize the ground motion in terms of the Blume Engineering Intensity Scale (EIS). This involved the analysis of ground motion records from 62 stations in the Los Angeles area. Damage information for low-rise buildings was obtained in the form of records of loans granted by the Small Business Administration to repair earthquake damage. High-rise damage evaluations were based on direct inquiry and building inspection. Damage factors (ratio of damage repair cost to building value) were calculated and summarized on contour maps. A statistical study was then undertaken to determine relationships between ground motion and damage factor. Several parameters for ground motion were considered and evaluated by means of correlation coefficients.
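The final statistical step (evaluating a ground-motion parameter against the damage factor by means of a correlation coefficient) reduces to a Pearson correlation. The station values below are synthetic, invented to mimic the study's setup rather than taken from it:

```python
import numpy as np

rng = np.random.default_rng(5)
# Hypothetical station data: an engineering-intensity measure versus the
# damage factor (repair cost / building value). 62 stations, as in the
# study, but the numbers themselves are synthetic.
intensity = rng.uniform(4.0, 8.0, size=62)
damage = 0.02 * (intensity - 4.0) ** 2 + 0.01 * rng.standard_normal(62)
damage = np.clip(damage, 0.0, 1.0)   # a damage factor lies in [0, 1]

# Pearson correlation coefficient between ground motion and damage factor
r = np.corrcoef(intensity, damage)[0, 1]
print(f"r = {r:.2f}")
```

A high r supports using the ground-motion parameter as a damage predictor; comparing r across several candidate parameters is how the study's evaluation would proceed.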
Ceramic applications in turbine engines
NASA Technical Reports Server (NTRS)
Helms, H. E.; Heitman, P. W.; Lindgren, L. C.; Thrasher, S. R.
1984-01-01
The application of ceramic components to demonstrate improved cycle efficiency by raising the operating temperature of the existing Allison IGT 404 vehicular gas turbine engine is discussed. This effort was called the Ceramic Applications in Turbine Engines (CATE) program and has successfully demonstrated ceramic components. Among these components are two design configurations featuring stationary and rotating ceramic components in the IGT 404 engine. A complete discussion of all phases of the program (design, materials development, fabrication of ceramic components, and testing, including rig, engine, and vehicle demonstration tests) is presented. During the CATE program, a ceramic technology base was established that is now being applied to automotive and other gas turbine engine programs. This technology base is outlined, along with a description of the CATE program accomplishments.
RICHTER: A Smartphone Application for Rapid Collection of Geo-Tagged Pictures of Earthquake Damage
NASA Astrophysics Data System (ADS)
Skinnemoen, H.; Bossu, R.; Furuheim, K.; Bjorgo, E.
2010-12-01
RICHTER (Rapid geo-Images for Collaborative Help Targeting Earthquake Response) is a smartphone version of a professional application developed to provide high-quality geo-tagged image communication over challenging network links, such as satellites and poor mobile links. Developed for Android mobile phones, it allows eyewitnesses to share their pictures of earthquake damage easily and without cost with the Euro-Mediterranean Seismological Centre (EMSC). The goal is to engage citizens in the collection of the most up-to-date visual information on local damage for improved rapid impact assessment. RICHTER integrates the innovative and award-winning ASIGN protocol initially developed for satellite communication between cameras / computers / satcom terminals and servers at HQ. ASIGN is a robust and optimal image and video communication management solution for bandwidth-limited communication networks which was developed for use particularly in emergency and disaster situations. Contrary to a simple Multimedia Messaging System (MMS), RICHTER allows access to high-definition images with embedded location information. Location is automatically assigned from either the internal GPS, derived from the mobile network (triangulation) or the current Wi-Fi domain, in that order, as this corresponds to the expected positioning accuracy. Pictures are typically compressed to 20-30 KB of data for fast transfer and to avoid network overload. Full-size images can be requested by the EMSC either fully automatically, or on a case-by-case basis, depending on the user preferences. ASIGN was initially developed in coordination with INMARSAT and the European Space Agency. It was used by the Rapid Mapping Unit of the United Nations notably for the damage assessment of the January 12, 2010 Haiti earthquake where more than 700 photos were collected. RICHTER will be freely distributed on the EMSC website to eyewitnesses in the event of significantly damaging earthquakes. The EMSC is the second
Engineering Lessons Learned and Systems Engineering Applications
NASA Technical Reports Server (NTRS)
Gill, Paul S.; Garcia, Danny; Vaughan, William W.
2005-01-01
Systems Engineering is fundamental to good engineering, which in turn depends on the integration and application of engineering lessons learned and technical standards. Thus, good Systems Engineering also depends on systems engineering lessons learned from within the aerospace industry being documented and applied. About ten percent of the engineering lessons learned documented in the NASA Lessons Learned Information System are directly related to Systems Engineering. A key issue associated with lessons learned datasets is the communication and incorporation of this information into engineering processes. Systems Engineering has been defined (EIA/IS-632) as "an interdisciplinary approach encompassing the entire technical effort to evolve and verify an integrated and life-cycle balanced set of system people, product, and process solutions that satisfy customer needs". Designing reliable space-based systems has always been a goal for NASA, and many painful lessons have been learned along the way. One of the continuing functions of a system engineer is to compile development and operations "lessons learned" documents and ensure their integration into future systems development activities. Lessons learned files from previous projects can produce insights and information for risk identification and characterization, and they are especially valuable when starting a new project.
1987-09-01
Geological Survey, MS977, Menlo Park, CA 94025, USA. , TURKISH NATIONAL COMMITTEE FOR EARTHQUAKE ENGINEERING, THIRTEENTH REGIONAL SEMINAR ON EARTHQUAKE...in this case the conditional probability P(E/F1) will also depend in general on t. A simple example of a case of this type was developed by the present...These studies took into consideration all the available data concerning the dynamic characteristics of different types of buildings. A first attempt was
Application of Second-Moment Source Analysis to Three Problems in Earthquake Forecasting
NASA Astrophysics Data System (ADS)
Donovan, J.; Jordan, T. H.
2011-12-01
probabilities? The FMT representation allows us to generalize the models typically used for this purpose (e.g., marked point process models, such as ETAS), which will again be necessary in operational earthquake forecasting. To quantify aftershock probabilities, we compare mainshock FMTs with the first and second spatial moments of weighted aftershock hypocenters. We will describe applications of these results to the Uniform California Earthquake Rupture Forecast, version 3, which is now under development by the Working Group on California Earthquake Probabilities.
Application and analysis of debris-flow early warning system in Wenchuan earthquake-affected area
NASA Astrophysics Data System (ADS)
Liu, D. L.; Zhang, S. J.; Yang, H. J.; Zhao, L. Q.; Jiang, Y. H.; Tang, D.; Leng, X. P.
2016-02-01
The activity of debris flows (DFs) in the Wenchuan earthquake-affected area increased significantly after the earthquake of 12 May 2008, threatening the lives and property of local people. A physics-based early warning system (EWS) for DF forecasting was developed and applied in this earthquake area. This paper introduces an application of the system in the Wenchuan earthquake-affected area and analyzes the prediction results by comparison with the DF events triggered by strong rainfall events, as reported by the local government. The prediction accuracy and efficiency were first compared with those of a contribution-factor-based system currently used by the weather bureau of Sichuan province, using the storm of 17 August 2012 as a case study. The comparison shows that the false negative rate and the false positive rate of the new system are, respectively, 19% and 21% lower than those of the contribution-factor-based system; the prediction accuracy is therefore markedly higher, and the operational efficiency is also higher. At the invitation of the weather bureau of Sichuan province, the authors upgraded their DF prediction system to the new system before the 2013 monsoon season in the Wenchuan earthquake-affected area. Two prediction cases, on 9 July 2013 and 10 July 2014, were chosen to further demonstrate that the new EWS has high stability, efficiency, and prediction accuracy.
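The false negative and false positive rates quoted above come from a standard contingency-table comparison of warnings against observed events. A minimal sketch of that bookkeeping is below; the function and the counts are illustrative, not the paper's data.

```python
# Hedged sketch: false-negative and false-positive rates for an early
# warning system, computed from a hypothetical verification table.

def warning_error_rates(hits, misses, false_alarms, correct_negatives):
    """Return (false_negative_rate, false_positive_rate) as fractions."""
    fnr = misses / (hits + misses)                           # missed events / all observed events
    fpr = false_alarms / (false_alarms + correct_negatives)  # false alarms / all non-events
    return fnr, fpr

# Illustrative counts for one storm: 20 gullies produced debris flows,
# 80 did not; the system warned on 17 of the 20 and on 12 of the 80.
fnr, fpr = warning_error_rates(hits=17, misses=3, false_alarms=12, correct_negatives=68)
print(f"FNR = {fnr:.0%}, FPR = {fpr:.0%}")
```

Lowering both rates simultaneously, as the new system reportedly does relative to the contribution-factor approach, is the relevant benchmark, since either rate alone can be driven to zero trivially (by always or never warning).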
Earthquake Hazard and the Environmental Seismic Intensity (ESI) Scale
NASA Astrophysics Data System (ADS)
Serva, Leonello; Vittori, Eutizio; Comerci, Valerio; Esposito, Eliana; Guerrieri, Luca; Michetti, Alessandro Maria; Mohammadioun, Bagher; Mohammadioun, Georgianna C.; Porfido, Sabina; Tatevossian, Ruben E.
2016-05-01
The main objective of this paper is to introduce the Environmental Seismic Intensity scale (ESI), a new scale developed and tested by an interdisciplinary group of scientists (geologists, geophysicists and seismologists) in the frame of the International Union for Quaternary Research (INQUA) activities, to the widest community of earth scientists and engineers dealing with seismic hazard assessment. This scale defines earthquake intensity by taking into consideration the occurrence, size and areal distribution of earthquake environmental effects (EEE), including surface faulting, tectonic uplift and subsidence, landslides, rock falls, liquefaction, ground collapse and tsunami waves. Indeed, EEEs can significantly improve the evaluation of seismic intensity, which still remains a critical parameter for a realistic seismic hazard assessment, allowing comparison of historical and modern earthquakes. Moreover, as shown by recent moderate to large earthquakes, geological effects often cause severe damage; therefore, their consideration in the earthquake risk scenario is crucial for all stakeholders, especially urban planners, geotechnical and structural engineers, hazard analysts, civil protection agencies and insurance companies. The paper describes the background and construction principles of the scale and presents some case studies from different continents and tectonic settings to illustrate its relevant benefits. ESI is normally used together with traditional intensity scales, which, unfortunately, tend to saturate in the highest degrees. In such cases, and in unpopulated areas, ESI offers a unique way of assessing a reliable earthquake intensity. Last but not least, the ESI scale also provides a very convenient guideline for the survey of EEEs in earthquake-stricken areas, ensuring they are catalogued in a complete and homogeneous manner.
Earthquake Prediction in a Big Data World
NASA Astrophysics Data System (ADS)
Kossobokov, V. G.
2016-12-01
The digital revolution, which started just about 15 years ago, has already pushed global information storage capacity beyond 5,000 exabytes (in optimally compressed bytes) per year. Open data in a Big Data World provide unprecedented opportunities for enhancing studies of the Earth System. However, they also open wide avenues for deceptive associations in inter- and transdisciplinary data and for misleading predictions based on so-called "precursors". Earthquake prediction is not an easy task; it implies a delicate application of statistics. So far, none of the proposed short-term precursory signals has shown sufficient evidence to be used as a reliable precursor of catastrophic earthquakes. Regretfully, in many cases of seismic hazard assessment (SHA), from term-less to time-dependent (probabilistic PSHA or deterministic DSHA), and of short-term earthquake forecasting (StEF), claims of a method's high potential are based on a flawed application of statistics and are therefore hardly suitable for communication to decision makers. Self-testing must be done before claiming prediction of hazardous areas and/or times. The necessity and possibility of applying simple tools of earthquake prediction strategies, in particular the error diagram, introduced by G.M. Molchan in the early 1990s, and the Seismic Roulette null hypothesis as a metric of the alerted space, are evident. The set of errors, i.e., the rates of failure and of the alerted space-time volume, can easily be compared to random guessing, a comparison that permits evaluating the effectiveness of an SHA method and determining the optimal choice of parameters with regard to a given cost-benefit function. This and other information obtained in such simple testing may supply us with realistic estimates of the confidence and accuracy of SHA predictions and, if they prove reliable though not necessarily perfect, with related recommendations on the level of risks for decision making in regard to engineering design, insurance
NASA Astrophysics Data System (ADS)
Kázmér, Miklós; Major, Balázs; Hariyadi, Agus; Pramumijoyo, Subagyo; Ditto Haryana, Yohanes
2010-05-01
Earthquakes are among the most horrible events of nature due to their unexpected occurrence, against which no spiritual means offer protection. The only way of preserving life and property is applying earthquake-resistant construction methods. Ancient Greek architects of public buildings applied steel clamps embedded in lead casing to hold together columns and masonry walls during frequent earthquakes in the Aegean region. Elastic steel provided strength, while plastic lead casing absorbed minor shifts of blocks without fracturing rigid stone. Romans invented concrete and built buildings of all sizes as a single, inflexible unit. Masonry surrounding and decorating the concrete core of the wall did not bear load. Concrete resisted minor shaking, yielding only to forces higher than fracture limits. Roman building traditions survived the Dark Ages, and 12th-century Crusader castles erected in earthquake-prone Syria survive until today in reasonably good condition. Concrete and steel clamping persisted side by side in the Roman Empire. Concrete was used for cheap construction as compared to building in masonry. Applying lead-encased steel increased costs, and was avoided whenever possible. Columns of the various forums in Italian Pompeii mostly lack steel fittings despite being situated in a well-known earthquake-prone area. Whether the frequent recurrence of earthquakes in the Naples region was known to inhabitants of Pompeii may be a matter of debate. Seemingly the shock of the AD 62 earthquake was not enough to prompt application of well-known protective engineering methods throughout the reconstruction of the city before the AD 79 volcanic catastrophe. An independent engineering tradition developed on the island of Java (Indonesia). The mortar-less construction technique of 8th-9th-century Hindu masonry shrines around Yogyakarta would allow scattering of blocks during earthquakes. To prevent dilapidation, an intricate mortise-and-tenon system was carved into adjacent faces of blocks. Only the
NASA Astrophysics Data System (ADS)
Yang, Kun; Xu, Quan-li; Peng, Shuang-yun; Cao, Yan-bo
2008-10-01
Based on a necessity analysis of GIS applications in earthquake disaster prevention, this paper discusses in depth the spatial integration of urban earthquake disaster loss evaluation models and visualization technologies, using network development methods such as COM/DCOM, ActiveX and ASP, as well as spatial database development methods such as OO4O and ArcSDE based on the ArcGIS software package. Meanwhile, following software engineering principles, a solution for a GIS-based Urban Earthquake Emergency Response Decision Support System is also proposed, covering the system's logical structure, technical routes, realization methods and functional structure. Finally, the test system's user interfaces are also presented.
The Extended Concept Of Symmetropy And Its Application To Earthquakes And Acoustic Emissions
NASA Astrophysics Data System (ADS)
Nanjo, K.; Yodogawa, E.
2003-12-01
Symmetropy is a quantitative, entropy-based measure of the heterogeneity of a pattern with respect to symmetry; it can be regarded as a quantitative measure of a pattern's asymmetry (Yodogawa, 1982; Nanjo et al., 2000, 2001, 2002 in press). In previous studies, symmetropy was estimated for the spatial distributions of acoustic emissions generated before the ultimate whole fracture of a rock specimen in laboratory experiments, and for the spatial distributions of earthquakes in a seismic source model with self-organized criticality (SOC). In each of these estimations, the outline of the region in which symmetropy is estimated for a pattern is taken to be that of the rock specimen in which the acoustic emissions are generated, or that of the SOC seismic source model from which the earthquakes emerge. When local seismicity such as aftershocks, foreshocks and earthquake swarms in the Earth's crust is considered, it is difficult to determine the outline of the region characterizing that seismicity without subjectivity. The original concept of symmetropy therefore cannot be applied directly to such local seismicity, and a proper modification is needed. Here, we introduce the notion of symmetropy to the nonlinear geosciences and extend it for application to local seismicity such as aftershocks, foreshocks and earthquake swarms. We apply the extended concept to the spatial distributions of acoustic emissions generated in a previous laboratory experiment in which the failure process in a brittle granite sample was stabilized by controlling axial stress to maintain a constant rate of acoustic emissions, giving a detailed view of fracture nucleation and growth. Moreover, it is applied to the temporal variations of spatial distributions of aftershocks and foreshocks of the main shocks
Earthquake Risk Mitigation in the Tokyo Metropolitan area
NASA Astrophysics Data System (ADS)
Hirata, N.; Sakai, S.; Kasahara, K.; Nakagawa, S.; Nanjo, K.; Panayotopoulos, Y.; Tsuruoka, H.
2010-12-01
Seismic disaster risk mitigation in urban areas constitutes a challenge requiring collaboration across scientific, engineering, and social-science fields. Examples of collaborative efforts include research on detailed plate structure with identification of all significant faults; development of dense seismic networks; strong ground motion prediction, which uses information on near-surface seismic site effects and fault models; earthquake-resistant and earthquake-proof structures; and cross-discipline infrastructure for effective risk mitigation immediately after catastrophic events. The risk mitigation strategy for the next great earthquake caused by the Philippine Sea plate (PSP) subducting beneath the Tokyo metropolitan area is of major concern, because this plate caused past mega-thrust earthquakes such as the 1703 Genroku earthquake (magnitude M8.0) and the 1923 Kanto earthquake (M7.9), which caused 105,000 fatalities. An M7 or greater (M7+) earthquake in this area at present has high potential to produce devastating loss of life and property, with even greater global economic repercussions. The Central Disaster Management Council of Japan estimates that an M7+ earthquake would cause 11,000 fatalities and 112 trillion yen (about 1 trillion US$) in economic loss. The Earthquake Research Committee of Japan evaluates this earthquake as having a 70% probability of occurring within 30 years. In order to mitigate disaster for greater Tokyo, the Special Project for Earthquake Disaster Mitigation in the Tokyo Metropolitan Area (2007-2011) was launched in collaboration with scientists, engineers, and social scientists at institutions nationwide. The results obtained in the respective fields will be integrated by project termination to improve information for the strategy assessment for seismic risk mitigation in the Tokyo metropolitan area. In this talk, we give an outline of our project as an example of collaborative research on earthquake risk mitigation. Discussion is extended to our effort in progress and
Earthquake Hazard Assessment: Basics of Evaluation
NASA Astrophysics Data System (ADS)
Kossobokov, Vladimir
2016-04-01
Seismic hazard assessment (SHA) is not an easy task: it implies a delicate application of statistics to data of limited size and varying accuracy. Earthquakes follow the Unified Scaling Law, which generalizes the Gutenberg-Richter relationship by taking into account the naturally fractal distribution of their sources. Moreover, earthquakes, including great and mega events, are clustered in time, and their sequences have irregular recurrence intervals. Furthermore, earthquake-related observations are limited to the most recent decades (or centuries in just a few rare cases). Evidently, all this complicates reliable assessment of seismic hazard and associated risks. Making SHA claims, either termless or time-dependent (so-called t-DASH), quantitatively probabilistic in the frame of the most popular objectivists' viewpoint on probability requires a long series of "yes/no" trials, which cannot be obtained without extended rigorous testing of the method's predictions against real observations. Therefore, we reiterate the necessity and possibility of applying the modified tools of earthquake prediction strategies, in particular the error diagram, introduced by G.M. Molchan in the early 1990s, for evaluation of SHA, and the Seismic Roulette null hypothesis as a measure of the alerted space. The set of errors, i.e., the rates of failure and of the alerted space-time volume, compared to those obtained in the same number of random-guess trials, permits evaluating the effectiveness of an SHA method and determining the optimal choice of parameters with regard to specified cost-benefit functions. This and other information obtained in such testing supplies us with a realistic estimate of confidence in SHA results and related recommendations on the level of risks for decision making in regard to engineering design, insurance, and emergency management. These basics of SHA evaluation are exemplified in brief with a few examples, whose analyses are given in more detail in a poster of
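The error-diagram test described above reduces to simple bookkeeping: for each alarm threshold, compute the failure-to-predict rate and the alerted fraction of space, then compare against the random-guessing diagonal. A minimal sketch under synthetic data (the scores, cells, and events below are illustrative, not from any real catalog):

```python
# Hedged sketch of a Molchan-style error diagram: for each alarm threshold,
# compute tau (alerted fraction of space) and nu (failure-to-predict rate).
# Random guessing lies on the diagonal nu = 1 - tau; a useful method sits
# below it. All data here are synthetic.

def molchan_points(alarm_scores, event_cells, thresholds):
    """alarm_scores: dict cell -> hazard score; event_cells: cells where
    target earthquakes occurred. Returns (tau, nu) per threshold."""
    cells = list(alarm_scores)
    points = []
    for th in thresholds:
        alerted = {c for c in cells if alarm_scores[c] >= th}
        tau = len(alerted) / len(cells)              # alerted fraction of space
        missed = [c for c in event_cells if c not in alerted]
        nu = len(missed) / len(event_cells)          # failure-to-predict rate
        points.append((tau, nu))
    return points

scores = {0: 0.9, 1: 0.7, 2: 0.4, 3: 0.2, 4: 0.1}   # hazard score per cell
events = [0, 1, 3]                                   # cells with target events
for tau, nu in molchan_points(scores, events, [0.8, 0.5, 0.15]):
    verdict = "beats" if nu < 1 - tau else "no better than"
    print(f"tau={tau:.1f} nu={nu:.2f} -> {verdict} random guessing")
```

In a real evaluation the cells would be space-time bins, and the diagonal comparison would be sharpened with confidence bounds from the number of target events, but the structure of the test is exactly this.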
Bolton, Patricia A.
1993-01-01
Major earthquakes provide seismologists and engineers an opportunity to examine the performance of the Earth and the man-made structures in response to the forces of the quake. So, too, do they provide social scientists an opportunity to delve into human responses evoked by the ground shaking and its physical consequences. The findings from such research can serve to guide the development and application of programs and practices designed to reduce death, injury, property losses, and social disruption in subsequent earthquakes. This chapter contains findings from studies focused mainly on public response to the Loma Prieta earthquake; that is, on the behavior and perceptions of the general population rather than on the activities of specific organizations or on the impact on procedures or policies. A major feature of several of these studies is that the information was collected from the population throughout the Bay area, not just from persons in the most badly damaged communities or who had suffered the greatest losses. This wide range serves to provide comparisons of behavior for those most directly affected by the earthquake with others who were less directly affected by it but still had to consider it very “close to home.”
NASA Astrophysics Data System (ADS)
Delgado, José; García-Tortosa, Francisco J.; Garrido, Jesús; Giner, José; Lenti, Luca; López-Casado, Carlos; Martino, Salvatore; Peláez, José A.; Sanz de Galdeano, Carlos; Soler, Juan L.
2015-04-01
Landslides are a common ground effect induced by earthquakes of moderate to large magnitude. Most of them are first-time instabilities induced by the seismic event; reactivation of pre-existing landslides is less frequent in practice. The landslide of Güevejar (Granada province, S Spain) is a case study of a landslide that was reactivated at least twice by far-field earthquakes, the Mw 8.7, 1755 Lisbon earthquake (estimated epicentral distance of 680 km) and the Mw 6.5, 1884 Andalucia event (estimated epicentral distance of 45 km), but not by near-field events of moderate magnitude (Mw < 6.0 and epicentral distances below 25 km). To study the seismic response of this landslide, a study was conducted to elaborate an engineering-geological model. Field work included detailed geological mapping (1:1000) of the landslide and surrounding areas; drilling of deep (80 m) boreholes; down-hole measurement of both P- and S-wave velocities in the boreholes drilled; piezometric monitoring of the water table; MASW and ReMi profiles to determine the underlying structure of the tested sites (soil profile stratigraphy and the corresponding S-wave velocity of each soil level); and undisturbed sampling of the materials affected by the landslide. These samples were then tested in the laboratory according to standard procedures to determine both static properties (among them soil density, soil classification and shear strength) and dynamic properties (degradation curves for shear modulus and damping ratio with shear strain) of the landslide-involved materials. The proposed model corresponds to a complex landslide combining a roto-translational mechanism with an earth flow at its toe, characterized by a deep (> 50 m) sliding surface. The engineering-geological model constitutes the first step in ongoing research devoted to understanding how the landslide could be reactivated during far-field events. The
Wald, D.; Lin, K.-W.; Porter, K.; Turner, Loren
2008-01-01
When a potentially damaging earthquake occurs, utility and other lifeline managers, emergency responders, and other critical users have an urgent need for information about the impact on their particular facilities so they can make appropriate decisions and take quick actions to ensure safety and restore system functionality. ShakeMap, a tool used to portray the extent of potentially damaging shaking following an earthquake, on its own can be useful for emergency response, loss estimation, and public information. However, to take full advantage of the potential of ShakeMap, we introduce ShakeCast. ShakeCast facilitates the complicated assessment of potential damage to a user's widely distributed facilities by comparing the complex shaking distribution with the potentially highly variable damageability of their inventory to provide a simple, hierarchical list and maps of structures or facilities most likely impacted. ShakeCast is a freely available, post-earthquake situational awareness application that automatically retrieves earthquake shaking data from ShakeMap, compares intensity measures against users' facilities, sends notifications of potential damage to responsible parties, and generates facility damage maps and other Web-based products for both public and private emergency managers and responders. © 2008, Earthquake Engineering Research Institute.
Using the USGS Seismic Risk Web Application to estimate aftershock damage
McGowan, Sean M.; Luco, Nicolas
2014-01-01
The U.S. Geological Survey (USGS) Engineering Risk Assessment Project has developed the Seismic Risk Web Application to combine earthquake hazard and structural fragility information in order to calculate the risk of earthquake damage to structures. Enabling users to incorporate their own hazard and fragility information into the calculations will make it possible to quantify (in near real-time) the risk of additional damage to structures caused by aftershocks following significant earthquakes. Results can quickly be shared with stakeholders to illustrate the impact of elevated ground motion hazard and earthquake-compromised structural integrity on the risk of damage during a short-term, post-earthquake time horizon.
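The combination the abstract describes, hazard information convolved with structural fragility, is conventionally computed by summing, over shaking levels, the probability of that shaking times the conditional probability of damage given it. The sketch below illustrates that calculation with an assumed lognormal fragility; the function names, curve shapes, and numbers are illustrative assumptions, not the Seismic Risk Web Application's actual interface or data.

```python
# Hedged sketch of a hazard-times-fragility damage-risk calculation.
# A lognormal fragility curve is a common assumption; all numbers are
# illustrative placeholders.

from math import log, erf, sqrt

def lognormal_cdf(x, median, beta):
    """Lognormal fragility: P(damage | shaking intensity = x)."""
    return 0.5 * (1.0 + erf(log(x / median) / (beta * sqrt(2.0))))

def damage_probability(im_levels, occurrence_probs, median, beta):
    """Total damage probability over the horizon: sum over shaking levels
    of P(shaking level occurs) * P(damage | that shaking level)."""
    return sum(p * lognormal_cdf(im, median, beta)
               for im, p in zip(im_levels, occurrence_probs))

# Illustrative short-horizon aftershock hazard discretization (PGA in g)
# and a fragility whose median capacity is lowered to reflect
# earthquake-compromised structural integrity.
pga = [0.1, 0.2, 0.4, 0.8]
prob = [0.20, 0.10, 0.04, 0.01]
p_damage = damage_probability(pga, prob, median=0.3, beta=0.6)
print(f"P(additional damage) over horizon ~ {p_damage:.3f}")
```

Substituting user-supplied hazard curves (elevated aftershock rates) and user-supplied fragilities (weakened structures) into this same sum is what makes near-real-time aftershock risk updates possible.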
Wald, D.J.; Jaiswal, K.S.; Marano, K.D.; Bausch, D.
2011-01-01
also be both specific (although allowably uncertain) and actionable. In this analysis, an attempt is made at both simple and intuitive color-coded alerting criteria; yet the necessary uncertainty measures by which one can gauge the likelihood for the alert to be over- or underestimated are preserved. The essence of the proposed impact scale and alerting is that actionable loss information is now available in the immediate aftermath of significant earthquakes worldwide on the basis of quantifiable loss estimates. Utilizing EIS, PAGER's rapid loss estimates can adequately recommend alert levels and suggest appropriate response protocols, despite the uncertainties; demanding or awaiting observations or loss estimates with a high level of accuracy may increase the losses. © 2011 American Society of Civil Engineers.
Estimating Casualties for Large Earthquakes Worldwide Using an Empirical Approach
Jaiswal, Kishor; Wald, David J.; Hearne, Mike
2009-01-01
We developed an empirical country- and region-specific earthquake vulnerability model to be used as a candidate for post-earthquake fatality estimation by the U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER) system. The earthquake fatality rate is based on past fatal earthquakes (earthquakes causing one or more deaths) in individual countries where at least four fatal earthquakes occurred during the catalog period (since 1973). Because only a few dozen countries have experienced four or more fatal earthquakes since 1973, we propose a new global regionalization scheme based on idealization of countries that are expected to have similar susceptibility to future earthquake losses given the existing building stock, its vulnerability, and other socioeconomic characteristics. The fatality estimates obtained using an empirical country- or region-specific model will be used along with other selected engineering risk-based loss models for generation of automated earthquake alerts. These alerts could potentially benefit the rapid-earthquake-response agencies and governments for better response to reduce earthquake fatalities. Fatality estimates are also useful to stimulate earthquake preparedness planning and disaster mitigation. The proposed model has several advantages as compared with other candidate methods, and the country- or region-specific fatality rates can be readily updated when new data become available.
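An empirical fatality model of this kind is typically expressed as a country-specific fatality rate as a function of shaking intensity, applied to the population exposed in each intensity bin. The sketch below shows that structure with an assumed lognormal rate function; the parameter values and exposure counts are illustrative placeholders, not calibrated PAGER values.

```python
# Hedged sketch of an empirical fatality estimate: a lognormal fatality
# rate versus macroseismic intensity, summed over population exposure per
# intensity bin. Parameters and exposures below are illustrative only.

from math import log, erf, sqrt

def fatality_rate(intensity, theta, beta):
    """Fraction of exposed population killed at a given intensity
    (lognormal in intensity with median theta and log-spread beta)."""
    return 0.5 * (1.0 + erf(log(intensity / theta) / (beta * sqrt(2.0))))

def expected_fatalities(exposure_by_mmi, theta, beta):
    """Sum of (fatality rate x exposed population) over intensity bins."""
    return sum(pop * fatality_rate(mmi, theta, beta)
               for mmi, pop in exposure_by_mmi.items())

# Illustrative exposure: population per MMI intensity bin for one event.
exposure = {6.0: 500_000, 7.0: 200_000, 8.0: 50_000, 9.0: 5_000}
deaths = expected_fatalities(exposure, theta=12.0, beta=0.25)
print(f"expected fatalities ~ {deaths:.0f}")
```

Because the rate function has only two parameters per country or region, it can be re-fit readily as new fatal earthquakes enter the catalog, which is the updatability advantage the abstract notes.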
Southern California Earthquake Center (SCEC) Communication, Education and Outreach Program
NASA Astrophysics Data System (ADS)
Benthien, M. L.
2003-12-01
The SCEC Communication, Education, and Outreach Program (CEO) offers student research experiences, web-based education tools, classroom curricula, museum displays, public information brochures, online newsletters, and technical workshops and publications. This year, much progress has been made on the development of the Electronic Encyclopedia of Earthquakes (E3), a collaborative project with CUREE and IRIS. The E3 development system is now fully operational, and 165 entries are in the pipeline. When complete, information and resources for over 500 Earth science and engineering topics will be included, with connections to curricular materials useful for teaching Earth science, engineering, physics and mathematics. To coordinate activities for the 10-year anniversary of the Northridge earthquake in 2004 (and beyond), the "Earthquake Country Alliance" is being organized by SCEC CEO to present common messages, to share or promote existing resources, and to develop new activities and products jointly (such as a new version of Putting Down Roots in Earthquake Country). The group includes earthquake science and engineering researchers and practicing professionals, preparedness experts, response and recovery officials, news media representatives, and education specialists. A web portal, http://www.earthquakecountry.info, has been established with links to web pages and descriptions of other resources and services that the Alliance members provide. Another ongoing strength of SCEC is the Summer Intern program, which now has a year-round counterpart with students working on IT projects at USC. Since Fall 2002, over 32 students have participated in the program, including 7 students working with scientists throughout SCEC, 17 students involved in the USC "Earthquake Information Technology" intern program, and 7 students involved in CEO projects. These and other activities of the SCEC CEO program will be presented, along with lessons learned during program design and
Time-dependent earthquake forecasting: Method and application to the Italian region
NASA Astrophysics Data System (ADS)
Chan, C.; Sorensen, M. B.; Grünthal, G.; Hakimhashemi, A.; Heidbach, O.; Stromeyer, D.; Bosse, C.
2009-12-01
We develop a new approach for time-dependent earthquake forecasting and apply it to the Italian region. In our approach, the seismicity density is represented by a bandwidth function used as a smoothing kernel in the neighborhood of past earthquakes. To incorporate fault-interaction-based forecasting, we calculate the Coulomb stress change imparted by each earthquake in the study area. From this, the change in seismicity rate as a function of time can be estimated via the concept of rate-and-state stress transfer. We apply our approach to the region of Italy, using earthquakes that occurred before 2003 to generate the seismicity density. To validate the approach, we compare the estimated seismicity density with the distribution of earthquakes with M≥3.8 after 2004. A positive correlation is found, and all of the examined earthquakes are located within the highest 66th percentile of seismicity density in the study region. Furthermore, the seismicity density at the epicenter of the 6 April 2009, Mw = 6.3, L'Aquila earthquake lies within the highest 5th percentile. For the time-dependent seismicity rate change, we estimate the rate-and-state stress transfer imparted by the M≥5.0 earthquakes that occurred in the past 50 years. The results suggest that the seismicity rate increased at the locations of 65% of the examined earthquakes. Applying this approach to the L'Aquila sequence, considering seven M≥5.0 aftershocks as well as the main shock, yields significant forecasting of the aftershock distribution in both space and time.
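The kernel-smoothed seismicity density described above amounts to placing a smoothing kernel at each past epicenter and summing the contributions at any evaluation point. A minimal sketch with a fixed-bandwidth Gaussian kernel is below; the actual method uses an adaptive bandwidth function, and the catalog and bandwidth here are illustrative assumptions.

```python
# Hedged sketch of kernel-smoothed seismicity density: each past epicenter
# contributes a 2-D Gaussian of bandwidth d; densities are summed at the
# evaluation point. Catalog coordinates (km) and bandwidth are synthetic.

from math import exp, pi

def seismicity_density(x, y, epicenters, bandwidth):
    """Smoothed seismicity density at (x, y) from past epicenters."""
    norm = 1.0 / (2.0 * pi * bandwidth ** 2)   # 2-D Gaussian normalization
    return sum(norm * exp(-((x - ex) ** 2 + (y - ey) ** 2)
                          / (2.0 * bandwidth ** 2))
               for ex, ey in epicenters)

catalog = [(0.0, 0.0), (5.0, 2.0), (40.0, 40.0)]   # illustrative epicenters
near = seismicity_density(2.0, 1.0, catalog, bandwidth=10.0)
far = seismicity_density(100.0, 100.0, catalog, bandwidth=10.0)
print(near > far)   # density is higher near the clustered epicenters
```

Evaluating this density on a grid and ranking cells by percentile is what allows validation statements such as "all examined M≥3.8 events fall within the highest 66th percentile."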
Nowcasting Earthquakes and Tsunamis
NASA Astrophysics Data System (ADS)
Rundle, J. B.; Turcotte, D. L.
2017-12-01
The term "nowcasting" refers to the estimation of the current uncertain state of a dynamical system, whereas "forecasting" is a calculation of probabilities of future state(s). Nowcasting is a term that originated in economics and finance, referring to the process of determining the uncertain state of the economy or market indicators such as GDP at the current time by indirect means. We have applied this idea to seismically active regions, where the goal is to determine the current state of a system of faults, and its current level of progress through the earthquake cycle (http://onlinelibrary.wiley.com/doi/10.1002/2016EA000185/full). Advantages of our nowcasting method over forecasting models include: 1) Nowcasting is simply data analysis and does not involve a model having parameters that must be fit to data; 2) We use only earthquake catalog data which generally has known errors and characteristics; and 3) We use area-based analysis rather than fault-based analysis, meaning that the methods work equally well on land and in subduction zones. To use the nowcast method to estimate how far the fault system has progressed through the "cycle" of large recurring earthquakes, we use the global catalog of earthquakes, using "small" earthquakes to determine the level of hazard from "large" earthquakes in the region. We select a "small" region in which the nowcast is to be made, and compute the statistics of a much larger region around the small region. The statistics of the large region are then applied to the small region. For an application, we can define a small region around major global cities, for example a "small" circle of radius 150 km and a depth of 100 km, as well as a "large" earthquake magnitude, for example M6.0. The region of influence of such earthquakes is roughly 150 km radius x 100 km depth, which is the reason these values were selected. We can then compute and rank the seismic risk of the world's major cities in terms of their relative seismic risk
NASA Astrophysics Data System (ADS)
Klose, C. D.
2006-12-01
This presentation emphasizes the dualism of natural-resource exploitation and economic growth versus geomechanical pollution and the risk of human-triggered earthquakes. Large-scale geoengineering activities, e.g., mining, reservoir impoundment, oil/gas production, water exploitation, or fluid injection, alter pre-existing lithostatic stress states in the earth's crust and are anticipated to trigger earthquakes. Such processes of in-situ stress alteration are termed geomechanical pollution. Moreover, since the 19th century more than 200 earthquakes have been documented worldwide with a seismic moment magnitude of 4.5
Ceramic applications in the advanced Stirling automotive engine
NASA Technical Reports Server (NTRS)
Tomazic, W. A.; Cairelli, J. E.
1977-01-01
The ideal cycle, its application to a practical machine, and the specific advantages of the Stirling engine (high efficiency, low emissions, multi-fuel capability, and low noise) are discussed. Certain portions of the Stirling engine must operate continuously at high temperature. Ceramics offer the potential of cost reduction and efficiency improvement for advanced engine applications. Potential applications for ceramics in Stirling engines, and some of the special problems pertinent to using ceramics in the Stirling engine, are described. The research and technology program in ceramics planned to support the development of advanced Stirling engines is outlined.
Connecting slow earthquakes to huge earthquakes.
Obara, Kazushige; Kato, Aitaro
2016-07-15
Slow earthquakes are characterized by a wide spectrum of fault slip behaviors and seismic radiation patterns that differ from those of traditional earthquakes. However, slow earthquakes and huge megathrust earthquakes can have common slip mechanisms and are located in neighboring regions of the seismogenic zone. The frequent occurrence of slow earthquakes may help to reveal the physics underlying megathrust events as useful analogs. Slow earthquakes may function as stress meters because of their high sensitivity to stress changes in the seismogenic zone. Episodic stress transfer to megathrust source faults leads to an increased probability of triggering huge earthquakes if the adjacent locked region is critically loaded. Careful and precise monitoring of slow earthquakes may provide new information on the likelihood of impending huge earthquakes. Copyright © 2016, American Association for the Advancement of Science.
NASA Astrophysics Data System (ADS)
Yin, Lucy; Andrews, Jennifer; Heaton, Thomas
2018-05-01
Earthquake parameter estimation by nearest-neighbor search over a large database of observations can yield reliable predictions. However, in the real-time setting of Earthquake Early Warning (EEW) systems, accurate prediction from a large database is penalized by a significant delay in processing time. We propose using a multidimensional binary search tree (KD tree) to organize large seismic databases and reduce the nearest-neighbor search time. We evaluated the performance of the KD tree on the Gutenberg Algorithm, a database-searching algorithm for EEW. We constructed an offline test that predicts peak ground motions using a database whose feature sets are waveform filter-bank characteristics, and compared the results with the observed seismic parameters. We found that a large database provides more accurate predictions of ground motion information, such as peak ground acceleration, velocity, and displacement (PGA, PGV, PGD), than of source parameters, such as hypocentral distance. Organizing the database with a KD tree reduced the average search time by 85% relative to the exhaustive method, making the approach feasible for real-time implementation. The algorithm is straightforward, and the results will reduce the overall warning delivery time for EEW.
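A minimal KD tree with nearest-neighbor search illustrates why the search cost drops: subtrees whose splitting plane lies farther away than the current best match are pruned. This sketch uses plain Python lists as hypothetical feature vectors (e.g., filter-bank amplitudes paired with ground-motion labels); it is not the Gutenberg Algorithm implementation.

```python
def build_kdtree(points, depth=0):
    """points: list of (feature_vector, label) pairs. Splits on axes cyclically,
    placing the median point at each node to keep the tree balanced."""
    if not points:
        return None
    axis = depth % len(points[0][0])
    points = sorted(points, key=lambda p: p[0][axis])
    mid = len(points) // 2
    return {
        "point": points[mid],
        "left": build_kdtree(points[:mid], depth + 1),
        "right": build_kdtree(points[mid + 1:], depth + 1),
        "axis": axis,
    }

def nearest(node, query, best=None):
    """Returns (squared_distance, (feature_vector, label)) of the closest point."""
    if node is None:
        return best
    vec, _ = node["point"]
    d = sum((q - v) ** 2 for q, v in zip(query, vec))
    if best is None or d < best[0]:
        best = (d, node["point"])
    diff = query[node["axis"]] - vec[node["axis"]]
    near, far = (node["left"], node["right"]) if diff < 0 else (node["right"], node["left"])
    best = nearest(near, query, best)
    # Descend the far side only if the splitting plane could hide a closer point.
    if diff ** 2 < best[0]:
        best = nearest(far, query, best)
    return best
```

For a balanced tree over N feature vectors, the average lookup cost is O(log N) rather than the O(N) of an exhaustive scan, which is the source of the speedup reported above.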
NASA Astrophysics Data System (ADS)
Haase, J. S.; Bock, Y.; Saunders, J. K.; Goldberg, D.; Restrepo, J. I.
2016-12-01
As part of an effort to promote the use of NASA-sponsored Earth science information for disaster risk reduction, real-time high-rate seismogeodetic data are being incorporated into early warning and structural monitoring systems. Seismogeodesy combines seismic acceleration and GPS displacement measurements using a tightly coupled Kalman filter to provide absolute estimates of seismic acceleration, velocity, and displacement. Traditionally, the monitoring of earthquakes and tsunamis has been based on seismic networks for estimating earthquake magnitude and slip, and on tide gauges and deep-ocean buoys for direct measurement of tsunami waves. Real-time seismogeodetic observations at subduction zones allow for more robust and rapid magnitude and slip estimation, increasing warning time in the near-source region. A NASA-funded effort to utilize GPS and seismogeodesy in NOAA's Tsunami Warning Centers (TWCs) in Alaska and Hawaii integrates new modules for picking, locating, and estimating magnitudes and moment tensors for earthquakes into the USGS Earthworm environment at the TWCs. In a related project, NASA supports the transition of this research to seismogeodetic tools for disaster preparedness, specifically by implementing GPS and low-cost MEMS accelerometers for structural monitoring in partnership with earthquake engineers. Real-time high-rate seismogeodetic structural monitoring has been implemented on two structures. The first, enabled through a UCMexus collaboration, is a parking garage at the Autonomous University of Baja California Faculty of Medicine in Mexicali, not far from the rupture of the 2010 Mw 7.2 El Mayor-Cucapah earthquake. The second is the 8-story Geisel Library at the University of California, San Diego (UCSD). The system has also been installed for several proof-of-concept experiments at the UCSD Network for Earthquake Engineering Simulation (NEES) Large High Performance Outdoor Shake Table. We present MEMS-based seismogeodetic observations from the 10 June
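The tightly coupled filter can be illustrated with a one-dimensional toy: accelerometer samples drive the prediction step of a displacement-velocity state model, and GPS displacement measurements supply the correction. All noise values, rates, and the single-axis simplification below are hypothetical placeholders, not the operational filter.

```python
import numpy as np

def seismogeodetic_kf(accel, gps_disp, dt, q=1e-4, r=1e-4):
    """1-D Kalman filter sketch with state x = [displacement, velocity].
    accel: accelerometer samples (control input, m/s^2);
    gps_disp: GPS displacement samples (measurements, m), same rate here
    for simplicity. Returns the filtered displacement series."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition
    B = np.array([0.5 * dt**2, dt])         # acceleration input mapping
    H = np.array([[1.0, 0.0]])              # GPS observes displacement only
    Q = q * np.eye(2)                       # process noise covariance
    R = np.array([[r]])                     # GPS measurement noise covariance
    x, P = np.zeros(2), np.eye(2)
    out = []
    for a, z in zip(accel, gps_disp):
        # Predict using the accelerometer sample as a known input.
        x = F @ x + B * a
        P = F @ P @ F.T + Q
        # Correct using the GPS displacement measurement.
        y = z - H @ x
        S = H @ P @ H.T + R
        K = (P @ H.T) / S
        x = x + (K * y).ravel()
        P = (np.eye(2) - K @ H) @ P
        out.append(x[0])
    return out
```

The accelerometer contributes high-frequency fidelity through the input term, while the GPS correction prevents the double-integration drift that a stand-alone accelerometer record would accumulate.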
NASA Astrophysics Data System (ADS)
Ide, Satoshi; Maury, Julie
2018-04-01
Tectonic tremors, low-frequency earthquakes, very low-frequency earthquakes, and slow slip events are all regarded as components of broadband slow earthquakes, which can be modeled as a stochastic process using Brownian motion. Here we show that the Brownian slow earthquake model provides theoretical relationships among the seismic moment, seismic energy, and source duration of slow earthquakes, and that this model explains various estimates of these quantities in three major subduction zones: Japan, Cascadia, and Mexico. While the estimates for these three regions are similar at seismological frequencies, the seismic moment rates differ significantly in geodetic observations. This difference is ascribed to the difference in the characteristic times of the Brownian slow earthquake model, which is controlled by the width of the source area. We also show that the model can include non-Gaussian fluctuations, which better explain recent findings of a near-constant source duration for low-frequency earthquake families.
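The moment-duration scalings at issue can be stated compactly. The linear slow-earthquake scaling follows the widely cited compilation of Ide and coauthors; it is given here as an illustrative summary, not as the Brownian model's full derivation:

```latex
% Seismic moment M_0 versus source duration T:
M_0 \propto T^{3} \quad \text{(regular earthquakes)}
\qquad
M_0 \propto T \quad \text{(slow earthquakes)}
```

The factor-of-three difference in exponent is why slow events of very different durations, from low-frequency earthquakes lasting under a second to slow slip events lasting months, can be treated as one broadband family.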
Sodium heat engine system: Space application
NASA Astrophysics Data System (ADS)
Betz, Bryan H.; Sungu, Sabri; Vu, Hung V.
1994-08-01
This paper explores the possibility of utilizing the Sodium Heat Engine (SHE), also known as the AMTEC (Alkali Metal Thermoelectric Converter), for electrical power generation in ``near earth'' geosynchronous orbit. The Sodium Heat Engine principle is very flexible and adapts well to a variety of physical geometries. The proposed system can be easily folded and then deployed into orbit without the need for on-site assembly in space. Electric power generated by the SHE can be used in communication satellites and space stations; electrical recharging of vehicles in space is another application the Sodium Heat Engine could be adapted to serve.
Earthquakes on Your Dinner Table
NASA Astrophysics Data System (ADS)
Alexeev, N. A.; Tape, C.; Alexeev, V. A.
2016-12-01
Earthquakes have interesting physics applicable to other phenomena, such as wave propagation, and they also affect human lives. This study focused on three questions: how depth, distance from the epicenter, and ground hardness affect earthquake strength. The experimental setup consisted of a gelatin slab simulating the crust. The slab was hit with a weight and the earthquake amplitude was measured. It was found that amplitude was larger when the epicenter was deeper, which contradicts observations and was probably an artifact of the design. Earthquake strength was inversely proportional to the distance from the epicenter, which generally follows reality. Soft and medium jello were implanted into hard jello. Earthquakes were found to be stronger in softer jello, a result of resonant amplification in soft ground. Similar results are found in Minto Flats, where earthquakes are stronger and last longer than in the nearby hills. Earthquake waveforms from Minto Flats showed that the oscillations there have longer periods compared to the nearby hills with harder soil. Two gelatin pieces with identical shapes and different hardness were vibrated on a platform at varying frequencies to demonstrate that their resonant frequencies are statistically different. This phenomenon also occurs in Yukon Flats.
Earthquakes: Recurrence and Interoccurrence Times
NASA Astrophysics Data System (ADS)
Abaimov, S. G.; Turcotte, D. L.; Shcherbakov, R.; Rundle, J. B.; Yakovlev, G.; Goltz, C.; Newman, W. I.
2008-04-01
The purpose of this paper is to discuss the statistical distributions of recurrence times of earthquakes. Recurrence times are the time intervals between successive earthquakes at a specified location on a specified fault. Although a number of statistical distributions have been proposed for recurrence times, we argue in favor of the Weibull distribution. The Weibull distribution is the only distribution that has a scale-invariant hazard function. We consider three sets of characteristic earthquakes on the San Andreas fault: (1) the Parkfield earthquakes, (2) the sequence of earthquakes identified by paleoseismic studies at the Wrightwood site, and (3) an example of a sequence of micro-repeating earthquakes at a site near San Juan Bautista. In each case we make a comparison with the applicable Weibull distribution. The number of earthquakes in each of these sequences is too small to make definitive conclusions. To overcome this difficulty we consider a sequence of earthquakes obtained from a one-million-year “Virtual California” simulation of San Andreas earthquakes. Very good agreement with a Weibull distribution is found. We also obtain recurrence statistics for two other model studies. The first is a modified forest-fire model and the second is a slider-block model. In both cases good agreement with Weibull distributions is obtained. Our conclusion is that the Weibull distribution is the preferred distribution for estimating the risk of future earthquakes on the San Andreas fault and elsewhere.
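The risk estimate implied by a Weibull recurrence model can be sketched directly: the hazard rate is a pure power law in elapsed time, and the conditional probability of an event in the next interval follows from the survival function. The parameter values below are illustrative placeholders, not values fitted to any of the sequences above.

```python
import math

def weibull_hazard(t, tau, beta):
    """Hazard rate h(t) = (beta/tau) * (t/tau)**(beta-1).
    A pure power law in t: rescaling t and tau together leaves its
    shape unchanged, the scale-invariance property cited above."""
    return (beta / tau) * (t / tau) ** (beta - 1)

def conditional_prob(t0, dt, tau, beta):
    """P(earthquake in (t0, t0+dt] | quiescent through t0)
    = 1 - S(t0+dt)/S(t0), with survival S(t) = exp(-(t/tau)**beta)."""
    S = lambda t: math.exp(-((t / tau) ** beta))
    return 1.0 - S(t0 + dt) / S(t0)

# Illustrative: scale tau = 160 yr, shape beta = 2.
p_early = conditional_prob(t0=50, dt=30, tau=160.0, beta=2.0)
p_late = conditional_prob(t0=150, dt=30, tau=160.0, beta=2.0)
```

For beta > 1 the conditional probability grows with elapsed time (p_late > p_early), matching the intuition of a fault steadily loading toward failure; beta = 1 recovers the memoryless exponential distribution, for which elapsed quiet time carries no information.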
NASA Astrophysics Data System (ADS)
Stein, R. S.
2012-12-01
The 2004 M=9.2 Sumatra earthquake claimed what seemed an unfathomable 228,000 lives, although because of its size, we could at least assure ourselves that it was an extremely rare event. But in the short space of 8 years, the Sumatra quake no longer looks like an anomaly, and it is no longer even the worst disaster of the century: 80,000 deaths in the 2005 M=7.6 Pakistan quake; 88,000 deaths in the 2008 M=7.9 Wenchuan, China quake; 316,000 deaths in the 2010 M=7.0 Haiti quake. In each case, poor design and construction were unable to withstand the ferocity of the shaken earth, and this was compounded by inadequate rescue, medical care, and shelter. How could the toll continue to mount despite the advances in our understanding of quake risk? The world's population is flowing into megacities, and many of these migration magnets lie astride the plate boundaries. Caught between these opposing demographic and seismic forces are 50 cities of at least 3 million people threatened by large earthquakes, the targets of chance. What we know for certain is that no one will take protective measures unless they are convinced they are at risk. Furnishing that knowledge is the animating principle of the Global Earthquake Model (GEM), launched in 2009. At the very least, everyone should be able to learn what his or her risk is; at the very least, our community owes the world an estimate of that risk. So, first and foremost, GEM seeks to raise quake-risk awareness. We have no illusions that maps or models raise awareness; instead, earthquakes do. But when a quake strikes, people need a credible place to go to answer the question: how vulnerable am I, and what can I do about it? The Global Earthquake Model is being built with GEM's new open-source engine, OpenQuake. GEM is also assembling the global data sets without which we will never improve our understanding of where, how large, and how frequently earthquakes will strike, what impacts they will have, and how those impacts can be lessened by
Biocatalysts: application and engineering for industrial purposes.
Jemli, Sonia; Ayadi-Zouari, Dorra; Hlima, Hajer Ben; Bejar, Samir
2016-01-01
Enzymes are widely applied in various industrial applications and processes, including the food and beverage, animal feed, textile, detergent and medical industries. Enzymes screened from natural origins are often engineered before entering the market place because their native forms do not meet the requirements for industrial application. Protein engineering is concerned with the design and construction of novel enzymes with tailored functional properties, including stability, catalytic activity, reaction product inhibition and substrate specificity. Two broad approaches have been used for enzyme engineering, namely, rational design and directed evolution. The powerful and revolutionary techniques so far developed for protein engineering provide excellent opportunities for the design of industrial enzymes with specific properties and production of high-value products at lower production costs. The present review seeks to highlight the major fields of enzyme application and to provide an updated overview on previous protein engineering studies wherein natural enzymes were modified to meet the operational conditions required for industrial application.
NASA Astrophysics Data System (ADS)
Loyd, R.; Walter, S.; Fenton, J.; Tubbesing, S.; Greene, M.
2008-12-01
In the rush to remove debris after a damaging earthquake, perishable data related to a wide range of impacts on the physical, built and social environments can be lost. The California Post-Earthquake Information Clearinghouse is intended to prevent this data loss by supporting the earth scientists, engineers, and social and policy researchers who will conduct fieldwork in the affected areas in the hours and days following the earthquake to study these effects. First called for by Governor Ronald Reagan following the destructive M6.5 San Fernando earthquake in 1971, the concept of the Clearinghouse has since been incorporated into the response plans of the National Earthquake Hazard Reduction Program (USGS Circular 1242). This presentation is intended to acquaint scientists with the purpose, functions, and services of the Clearinghouse. Typically, the Clearinghouse is set up in the vicinity of the earthquake within 24 hours of the mainshock and is maintained for several days to several weeks. It provides a location where field researchers can assemble to share and discuss their observations, plan and coordinate subsequent field work, and communicate significant findings directly to the emergency responders and to the public through press conferences. As the immediate response effort winds down, the Clearinghouse will ensure that collected data are archived and made available through "lessons learned" reports and publications that follow significant earthquakes. Participants in the quarterly meetings of the Clearinghouse include representatives from state and federal agencies, universities, NGOs and other private groups. Overall management of the Clearinghouse is delegated to the agencies represented by the authors above.
The TRIPOD e-learning Platform for the Training of Earthquake Safety Assessment
NASA Astrophysics Data System (ADS)
Coppari, S.; Di Pasquale, G.; Goretti, A.; Papa, F.; Papa, S.; Paoli, G.; Pizza, A. G.; Severino, M.
2008-07-01
The paper summarizes the results of the in-progress EU project TRIPOD (Training Civil Engineers on Post-Earthquake Safety Assessment of Damaged Buildings), funded under the Leonardo Da Vinci program. The main theme of the project is the development of a methodology and a learning platform for the training of technicians involved in post-earthquake building safety inspections. In the event of a catastrophic earthquake, emergency building inspections constitute a major undertaking with severe social impact. Given the inevitably chaotic conditions and the urgent need for a great number of specialized individuals to carry out inspections, past experience indicates that inspection teams are often formed in an ad-hoc manner, under stressful conditions, at varying levels of technical expertise and experience, sometimes impairing the reliability and consistency of the inspection results. Furthermore, each country has its own building damage and safety assessment methodology, developed according to its experience, laws, building technology, and seismicity. This holds also for the partners participating in the project (Greece, Italy, Turkey, Cyprus), which all come from seismically sensitive Mediterranean countries. The project aims at alleviating the above shortcomings by designing and developing a training methodology and e-platform, forming a complete training program targeted at inspection engineers, specialized personnel, and civil protection agencies. The e-learning platform will provide flexible and friendly authoring mechanisms, self-teaching and assessment capabilities, course and trainee management, etc. Courses will also be made available as stand-alone multimedia applications on CD and in the form of a complete pocket handbook. Moreover, the project will offer the possibility of exchanging different experiences and practices: a first step towards the harmonization of methodologies and tools of different countries sharing similar problems. Finally, through wide
NASA Astrophysics Data System (ADS)
Sargeant, S.; Sorensen, M. B.
2011-12-01
More than 50% of the world's population now lives in urban areas. In less developed countries, future urban population increase will be due to natural population growth and rural-to-urban migration. As urban growth continues, the vulnerability of those living in these areas is also increasing. This presents a wide variety of challenges for humanitarian organisations, which often have more experience of disaster response in rural settings than of planning for large urban disasters. The 2010 Haiti earthquake highlighted the vulnerability of these organisations and the communities that they seek to support. To meet this challenge, a key consideration is how scientific information can support the humanitarian sector and its working practices. Here we review the current state of earthquake scenario modelling practice, with special focus on scenarios to be used in disaster response and response planning, and present an evaluation of how the field looks set to evolve. We also review current good practice and lessons learned from previous earthquakes with respect to planning for and responding to earthquakes in urban settings in the humanitarian sector, identifying key sectoral priorities. We then investigate the interface between these two areas to examine the use of earthquake scenarios in disaster response planning and identify potential challenges, both with respect to the development of scientific models and to their application on the ground.
Engineering β-sheet peptide assemblies for biomedical applications.
Yu, Zhiqiang; Cai, Zheng; Chen, Qiling; Liu, Menghua; Ye, Ling; Ren, Jiaoyan; Liao, Wenzhen; Liu, Shuwen
2016-03-01
Hydrogels have been widely studied in various biomedical applications, such as tissue engineering, cell culture, immunotherapy and vaccines, and drug delivery. Peptide-based nanofibers represent a promising new strategy for current drug delivery approaches and cell carriers for tissue engineering. This review focuses on the recent advances in the use of self-assembling engineered β-sheet peptide assemblies for biomedical applications. The applications of peptide nanofibers in biomedical fields, such as drug delivery, tissue engineering, immunotherapy, and vaccines, are highlighted. The current challenges and future perspectives for self-assembling peptide nanofibers in biomedical applications are discussed.
POST Earthquake Debris Management - AN Overview
NASA Astrophysics Data System (ADS)
Sarkar, Raju
Every year natural disasters, such as fires, floods, earthquakes, hurricanes, landslides, tsunamis, and tornadoes, challenge various communities of the world. Earthquakes strike with varying degrees of severity and pose both short- and long-term challenges to public service providers. Earthquakes generate shock waves and displace the ground along fault lines. These seismic forces can bring down buildings and bridges in a localized area and damage buildings and other structures in a far wider area. Secondary damage from fires, explosions, and localized flooding from broken water pipes can increase the amount of debris. Earthquake debris includes building materials, personal property, and sediment from landslides. The management of this debris, as well as the waste generated during reconstruction, can place significant demands on national and local capacities. Debris removal is a major component of every post-earthquake recovery operation. Much of the debris generated by an earthquake is not hazardous: soil, building material, and green waste, such as trees and shrubs, make up most of its volume. These wastes not only create significant health problems and a very unpleasant living environment if not disposed of safely and appropriately, but can also impose economic burdens on the reconstruction phase. In practice, most of the debris may be disposed of at landfill sites, reused as material for construction, or recycled into useful commodities. Therefore, the debris clearance operation should adopt a geotechnical engineering approach as an important post-earthquake issue to control the quality of the incoming flow of potential soil materials. In this paper, the importance of an emergency management perspective in this geotechnical approach, taking into account the different criteria related to operation execution, is proposed by highlighting the key issues concerning the handling of the construction
ELECTRICAL TECHNIQUES FOR ENGINEERING APPLICATIONS.
Bisdorf, Robert J.
1985-01-01
Surface electrical geophysical methods have been used in such engineering applications as locating and delineating shallow gravel deposits, depth to bedrock, faults, clay zones, and other geological phenomena. Other engineering applications include determining water quality, tracing ground water contaminant plumes and locating dam seepages. Various methods and electrode arrays are employed to solve particular geological problems. The sensitivity of a particular method or electrode array depends upon the physics on which the method is based, the array geometry, the electrical contrast between the target and host materials, and the depth to the target. Each of the available electrical methods has its own particular advantages and applications which the paper discusses.
NASA Astrophysics Data System (ADS)
Shimizu, K.; Yagi, Y.; Okuwaki, R.; Kasahara, A.
2017-12-01
Kinematic earthquake rupture models are useful for deriving statistics and scaling properties of large and great earthquakes. However, kinematic rupture models for the same earthquake often differ from one another, and such sensitivity of the modeling prevents us from understanding the statistics and scaling properties of earthquakes. Yagi and Fukahata (2011) introduce the uncertainty of the Green's function into tele-seismic waveform inversion and show that a stable spatiotemporal distribution of slip-rate can be obtained by using an empirical Bayesian scheme. One of the unsolved problems in the inversion arises from the modeling error originating in the uncertainty of the fault-model setting. The Green's function near the nodal plane of the focal mechanism is known to be sensitive to slight changes in the assumed fault geometry, and thus the spatiotemporal distribution of slip-rate can be distorted by the modeling error originating in the uncertainty of the fault model. We propose a new method accounting for complexity in the fault geometry by additionally solving for the focal mechanism on each space knot. Since a solution of finite-source inversion becomes unstable with increasing flexibility of the model, we estimate a stable spatiotemporal distribution of focal mechanisms in the framework of Yagi and Fukahata (2011). We applied the proposed method to 52 tele-seismic P-waveforms of the 2013 Balochistan, Pakistan earthquake. The inverted potency distribution shows unilateral rupture propagation toward the southwest of the epicenter, and the spatial variation of the focal mechanisms shares the same pattern as the fault curvature along the tectonic fabric. On the other hand, the broad pattern of the rupture process, including the direction of rupture propagation, cannot be reproduced by an inversion analysis under the assumption that the faulting occurred on a single flat plane. These results show that the modeling error caused by simplifying the
NASA Astrophysics Data System (ADS)
Jalali, Mohammad; Ramazi, Hamidreza
2018-04-01
This article is devoted to the application of a simulation algorithm based on geostatistical methods to compile and update seismotectonic provinces, with Iran as a case study. Traditionally, tectonic maps together with seismological data and information (e.g., earthquake catalogues, earthquake mechanisms, and microseismic data) have been used to update seismotectonic provinces. In many cases, incomplete earthquake catalogues are one of the important challenges in this procedure. To overcome this problem, a geostatistical simulation algorithm, turning band simulation (TBSIM), was applied to generate synthetic data to improve incomplete earthquake catalogues. The synthetic data were then added to the traditional information to study seismicity homogeneity and classify areas according to tectonic and seismic properties in order to update the seismotectonic provinces. In this paper, (i) different magnitude types in the studied catalogues were homogenized to moment magnitude (Mw), and earthquake declustering was then carried out to remove aftershocks and foreshocks; (ii) a time normalization method was introduced to decrease the uncertainty in the temporal domain prior to starting the simulation procedure; (iii) variography was carried out in each subregion to study spatial regressions (e.g., the west-southwestern area showed a spatial regression from 0.4 to 1.4 decimal degrees, with the maximum range identified at an azimuth of 135 ± 10); (iv) the TBSIM algorithm was then applied, yielding 68,800 synthetic events according to the spatial regression found in several directions; (v) the simulated events (i.e., magnitudes) were classified based on their intensity in ArcGIS packages, and homogeneous seismic zones were determined. Finally, according to the synthetic data, tectonic features, and actual earthquake catalogues, 17 seismotectonic provinces were introduced in four major classes: very high, high, moderate, and low
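Step (i) above, removing aftershocks and foreshocks, is commonly done with magnitude-dependent space-time windows around larger events. The sketch below flags any event falling inside the window of a larger (or equal) earlier event; the window functions are hypothetical placeholders, since the paper does not specify its declustering coefficients.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in km."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2.0 * 6371.0 * asin(sqrt(a))

def decluster(catalog, dist_km=lambda m: 10.0 * m, window_days=lambda m: 30.0 * m):
    """Window declustering sketch. catalog: list of (t_days, lat, lon, mag)
    tuples sorted by time. An event inside the space-time window of a larger
    (or equal-magnitude) earlier mainshock is dropped as an aftershock.
    The window functions are illustrative, not published coefficients."""
    mainshocks = []
    for t, lat, lon, mag in catalog:
        is_aftershock = any(
            m0 >= mag
            and t - t0 <= window_days(m0)
            and haversine_km(lat, lon, la0, lo0) <= dist_km(m0)
            for t0, la0, lo0, m0 in mainshocks
        )
        if not is_aftershock:
            mainshocks.append((t, lat, lon, mag))
    return mainshocks
```

Declustering before variography matters because clustered aftershock sequences would otherwise dominate the spatial statistics and bias the simulated event density toward recent mainshock locations.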
Earthquakes trigger the loss of groundwater biodiversity
NASA Astrophysics Data System (ADS)
Galassi, Diana M. P.; Lombardo, Paola; Fiasca, Barbara; di Cioccio, Alessia; di Lorenzo, Tiziana; Petitta, Marco; di Carlo, Piero
2014-09-01
Earthquakes are among the most destructive natural events. The 6 April 2009, 6.3-Mw earthquake in L'Aquila (Italy) markedly altered the karstic Gran Sasso Aquifer (GSA) hydrogeology and geochemistry. The GSA groundwater invertebrate community is mainly comprised of small-bodied, colourless, blind microcrustaceans. We compared abiotic and biotic data from two pre-earthquake and one post-earthquake complete but non-contiguous hydrological years to investigate the effects of the 2009 earthquake on the dominant copepod component of the obligate groundwater fauna. Our results suggest that the massive earthquake-induced aquifer strain biotriggered a flushing of groundwater fauna, with a dramatic decrease in subterranean species abundance. Population turnover rates appeared to have crashed, no longer replenishing the long-standing communities from aquifer fractures, and the aquifer became almost totally deprived of animal life. Groundwater communities are notorious for their low resilience. Therefore, any major disturbance that negatively impacts survival or reproduction may lead to local extinction of species, most of them being the only survivors of phylogenetic lineages extinct at the Earth surface. Given the ecological key role played by the subterranean fauna as decomposers of organic matter and ``ecosystem engineers'', we urge more detailed, long-term studies on the effect of major disturbances to groundwater ecosystems.
NASA Astrophysics Data System (ADS)
Perry, S.; Jordan, T.
2006-12-01
Our undergraduate research program, SCEC/UseIT, an NSF Research Experience for Undergraduates site, provides software for earthquake researchers and educators, movies for outreach, and ways to strengthen the technical career pipeline. SCEC/UseIT motivates diverse undergraduates towards science and engineering careers through team-based research in the exciting field of earthquake information technology. UseIT provides the cross-training in computer science/information technology (CS/IT) and geoscience needed to make fundamental progress in earthquake system science. Our high and increasing participation of women and minority students is crucial given the nation's precipitous enrollment declines in CS/IT undergraduate degree programs, especially among women. UseIT also casts a "wider, farther" recruitment net that targets scholars interested in creative work but not traditionally attracted to summer science internships. Since 2002, SCEC/UseIT has challenged 79 students in three dozen majors from as many schools with difficult, real-world problems that require collaborative, interdisciplinary solutions. Interns design and engineer open-source software, creating increasingly sophisticated visualization tools (see "SCEC-VDO," session IN11), which are employed by SCEC researchers, in new curricula at the University of Southern California, and by outreach specialists who make animated movies for the public and the media. SCEC-VDO would be a valuable tool for research-oriented professional development programs.
The HayWired Earthquake Scenario—Earthquake Hazards
Detweiler, Shane T.; Wein, Anne M.
2017-04-24
The HayWired scenario is a hypothetical earthquake sequence that is being used to better understand hazards for the San Francisco Bay region during and after an earthquake of magnitude 7 on the Hayward Fault. The 2014 Working Group on California Earthquake Probabilities calculated that there is a 33-percent likelihood of a large (magnitude 6.7 or greater) earthquake occurring on the Hayward Fault within three decades. A large Hayward Fault earthquake will produce strong ground shaking, permanent displacement of the Earth’s surface, landslides, liquefaction (soils becoming liquid-like during shaking), and subsequent fault slip, known as afterslip, and earthquakes, known as aftershocks. The most recent large earthquake on the Hayward Fault occurred on October 21, 1868, and it ruptured the southern part of the fault. The 1868 magnitude-6.8 earthquake occurred when the San Francisco Bay region had far fewer people, buildings, and infrastructure (roads, communication lines, and utilities) than it does today, yet the strong ground shaking from the earthquake still caused significant building damage and loss of life. The next large Hayward Fault earthquake is anticipated to affect thousands of structures and disrupt the lives of millions of people. Earthquake risk in the San Francisco Bay region has been greatly reduced as a result of previous concerted efforts; for example, tens of billions of dollars of investment in strengthening infrastructure was motivated in large part by the 1989 magnitude 6.9 Loma Prieta earthquake. To build on efforts to reduce earthquake risk in the San Francisco Bay region, the HayWired earthquake scenario comprehensively examines the earthquake hazards to help provide the crucial scientific information that the San Francisco Bay region can use to prepare for the next large earthquake. The HayWired Earthquake Scenario—Earthquake Hazards volume describes the strong ground shaking modeled in the scenario and the hazardous movements of
USGS GNSS Applications to Earthquake Disaster Response and Hazard Mitigation
NASA Astrophysics Data System (ADS)
Hudnut, K. W.; Murray, J. R.; Minson, S. E.
2015-12-01
Rapid characterization of earthquake rupture is important during a disaster because it establishes which fault ruptured and the extent and amount of fault slip. These key parameters, in turn, can augment in situ seismic sensors for identifying disruption to lifelines as well as localized damage along the fault break. Differential GNSS station positioning, along with imagery differencing, are important methods for augmenting seismic sensors. During response to recent earthquakes (the 1989 Loma Prieta, 1992 Landers, 1994 Northridge, 1999 Hector Mine, 2010 El Mayor-Cucapah, 2012 Brawley Swarm, and 2014 South Napa earthquakes), GNSS co-seismic and post-seismic observations proved to be essential for rapid earthquake source characterization. Often, we find that GNSS results indicate key aspects of the earthquake source that would not have been known in the absence of GNSS data. Seismic, geologic, and imagery data alone, without GNSS, would miss important details of the earthquake source. That is, GNSS results provide important additional insight into the earthquake source properties, which in turn help understand the relationship between shaking and damage patterns. GNSS also adds to understanding of the distribution of slip along strike and with depth on a fault, which can help determine possible lifeline damage due to fault offset, as well as the vertical deformation and tilt that are vitally important for gravitationally driven water systems. The GNSS processing workflow that took more than one week 25 years ago now takes less than one second. Formerly, portable receivers needed to be set up at a site and operated for many hours, and the data then retrieved, processed, and modeled by a series of manual steps. The establishment of continuously telemetered, continuously operating high-rate GNSS stations and the robust automation of all aspects of data retrieval and processing have led to sub-second overall system latency. Within the past few years, the final challenges of
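At its core, the differential positioning step described above amounts to differencing time-averaged station positions before and after the event to recover the static co-seismic offset. The sketch below illustrates only that principle; the function name, windowing scheme, and units are illustrative assumptions, not the USGS operational pipeline:

```python
import numpy as np

def coseismic_offset(times, positions, t_event, window=300.0):
    """Estimate a static co-seismic offset from a GNSS position time series.

    times     : array of epochs (s)
    positions : east/north/up positions (m), shape (n, 3)
    t_event   : earthquake origin time (s)
    window    : averaging window (s) on each side of the event
    """
    times = np.asarray(times, dtype=float)
    positions = np.asarray(positions, dtype=float)
    pre = (times >= t_event - window) & (times < t_event)
    post = (times > t_event) & (times <= t_event + window)
    # The static offset is the difference of mean positions after vs. before
    return positions[post].mean(axis=0) - positions[pre].mean(axis=0)
```

Operational systems work on high-rate real-time streams with noise models and outlier rejection; this averaged-window difference only shows why telemetered continuous stations make sub-second offset estimates possible.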
Application of Advanced Materials in Petroleum Engineering
NASA Astrophysics Data System (ADS)
Zhao, Gufan; Di, Weina; Wang, Minsheng
Against the background of increasingly demanding exploration targets, global oil companies and oilfield service companies are devoting greater effort to both R&D and the application of new petroleum engineering technology. Advanced materials always play a decisive role in the functionality of a new product, and technology transplantation has become an important means of innovation in the oil and gas industry. Here, we mainly discuss the properties and scope of application of several advanced materials. Based on the material requirements in petroleum engineering, we propose several candidates for downhole electronics protection, drilling fluid additives, downhole tools, etc. Based on an analysis of the characteristics of petroleum engineering technology, this paper examines advanced materials such as new insulation materials, functionally graded materials, and self-healing polymers, and introduces their application prospects in petroleum engineering.
Cognitive engineering in aerospace applications
NASA Technical Reports Server (NTRS)
Woods, David D.
1993-01-01
The progress that was made with respect to the objectives and goals of the research that is being carried out in the Cognitive Systems Engineering Laboratory (CSEL) under a Cooperative Agreement with NASA Ames Research Center is described. The major objective of this project is to expand the research base in Cognitive Engineering to be able to support the development and human-centered design of automated systems for aerospace applications. This research project is in support of the Aviation Safety/Automation Research plan and related NASA research goals in space applications.
Earthquake Hazard Assessment: an Independent Review
NASA Astrophysics Data System (ADS)
Kossobokov, Vladimir
2016-04-01
Seismic hazard assessment (SHA), from term-less (probabilistic PSHA or deterministic DSHA) to time-dependent (t-DASH), including short-term earthquake forecast/prediction (StEF), is not an easy task: it implies a delicate application of statistics to data of limited size and varying accuracy. Regretfully, in many cases of SHA, t-DASH, and StEF, the claims of high potential and efficiency of the methodology are based on a flawed application of statistics and are hardly suitable for communication to decision makers. The necessity and possibility of applying the modified tools of Earthquake Prediction Strategies, in particular the Error Diagram, introduced by G.M. Molchan in the early 1990s for evaluation of SHA, and the Seismic Roulette null hypothesis as a measure of the alerted space, is evident, and such testing must be done in advance of claiming hazardous areas and/or times. The set of errors, i.e., the rates of failure and of the alerted space-time volume, compared to those obtained in the same number of random-guess trials, permits evaluating the effectiveness of an SHA method and determining the optimal choice of parameters with regard to specified cost-benefit functions. This and other information obtained in such testing may supply us with a realistic estimate of confidence in SHA results and related recommendations on the level of risk for decision making with regard to engineering design, insurance, and emergency management. These basics of SHA evaluation are exemplified with a few cases of misleading "seismic hazard maps", "precursors", and "forecast/prediction methods".
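The evaluation described here reduces, in its simplest form, to computing two rates for each alarm strategy: the rate of failures-to-predict and the fraction of space-time placed on alert. A minimal cell-based sketch (the function name and data layout are illustrative assumptions):

```python
import numpy as np

def molchan_point(alarm, events):
    """Compute one point on a Molchan Error Diagram.

    alarm  : boolean array over space-time cells, True where an alert was declared
    events : boolean array over the same cells, True where a target event occurred
    Returns (nu, tau): rate of failures-to-predict and alerted space-time fraction.
    """
    alarm = np.asarray(alarm, dtype=bool)
    events = np.asarray(events, dtype=bool)
    nu = 1.0 - (alarm & events).sum() / events.sum()  # missed events / all events
    tau = alarm.mean()                                # alerted share of space-time
    return nu, tau
```

Random guessing falls, on average, on the diagonal nu + tau = 1, so a useful method must plot significantly below that line; comparing against the same number of random-guess trials is exactly what the Seismic Roulette null hypothesis tests.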
Borcherdt, R.D.; Glassmoyer, Gary; Cranswick, Edward
1989-01-01
The earthquakes of December 7, 1988, near Spitak, Armenia SSR, serve as another grim reminder of the serious hazard that earthquakes pose throughout the world. We extend our heartfelt sympathies to the families of the earthquake victims and intend that our cooperative scientific endeavours will help reduce losses in future earthquakes. Only through a better understanding of earthquake hazards can earthquake losses be reduced for all peoples in seismically active regions of the world. The tragic consequences of these earthquakes remind scientists and public officials alike of their urgent responsibilities to understand and mitigate the effects of earthquakes. On behalf of the U.S. Geological Survey, I would like to express appreciation to our Soviet colleagues for their kind invitation to participate in joint scientific and engineering studies. Without their cooperation and generous assistance, the conduct of these studies would not have been possible. This report provides seismologic and geologic data collected during the time period December 21, 1988, through February 2, 1989. These data are presented in their entirety to expedite analysis of the data set for inferences regarding hazard mitigation actions, applicable not only in Armenia but also in other regions of the world exposed to high seismic risk.
46 CFR 13.501 - Original application for tankerman-engineer endorsement.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 46 Shipping 1 2014-10-01 2014-10-01 false Original application for tankerman-engineer endorsement... AND SEAMEN CERTIFICATION OF TANKERMEN Requirements for Tankerman-Engineer Endorsement § 13.501 Original application for tankerman-engineer endorsement. Each applicant for a tankerman-engineer...
Leveraging geodetic data to reduce losses from earthquakes
Murray, Jessica R.; Roeloffs, Evelyn A.; Brooks, Benjamin A.; Langbein, John O.; Leith, William S.; Minson, Sarah E.; Svarc, Jerry L.; Thatcher, Wayne R.
2018-04-23
event response products and by expanded use of geodetic imaging data to assess fault rupture and source parameters. Uncertainties in the NSHM, and in regional earthquake models, are reduced by fully incorporating geodetic data into earthquake probability calculations. Geodetic networks and data are integrated into the operations and earthquake information products of the Advanced National Seismic System (ANSS). Earthquake early warnings are improved by more rapidly assessing ground displacement and the dynamic faulting process for the largest earthquakes using real-time geodetic data. Methodology for probabilistic earthquake forecasting is refined by including geodetic data when calculating evolving moment release during aftershock sequences and by better understanding the implications of transient deformation for earthquake likelihood. A geodesy program that encompasses a balanced mix of activities to sustain mission-critical capabilities, grows new competencies through the continuum of fundamental to applied research, and ensures sufficient resources for these endeavors provides a foundation by which the EHP can be a leader in the application of geodesy to earthquake science.
With this in mind, the following objectives provide a framework to guide EHP efforts: Fully utilize geodetic information to improve key products, such as the NSHM and EEW, and to address new ventures like the USGS Subduction Zone Science Plan. Expand the variety, accuracy, and timeliness of post-earthquake information products, such as PAGER (Prompt Assessment of Global Earthquakes for Response), through incorporation of geodetic observations. Determine if geodetic measurements of transient deformation can significantly improve estimates of earthquake probability. Maintain an observational strategy aligned with the target outcomes of this document that includes continuous monitoring, recording of ephemeral observations, focused data collection for use in research, and application-driven data processing and
Harp, E.L.; Noble, M.A.
1993-01-01
Investigations of earthquakes worldwide show that rock falls are the most abundant type of landslide triggered by earthquakes. An engineering classification originally used in tunnel design, known as the rock mass quality designation (Q), was modified for use in rating the susceptibility of rock slopes to seismically induced failure. Analysis of rock-fall concentrations and Q-values for the 1980 earthquake sequence near Mammoth Lakes, California, defines a well-constrained upper bound showing that the number of rock falls per site decreases rapidly with increasing Q. Because of the similarities of lithology and slope between the Eastern Sierra Nevada Range near Mammoth Lakes and the Wasatch Front near Salt Lake City, Utah, the probabilities derived from analysis of the Mammoth Lakes region were used to predict rock-fall probabilities for rock slopes near Salt Lake City in response to a magnitude 6.0 earthquake. These predicted probabilities were then used to generalize zones of rock-fall susceptibility. -from Authors
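A rapid decrease in rock-fall counts with increasing Q suggests an exponential form N(Q) ≈ a·exp(-bQ). The sketch below fits such a trend by least squares on log counts; note that Harp and Noble constrained an upper bound rather than a mean trend, so this is only an illustration of the functional form, with hypothetical names and data:

```python
import numpy as np

def fit_rockfall_trend(q, counts):
    """Least-squares fit of log N = log(a) - b*Q, i.e. N(Q) = a * exp(-b * Q).

    q      : rock mass quality designation (Q) values of the rated slopes
    counts : observed rock falls per site (must be positive)
    Returns (a, b): amplitude and decay rate of the fitted trend.
    """
    q = np.asarray(q, dtype=float)
    log_n = np.log(np.asarray(counts, dtype=float))
    slope, intercept = np.polyfit(q, log_n, 1)  # straight-line fit in log space
    return np.exp(intercept), -slope
```

An upper-bound version would instead constrain the fitted curve to lie above all observations, for example by shifting the intercept up to the largest positive residual.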
NASA Astrophysics Data System (ADS)
Perry, S.; Benthien, M.; Jordan, T. H.
2005-12-01
The SCEC/UseIT internship program is training the next generation of earthquake scientists, with methods that can be adapted to other disciplines. UseIT interns work collaboratively, in multi-disciplinary teams, conducting computer science research that is needed by earthquake scientists. Since 2002, the UseIT program has welcomed 64 students, in some two dozen majors, at all class levels, from schools around the nation. Each summer's work is posed as a "Grand Challenge." The students then organize themselves into project teams, decide how to proceed, and pool their diverse talents and backgrounds. They have traditional mentors, who provide advice and encouragement, but they also mentor one another, and this has proved to be a powerful relationship. Most begin with fear that their Grand Challenge is impossible, and end with excitement and pride about what they have accomplished. The 22 UseIT interns in summer 2005 were primarily computer science and engineering majors, with others in geology, mathematics, English, digital media design, physics, history, and cinema. The 2005 Grand Challenge was to "build an earthquake monitoring system" to aid scientists who must visualize rapidly evolving earthquake sequences and convey information to emergency personnel and the public. Most UseIT interns were engaged in software engineering, bringing new datasets and functionality to SCEC-VDO (Virtual Display of Objects), a 3D visualization software that was prototyped by interns the previous year, using Java3D and an extensible, plug-in architecture based on the Eclipse Integrated Development Environment. Other UseIT interns used SCEC-VDO to make animated movies, and experimented with imagery in order to communicate concepts and events in earthquake science. One movie-making project included the creation of an assessment to test the effectiveness of the movie's educational message. Finally, one intern created an interactive, multimedia presentation of the UseIT program.
NASA Astrophysics Data System (ADS)
Tang, Hongliang; Kang, Chengxu; Tian, Youping
2018-01-01
Moving the handling of earthquake-related administrative approvals online is an important measure to improve work efficiency and convenience for the public. Based on an analysis of the characteristics and processes of administrative licensing in the earthquake industry, this paper proposes an online processing model based on ASP technology and an online processing system based on a B/S architecture, and presents the design and implementation methods. Application of the system shows that it is simple in design and complete in function, can be used on platforms such as computers and mobile phones, and has good practicability and forward-looking design.
40 CFR 94.221 - Application of good engineering judgment.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 21 2012-07-01 2012-07-01 false Application of good engineering... § 94.221 Application of good engineering judgment. (a) The manufacturer shall exercise good engineering... the Administrator) a written description of the engineering judgment in question. (c) The...
40 CFR 94.221 - Application of good engineering judgment.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 20 2010-07-01 2010-07-01 false Application of good engineering... § 94.221 Application of good engineering judgment. (a) The manufacturer shall exercise good engineering... the Administrator) a written description of the engineering judgment in question. (c) The...
40 CFR 94.221 - Application of good engineering judgment.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 21 2013-07-01 2013-07-01 false Application of good engineering... § 94.221 Application of good engineering judgment. (a) The manufacturer shall exercise good engineering... the Administrator) a written description of the engineering judgment in question. (c) The...
40 CFR 94.221 - Application of good engineering judgment.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 20 2014-07-01 2013-07-01 true Application of good engineering... § 94.221 Application of good engineering judgment. (a) The manufacturer shall exercise good engineering... the Administrator) a written description of the engineering judgment in question. (c) The...
40 CFR 94.221 - Application of good engineering judgment.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 20 2011-07-01 2011-07-01 false Application of good engineering... § 94.221 Application of good engineering judgment. (a) The manufacturer shall exercise good engineering... the Administrator) a written description of the engineering judgment in question. (c) The...
Cramer, C.H.; Kumar, A.
2003-01-01
Engineering seismoscope data collected at distances less than 300 km for the M 7.7 Bhuj, India, mainshock are compatible with ground-motion attenuation in eastern North America (ENA). The mainshock ground-motion data have been corrected to a common geological site condition using the factors of Joyner and Boore (2000) and a classification scheme of Quaternary or Tertiary sediments or rock. We then compare these data to ENA ground-motion attenuation relations. Despite uncertainties in recording method, geological site corrections, common tectonic setting, and the amount of regional seismic attenuation, the corrected Bhuj dataset agrees with the collective predictions by ENA ground-motion attenuation relations within a factor of 2. This level of agreement is within the dataset uncertainties and the normal variance for recorded earthquake ground motions.
An earthquake is the sudden, rapid shaking of the earth, caused by the breaking and shifting of underground rock. Earthquakes can cause buildings to collapse and cause heavy ...
Reflections from the interface between seismological research and earthquake risk reduction
NASA Astrophysics Data System (ADS)
Sargeant, S.
2012-04-01
Scientific understanding of earthquakes and their attendant hazards is vital for the development of effective earthquake risk reduction strategies. Within the global disaster reduction policy framework (the Hyogo Framework for Action, overseen by the UN International Strategy for Disaster Reduction), the anticipated role of science and scientists is clear, with respect to risk assessment, loss estimation, space-based observation, early warning and forecasting. The importance of information sharing and cooperation, cross-disciplinary networks and developing technical and institutional capacity for effective disaster management is also highlighted. In practice, the degree to which seismological information is successfully delivered to and applied by individuals, groups or organisations working to manage or reduce the risk from earthquakes is variable. The challenge for scientists is to provide fit-for-purpose information that can be integrated simply into decision-making and risk reduction activities at all levels of governance and at different geographic scales, often by a non-technical audience (i.e. people without any seismological/earthquake engineering training). The interface between seismological research and earthquake risk reduction (defined here in terms of both the relationship between the science and its application, and the scientist and other risk stakeholders) is complex. This complexity is a function of a range of issues relating to communication, multidisciplinary working, politics, organisational practices, inter-organisational collaboration, working practices, sectoral cultures, individual and organisational values, worldviews and expectations. These factors can present significant obstacles to scientific information being incorporated into the decision-making process. The purpose of this paper is to present some personal reflections on the nature of the interface between the worlds of seismological research and risk reduction, and the
Industrial and Systems Engineering Applications in NASA
NASA Technical Reports Server (NTRS)
Shivers, Charles H.
2006-01-01
A viewgraph presentation on the many applications of Industrial and Systems Engineering used for safe NASA missions is shown. The topics include: 1) NASA Information; 2) Industrial Engineering; 3) Systems Engineering; and 4) Major NASA Programs.
Fiberoptic sensors for rocket engine applications
NASA Technical Reports Server (NTRS)
Ballard, R. O.
1992-01-01
A research effort was completed to summarize and evaluate the current level of technology in fiberoptic sensors for possible applications in integrated control and health monitoring (ICHM) systems in liquid propellant engines. The environment within a rocket engine is particularly severe, with very high temperatures and pressures combined with extremely rapid fluid and gas flows and high-velocity, high-intensity acoustic waves. Application of fiberoptic technology to rocket engine health monitoring is a logical evolutionary step in ICHM development and presents a significant challenge. In this extremely harsh environment, the additional flexibility of fiberoptic techniques to augment conventional sensor technologies offers abundant future potential.
Post-Earthquake Debris Management — An Overview
NASA Astrophysics Data System (ADS)
Sarkar, Raju
Every year natural disasters, such as fires, floods, earthquakes, hurricanes, landslides, tsunami, and tornadoes, challenge various communities of the world. Earthquakes strike with varying degrees of severity and pose both short- and long-term challenges to public service providers. Earthquakes generate shock waves and displace the ground along fault lines. These seismic forces can bring down buildings and bridges in a localized area and damage buildings and other structures in a far wider area. Secondary damage from fires, explosions, and localized flooding from broken water pipes can increase the amount of debris. Earthquake debris includes building materials, personal property, and sediment from landslides. The management of this debris, as well as the waste generated during reconstruction works, can place significant challenges on national and local capacities. Debris removal is a major component of every post-earthquake recovery operation. Much of the debris generated from an earthquake is not hazardous: soil, building material, and green waste, such as trees and shrubs, make up most of the volume of earthquake debris. These wastes not only create significant health problems and a very unpleasant living environment if not disposed of safely and appropriately, but can also impose economic burdens on the reconstruction phase. In practice, most of the debris may be either disposed of at landfill sites, reused as materials for construction, or recycled into useful commodities. Therefore, the debris clearance operation should focus on the geotechnical engineering approach as an important post-earthquake issue to control the quality of the incoming flow of potential soil materials. In this paper, the importance of an emergency management perspective in this geotechnical approach that takes into account the different criteria related to the operation execution is proposed by highlighting the key issues concerning the handling of the construction
Akkar, Sinan; Aldemir, A.; Askan, A.; Bakir, S.; Canbay, E.; Demirel, I.O.; Erberik, M.A.; Gulerce, Z.; Gulkan, Polat; Kalkan, Erol; Prakash, S.; Sandikkaya, M.A.; Sevilgen, V.; Ugurhan, B.; Yenier, E.
2011-01-01
An earthquake of MW = 6.1 occurred in the Elazığ region of eastern Turkey on 8 March 2010 at 02:32:34 UTC. The United States Geological Survey (USGS) reported the epicenter of the earthquake as 38.873°N-39.981°E with a focal depth of 12 km. Forty-two people lost their lives and 137 were injured during the event. The earthquake was reported to be on the left-lateral strike-slip east Anatolian fault (EAF), which is one of the two major active fault systems in Turkey. Teams from the Earthquake Engineering Research Center of the Middle East Technical University (EERC-METU) visited the earthquake area in the aftermath of the mainshock. Their reconnaissance observations were combined with interpretations of recorded ground motions for completeness. This article summarizes observations on building and ground damage in the area and provides a discussion of the recorded motions. No significant observations in terms of geotechnical engineering were made.
ERIC Educational Resources Information Center
Walter, Edward J.
1977-01-01
Presents an analysis of the causes of earthquakes. Topics discussed include (1) geological and seismological factors that determine the effect of a particular earthquake on a given structure; (2) description of some large earthquakes such as the San Francisco quake; and (3) prediction of earthquakes. (HM)
Droplet-Wall/Film Impact in IC Engine Applications
2017-08-14
Report: Droplet-Wall/Film Impact in IC Engine Applications (ARO Topic 1.4.1, under ARO's Dr. Ralph A. Anthenien). ...associated with spraying in internal combustion engines (ICEs). Fuels sprayed inside engines can impact the internal surfaces and thus not only
Yehle, Lynn A.
1977-01-01
A program to study the engineering geology of most larger Alaska coastal communities and to evaluate their earthquake and other geologic hazards was started following the 1964 Alaska earthquake; this report about the Metlakatla area, Annette Island, is a product of that program. Field-study methods were of a reconnaissance nature, and thus the interpretations in the report are tentative. The landscape of the Metlakatla Peninsula, on which the city of Metlakatla is located, is characterized by a muskeg-covered terrane of very low relief. In contrast, most of the rest of Annette Island is composed of mountainous terrane with steep valleys and numerous lakes. During the Pleistocene Epoch the Metlakatla area was presumably covered by ice several times; glaciers smoothed the present Metlakatla Peninsula and deeply eroded valleys on the rest of Annette Island. The last major deglaciation was completed probably before 10,000 years ago. Rebound of the earth's crust, believed to be related to glacial melting, has caused land emergence at Metlakatla of at least 50 ft (15 m) and probably more than 200 ft (61 m) relative to present sea level. Bedrock in the Metlakatla area is composed chiefly of hard metamorphic rocks: greenschist and greenstone with minor hornfels and schist. Strike and dip of beds are generally variable and minor offsets are common. Bedrock is of late Paleozoic to early Mesozoic age. Six types of surficial geologic materials of Quaternary age were recognized: firm diamicton; emerged shore, modern shore and delta, and alluvial deposits; very soft muskeg and other organic deposits; and firm to soft artificial fill. A combination map unit is composed of bedrock or diamicton. Geologic structure in southeastern Alaska is complex because, since at least early Paleozoic time, there have been several cycles of tectonic deformation that affected different parts of the region. Southeastern Alaska is transected by numerous faults and possible faults that attest to major
The HayWired Earthquake Scenario
Detweiler, Shane T.; Wein, Anne M.
2017-04-24
Foreword: The 1906 Great San Francisco earthquake (magnitude 7.8) and the 1989 Loma Prieta earthquake (magnitude 6.9) each motivated residents of the San Francisco Bay region to build countermeasures to earthquakes into the fabric of the region. Since Loma Prieta, bay-region communities, governments, and utilities have invested tens of billions of dollars in seismic upgrades and retrofits and replacements of older buildings and infrastructure. Innovation and state-of-the-art engineering, informed by science, including novel seismic-hazard assessments, have been applied to the challenge of increasing seismic resilience throughout the bay region. However, as long as people live and work in seismically vulnerable buildings or rely on seismically vulnerable transportation and utilities, more work remains to be done. With that in mind, the U.S. Geological Survey (USGS) and its partners developed the HayWired scenario as a tool to enable further actions that can change the outcome when the next major earthquake strikes. By illuminating the likely impacts to the present-day built environment, well-constructed scenarios can and have spurred officials and citizens to take steps that change the outcomes the scenario describes, whether used to guide more realistic response and recovery exercises or to launch mitigation measures that will reduce future risk. The HayWired scenario is the latest in a series of like-minded efforts to bring a special focus onto the impacts that could occur when the Hayward Fault again ruptures through the east side of the San Francisco Bay region as it last did in 1868. Cities in the east bay along the Richmond, Oakland, and Fremont corridor would be hit hardest by earthquake ground shaking, surface fault rupture, aftershocks, and fault afterslip, but the impacts would reach throughout the bay region and far beyond. The HayWired scenario name reflects our increased reliance on the Internet and telecommunications and also alludes to the
Rapid estimate of earthquake source duration: application to tsunami warning.
NASA Astrophysics Data System (ADS)
Reymond, Dominique; Jamelot, Anthony; Hyvernaud, Olivier
2016-04-01
We present a method for estimating the source duration of the fault rupture, based on the high-frequency envelope of teleseismic P waves, inspired by the original work of Ni et al. (2005). The main interest of knowing this seismic parameter is to detect abnormally low-velocity ruptures that are characteristic of so-called 'tsunami earthquakes' (Kanamori, 1972). The source durations estimated by this method are validated against two other independent methods: the duration obtained by W-phase inversion (Kanamori and Rivera, 2008; Duputel et al., 2012) and the duration calculated by the SCARDEC process that determines the source time function (Vallée et al., 2011). The estimated source duration is also confronted with the slowness discriminant defined by Newman and Okal (1998), which is calculated routinely for all earthquakes detected by our tsunami warning process (named PDFM2, Preliminary Determination of Focal Mechanism; Clément and Reymond, 2014). From the point of view of operational tsunami warning, numerical simulations of tsunami depend deeply on the source estimation: the better the source estimation, the better the tsunami forecast. The source duration is not directly injected into the numerical simulations of tsunami, because the kinematics of the source are presently ignored (Jamelot and Reymond, 2015). But in the case of a tsunami earthquake that occurs in the shallower part of the subduction zone, we have to consider a source in a medium of low rigidity modulus; consequently, for a given seismic moment, the source dimensions will be decreased while the slip distribution is increased, like a 'compact' source (Okal and Hébert, 2007). Inversely, a rapid 'snappy' earthquake that has poor tsunami excitation power will be characterized by a higher rigidity modulus, and will produce weaker displacement and smaller source dimensions than a 'normal' earthquake. References: Clément, J
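The duration estimate rests on measuring how long the high-frequency amplitude of the P wave stays elevated. A deliberately simplified numpy sketch of that idea follows; a smoothed absolute amplitude stands in for the band-passed envelope, and the threshold fraction is an assumed parameter, so this is not the operational algorithm of the paper:

```python
import numpy as np

def source_duration(trace, fs, smooth=1.0, threshold=0.2):
    """Crude rupture-duration estimate from a P-wave amplitude envelope.

    trace     : seismogram samples, assumed to start near the P arrival
    fs        : sampling rate (Hz)
    smooth    : envelope smoothing window (s)
    threshold : fraction of the peak envelope that defines 'still rupturing'
    """
    n = max(1, int(smooth * fs))
    kernel = np.ones(n) / n
    # Moving average of the absolute amplitude as an envelope proxy
    env = np.convolve(np.abs(trace), kernel, mode="same")
    above = np.flatnonzero(env >= threshold * env.max())
    # Duration = time between first and last threshold exceedance
    return (above[-1] - above[0]) / fs
```

In the real method, the record is first band-pass filtered to isolate the high-frequency radiation of the rupture front, which is what makes the measure sensitive to slow, tsunamigenic ruptures.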
Composite material application for liquid rocket engines
NASA Technical Reports Server (NTRS)
Heubner, S. W.
1982-01-01
With increasing emphasis on improving engine thrust-to-weight ratios to provide improved payload capabilities, weight reductions achievable by the use of composites have become attractive. Of primary significance is the weight reduction offered by composites, although high temperature properties and cost reduction were also considered. The potential for application of composites to components of Earth-to-orbit hydrocarbon engines and orbit-to-orbit LOX/H2 engines was assessed. The components most likely to benefit from the application of composites were identified, as were the critical technology areas where development would be required. Recommendations were made and a program outlined for the design, fabrication, and demonstration of specific engine components.
Aloe Vera for Tissue Engineering Applications
Rahman, Shekh; Carter, Princeton; Bhattarai, Narayan
2017-01-01
Aloe vera, also referred to as Aloe barbadensis Miller, is a succulent plant widely used for biomedical, pharmaceutical and cosmetic applications. Aloe vera has been used for thousands of years. However, recent significant advances have been made in the development of aloe vera for tissue engineering applications. Aloe vera has received considerable attention in tissue engineering due to its biodegradability, biocompatibility, and low toxicity properties. Aloe vera has been reported to have many biologically active components. The bioactive components of aloe vera have effective antibacterial, anti-inflammatory, antioxidant, and immune-modulatory effects that promote both tissue regeneration and growth. The aloe vera plant, its bioactive components, extraction and processing, and tissue engineering prospects are reviewed in this article. The use of aloe vera as tissue engineering scaffolds, gels, and films is discussed, with a special focus on electrospun nanofibers. PMID:28216559
NASA Astrophysics Data System (ADS)
Kaneko, Yoshihiro; Wallace, Laura M.; Hamling, Ian J.; Gerstenberger, Matthew C.
2018-05-01
Slow slip events (SSEs) have been documented in subduction zones worldwide, yet their implications for future earthquake occurrence are not well understood. Here we develop a relatively simple, simulation-based method for estimating the probability of megathrust earthquakes following tectonic events that induce transient stress perturbations. This method has been applied to the locked Hikurangi megathrust (New Zealand), surrounded on all sides by the 2016 Kaikoura earthquake and SSEs. Our models indicate that the probability of a M≥7.8 earthquake in the year after the Kaikoura earthquake increases by a factor of 1.3-18 relative to the pre-Kaikoura probability, and that the absolute probability is in the range of 0.6-7%. We find that the probability of a large earthquake is mainly controlled by the ratio of the total stressing rate induced by all nearby tectonic sources to the mean stress drop of earthquakes. Our method can be applied to evaluate the potential for triggering a megathrust earthquake following SSEs in other subduction zones.
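The probability gains quoted in this abstract can be related to an underlying event rate. A hedged sketch of the arithmetic, under a simple Poisson assumption with illustrative numbers (not the study's actual rates):

```python
import math

def prob_in_window(annual_rate: float, years: float = 1.0) -> float:
    """Probability of at least one event in the window, assuming a
    Poisson process: P = 1 - exp(-rate * t)."""
    return 1.0 - math.exp(-annual_rate * years)

# Illustrative: a background rate of 0.005/yr boosted 10x by a
# transient stress perturbation (both numbers are assumptions)
background = prob_in_window(0.005)
perturbed = prob_in_window(0.005 * 10)
gain = perturbed / background
print(f"{background:.3%} -> {perturbed:.3%} (gain {gain:.1f}x)")
```

For small rates the one-year probability is close to the rate itself, which is why a tenfold rate increase translates into a nearly tenfold probability gain here.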
ERIC Educational Resources Information Center
Pakiser, Louis C.
One of a series of general interest publications on science topics, the booklet provides those interested in earthquakes with an introduction to the subject. Following a section presenting an historical look at the world's major earthquakes, the booklet discusses earthquake-prone geographic areas, the nature and workings of earthquakes, earthquake…
Expanding Applications of SERS through Versatile Nanomaterials Engineering (Postprint)
2017-06-22
AFRL-RX-WP-JA-2017-0341, Expanding Applications of SERS through Versatile Nanomaterials Engineering (Postprint), contract number FA8650-15-2-5518. M. Fernanda Cardinal, Emma Vander Ende, Ryan A. Hackler, Michael O. McAnally
Engineering Stem Cells for Biomedical Applications
Yin, Perry T.; Han, Edward; Lee, Ki-Bum
2018-01-01
Stem cells are characterized by a number of useful properties, including their ability to migrate, differentiate, and secrete a variety of therapeutic molecules such as immunomodulatory factors. As such, numerous pre-clinical and clinical studies have utilized stem cell-based therapies and demonstrated their tremendous potential for the treatment of various human diseases and disorders. Recently, efforts have focused on engineering stem cells in order to further enhance their innate abilities as well as to confer them with new functionalities, which can then be used in various biomedical applications. These engineered stem cells can take on a number of forms. For instance, engineered stem cells encompass the genetic modification of stem cells as well as the use of stem cells for gene delivery, nanoparticle loading and delivery, and even small molecule drug delivery. The present Review gives an in-depth account of the current status of engineered stem cells, including potential cell sources, the most common methods used to engineer stem cells, and the utilization of engineered stem cells in various biomedical applications, with a particular focus on tissue regeneration, the treatment of immunodeficiency diseases, and cancer. PMID:25772134
Are seismic hazard assessment errors and earthquake surprises unavoidable?
NASA Astrophysics Data System (ADS)
Kossobokov, Vladimir
2013-04-01
Why do earthquake occurrences bring us so many surprises? The answer seems evident if we review the relationships that are commonly used to assess seismic hazard. The time span of physically reliable seismic history is still a small portion of a rupture recurrence cycle at an earthquake-prone site, which makes any kind of reliable probabilistic statement about narrowly localized seismic hazard premature. Moreover, the seismic evidence accumulated to date demonstrates clearly that most of the empirical relations commonly accepted in the early history of instrumental seismology prove erroneous when tests of statistical significance are applied. Seismic events, including mega-earthquakes, cluster, displaying behaviors that are far from independent or periodic. Their distribution in space is possibly fractal and definitely far from uniform, even within a single segment of a fault zone. Such a situation contradicts the generally accepted assumptions used for analytically tractable models or computer simulations, and it complicates the design of reliable methodologies for realistic earthquake hazard assessment, as well as the search for and definition of precursory behaviors to be used for forecast/prediction purposes. As a result, the conclusions drawn from such simulations and analyses can mislead to scientifically groundless applications, which is unwise and extremely dangerous in assessing expected societal risks and losses. For example, a systematic comparison of the GSHAP peak ground acceleration estimates with those related to actual strong earthquakes unfortunately discloses the gross inadequacy of this "probabilistic" product, which appears unacceptable for any kind of responsible seismic risk evaluation and knowledgeable disaster prevention. The self-evident shortcomings and failures of GSHAP are an appeal to all earthquake scientists and engineers for an urgent revision of the global seismic hazard maps from first principles, including the background methodologies involved, such that there becomes: (a) a
Earthquake Hazard Analysis Methods: A Review
NASA Astrophysics Data System (ADS)
Sari, A. M.; Fakhrurrozi, A.
2018-02-01
Earthquakes are among the natural disasters with the most significant impacts in terms of risk and damage. Countries such as China, Japan, and Indonesia are located on active plate boundaries and experience earthquakes more frequently than other countries. Several methods of earthquake hazard analysis have been applied, for example analysis of seismic zones and earthquake hazard micro-zonation, the Neo-Deterministic Seismic Hazard Analysis (N-DSHA) method, and remote sensing. In application, it is necessary to review the effectiveness of each technique in advance. Considering efficiency of time and accuracy of data, remote sensing is used as a reference to assess earthquake hazard accurately and quickly, as only limited time is available for the right decision-making shortly after a disaster. Exposed areas and potentially vulnerable areas due to earthquake hazards can be easily analyzed using remote sensing. Technological developments in remote sensing, such as GeoEye-1, add value and excellence to the use of remote sensing as a method for assessing earthquake risk and damage. Furthermore, the use of this technique is expected to be considered in designing policies for disaster management in particular, and it can reduce the risk of natural disasters such as earthquakes in Indonesia.
A Viscoelastic earthquake simulator with application to the San Francisco Bay region
Pollitz, Fred F.
2009-01-01
Earthquake simulation on synthetic fault networks carries great potential for characterizing the statistical patterns of earthquake occurrence. I present an earthquake simulator based on elastic dislocation theory. It accounts for the effects of interseismic tectonic loading, static stress steps at the time of earthquakes, and postearthquake stress readjustment through viscoelastic relaxation of the lower crust and mantle. Earthquake rupture initiation and termination are determined with a Coulomb failure stress criterion and the static cascade model. The simulator is applied to interacting multifault systems: one, a synthetic two-fault network, and the other, a fault network representative of the San Francisco Bay region. The faults are discretized both along strike and along dip and can accommodate both strike slip and dip slip. Stress and seismicity functions are evaluated over 30,000 yr trial time periods, resulting in a detailed statistical characterization of the fault systems. Seismicity functions such as the coefficient of variation and a- and b-values exhibit systematic patterns with respect to simple model parameters. This suggests that reliable estimation of the controlling parameters of an earthquake simulator is a prerequisite to the interpretation of its output in terms of seismic hazard.
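The seismicity functions this simulator evaluates, such as the coefficient of variation of interevent times and the b-value, can be computed from any synthetic catalog of event times and magnitudes. A hedged sketch of both statistics (the maximum-likelihood b-value estimator is the standard Aki formula, not a detail taken from this paper):

```python
import math

def coefficient_of_variation(event_times):
    """CoV of interevent times: 1 for a Poisson process,
    < 1 quasi-periodic, > 1 clustered."""
    gaps = [t2 - t1 for t1, t2 in zip(event_times, event_times[1:])]
    mean = sum(gaps) / len(gaps)
    var = sum((g - mean) ** 2 for g in gaps) / len(gaps)
    return math.sqrt(var) / mean

def b_value(magnitudes, m_min):
    """Maximum-likelihood b-value (Aki, 1965):
    b = log10(e) / (mean(M) - m_min), for M >= m_min."""
    ms = [m for m in magnitudes if m >= m_min]
    return math.log10(math.e) / (sum(ms) / len(ms) - m_min)

# A perfectly periodic catalog has CoV = 0
print(coefficient_of_variation([0, 100, 200, 300]))
```

Tracking how these numbers shift as the simulator's controlling parameters change is exactly the kind of systematic pattern the abstract describes.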
A physically-based earthquake recurrence model for estimation of long-term earthquake probabilities
Ellsworth, William L.; Matthews, Mark V.; Nadeau, Robert M.; Nishenko, Stuart P.; Reasenberg, Paul A.; Simpson, Robert W.
1999-01-01
A physically motivated model for earthquake recurrence based on the Brownian relaxation oscillator is introduced. The renewal process defining this point-process model can be described by the steady rise of a state variable from the ground state to a failure threshold, as modulated by Brownian motion. Failure times in this model follow the Brownian passage time (BPT) distribution, which is specified by the mean time to failure, μ, and the aperiodicity of the mean, α (equivalent to the familiar coefficient of variation). Analysis of 37 series of recurrent earthquakes, M -0.7 to 9.2, suggests a provisional generic value of α = 0.5. For this value of α, the hazard function (instantaneous failure rate of survivors) exceeds the mean rate for times > μ⁄2, and is approximately 2⁄μ for all times > μ. Application of this model to the next M 6 earthquake on the San Andreas fault at Parkfield, California, suggests that the annual probability of the earthquake is between 1:10 and 1:13.
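The hazard behavior described above can be checked numerically. A sketch of the BPT (inverse Gaussian) density and its hazard function, using the generic α = 0.5 from the abstract; the crude numerical integration of the survivor function is my own shortcut, not the authors' method:

```python
import math

def bpt_pdf(t, mu, alpha):
    """Brownian passage-time (inverse Gaussian) density with mean mu
    and aperiodicity alpha (coefficient of variation)."""
    return math.sqrt(mu / (2.0 * math.pi * alpha**2 * t**3)) * \
        math.exp(-(t - mu) ** 2 / (2.0 * mu * alpha**2 * t))

def bpt_hazard(t, mu, alpha, dt=1e-4):
    """Instantaneous failure rate of survivors, f(t) / S(t), with the
    survivor function S(t) obtained by midpoint numerical integration."""
    steps = int(t / dt)
    cdf = sum(bpt_pdf((i + 0.5) * dt, mu, alpha) * dt for i in range(steps))
    return bpt_pdf(t, mu, alpha) / (1.0 - cdf)

mu, alpha = 1.0, 0.5
# For alpha = 0.5 the hazard settles near ~2/mu at large t, as stated above
print(bpt_hazard(2.5 * mu, mu, alpha))
```

The quasi-stationary hazard, approximately 2⁄μ here, is what distinguishes the BPT model from the exponential (Poisson) model, whose hazard stays at 1⁄μ forever.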
Geotechnical effects of the 2015 magnitude 7.8 Gorkha, Nepal, earthquake and aftershocks
Moss, Robb E. S.; Thompson, Eric M.; Kieffer, D Scott; Tiwari, Binod; Hashash, Youssef M A; Acharya, Indra; Adhikari, Basanta; Asimaki, Domniki; Clahan, Kevin B.; Collins, Brian D.; Dahal, Sachindra; Jibson, Randall W.; Khadka, Diwakar; Macdonald, Amy; Madugo, Chris L M; Mason, H Benjamin; Pehlivan, Menzer; Rayamajhi, Deepak; Uprety, Sital
2015-01-01
This article summarizes the geotechnical effects of the 25 April 2015 M 7.8 Gorkha, Nepal, earthquake and aftershocks, as documented by a reconnaissance team that undertook a broad engineering and scientific assessment of the damage and collected perishable data for future analysis. Brief descriptions are provided of ground shaking, surface fault rupture, landsliding, soil failure, and infrastructure performance. The goal of this reconnaissance effort, led by Geotechnical Extreme Events Reconnaissance, is to learn from earthquakes and mitigate hazards in future earthquakes.
NASA Astrophysics Data System (ADS)
Touati, Sarah; Naylor, Mark; Main, Ian
2016-02-01
The recent spate of mega-earthquakes since 2004 has led to speculation of an underlying change in the global `background' rate of large events. At a regional scale, detecting changes in background rate is also an important practical problem for operational forecasting and risk calculation, for example due to volcanic processes, seismicity induced by fluid injection or withdrawal, or due to redistribution of Coulomb stress after natural large events. Here we examine the general problem of detecting changes in background rate in earthquake catalogues with and without correlated events, for the first time using the Bayes factor as a discriminant for models of varying complexity. First we use synthetic Poisson (purely random) and Epidemic-Type Aftershock Sequence (ETAS) models (which also allow for earthquake triggering) to test the effectiveness of many standard methods of addressing this question. These fall into two classes: those that evaluate the relative likelihood of different models, for example using Information Criteria or the Bayes Factor; and those that evaluate the probability of the observations (including extreme events or clusters of events) under a single null hypothesis, for example by applying the Kolmogorov-Smirnov and `runs' tests, and a variety of Z-score tests. The results demonstrate that the effectiveness among these tests varies widely. Information Criteria worked at least as well as the more computationally expensive Bayes factor method, and the Kolmogorov-Smirnov and runs tests proved relatively ineffective in reliably detecting a change point. We then apply the methods tested to events at different thresholds above magnitude M ≥ 7 in the global earthquake catalogue since 1918, after first declustering the catalogue. This is most effectively done by removing likely correlated events using a much lower magnitude threshold (M ≥ 5), where triggering is much more obvious. We find no strong evidence that the background rate of large
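The Information Criteria approach the abstract finds effective can be sketched for the simplest case: a constant Poisson rate versus a single change point with two rates, compared by AIC. The data here are synthetic, not the study's catalogue:

```python
import math
import random

def poisson_loglik(counts, rate):
    """Log-likelihood of per-interval event counts under a constant
    Poisson rate."""
    return sum(c * math.log(rate) - rate - math.lgamma(c + 1) for c in counts)

def aic_change_point(counts):
    """AIC of one constant rate (k=1 parameter) vs. the best single
    change point with two rates (k=3: two rates plus the change index)."""
    n = len(counts)
    aic_const = 2 * 1 - 2 * poisson_loglik(counts, max(sum(counts) / n, 1e-9))
    best = float("inf")
    for i in range(1, n):
        a, b = counts[:i], counts[i:]
        ll = poisson_loglik(a, max(sum(a) / len(a), 1e-9)) + \
             poisson_loglik(b, max(sum(b) / len(b), 1e-9))
        best = min(best, 2 * 3 - 2 * ll)
    return aic_const, best  # lower AIC wins

random.seed(1)
# Synthetic counts whose rate roughly doubles halfway through:
# the change-point model should win decisively
data = [sum(random.random() < 0.3 for _ in range(10)) for _ in range(50)] + \
       [sum(random.random() < 0.6 for _ in range(10)) for _ in range(50)]
aic_const, aic_change = aic_change_point(data)
print(aic_const > aic_change)
```

The Bayes factor analysis in the paper plays the same model-comparison role, but with an explicit penalty arising from integrating over each model's parameter space rather than the fixed 2k penalty of AIC.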
Study of small turbofan engines applicable to single-engine light airplanes
NASA Technical Reports Server (NTRS)
Merrill, G. L.
1976-01-01
The design, efficiency and cost factors are investigated for application of turbofan propulsion engines to single-engine, general aviation light airplanes. A companion study of a hypothetical engine family of a thrust range suitable to such aircraft and having a high degree of commonality of design features and parts is presented. Future turbofan-powered light airplanes can have lower fuel consumption, lower weight, reduced airframe maintenance requirements and improved engine overhaul periods as compared to current piston-engine powered airplanes. Achievement of compliance with noise and chemical emission regulations is expected without impairing performance, operating cost or safety.
The TRIPOD e-learning Platform for the Training of Earthquake Safety Assessment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Coppari, S.; Di Pasquale, G.; Goretti, A.
2008-07-08
The paper summarizes the results of the in-progress EU project TRIPOD (Training Civil Engineers on Post-Earthquake Safety Assessment of Damaged Buildings), funded under the Leonardo Da Vinci program. The main theme of the project is the development of a methodology and a learning platform for the training of technicians involved in post-earthquake building safety inspections. In the event of a catastrophic earthquake, emergency building inspections constitute a major undertaking with severe social impact. Given the inevitably chaotic conditions and the urgent need for a great number of specialized individuals to carry out inspections, past experience indicates that inspection teams are often formed in an ad-hoc manner, under stressful conditions, at varying levels of technical expertise and experience, sometimes impairing the reliability and consistency of the inspection results. Furthermore, each country has its own building damage and safety assessment methodology, developed according to its experience, laws, building technology and seismicity. This also holds for the partners participating in the project (Greece, Italy, Turkey, Cyprus), all of which come from seismically sensitive Mediterranean countries. The project aims at alleviating the above shortcomings by designing and developing a training methodology and e-platform, forming a complete training program targeted at inspection engineers, specialized personnel and civil protection agencies. The e-learning platform will provide flexible and friendly authoring mechanisms, self-teaching and assessment capabilities, course and trainee management, etc. Courses will also be made available as stand-alone multimedia applications on CD and in the form of a complete pocket handbook. Moreover, the project will offer the possibility of exchanging different experiences and practices: a first step towards the harmonization of methodologies and tools of different countries sharing similar problems. Finally, through
Borcherdt, R.D.; Glassmoyer, Gary; Cranswick, Edward
1989-01-01
The earthquakes of December 7, 1988, near Spitak, Armenia SSR, serve as another grim reminder of the serious hazard that earthquakes pose throughout the world. We extend our heartfelt sympathies to the families of the earthquake victims and intend that our cooperative scientific endeavours will help reduce losses in future earthquakes. Only through a better understanding of earthquake hazards can earthquake losses be reduced for all peoples in seismically active regions of the world.The tragic consequences of these earthquakes remind scientists and public officials alike of their urgent responsibilities to understand and mitigate the effects of earthquakes. On behalf of the U.S. Geological Survey, I would like to express appreciation to our Soviet colleagues for their kind invitation to participate in joint scientific and engineering studies. Without their cooperation and generous assistance, the conduct of these studies would not have been possible.This report provides seismologic and geologic data collected during the time period December 21, 1988, through February 2, 1989. These data are presented in their entirety to expedite analysis of the data set for inferences regarding hazard mitigation actions, applicable not only in Armenia but other regions of the world exposed to high seismic risk
CMC Technology Advancements for Gas Turbine Engine Applications
NASA Technical Reports Server (NTRS)
Grady, Joseph E.
2013-01-01
CMC research at NASA Glenn is focused on aircraft propulsion applications. The objective is to enable reduced engine emissions and fuel consumption for more environmentally friendly aircraft. Engine system studies show that incorporation of ceramic composites into turbine engines will enable significant reductions in emissions and fuel burn due to increased engine efficiency resulting from reduced cooling requirements for hot section components. This presentation will describe recent progress and challenges in developing fiber and matrix constituents for 2700 F CMC turbine applications. In addition, ongoing research in the development of durable environmental barrier coatings, ceramic joining integration technologies and life prediction methods for CMC engine components will be reviewed.
Metal Matrix Composites for Rocket Engine Applications
NASA Technical Reports Server (NTRS)
McDonald, Kathleen R.; Wooten, John R.
2000-01-01
This document is from a presentation about the applications of Metal Matrix Composites (MMC) in rocket engines. Both NASA and the Air Force have goals which would reduce the costs and the weight of launching spacecraft. Charts show the engine weight distribution for both reuseable and expendable engine components. The presentation reviews the operating requirements for several components of the rocket engines. The next slide reviews the potential benefits of MMCs in general and in use as materials for Advanced Pressure Casting. The next slide reviews the drawbacks of MMCs. The reusable turbopump housing is selected to review for potential MMC application. The presentation reviews solutions for reusable turbopump materials, pointing out some of the issues. It also reviews the development of some of the materials.
Assessing the Applicability of Earthquake Early Warning in Nicaragua.
NASA Astrophysics Data System (ADS)
Massin, F.; Clinton, J. F.; Behr, Y.; Strauch, W.; Cauzzi, C.; Boese, M.; Talavera, E.; Tenorio, V.; Ramirez, J.
2016-12-01
Nicaragua, like much of Central America, suffers from frequent damaging earthquakes (six M7+ earthquakes occurred in the last 100 years). Thrust events occur at the Middle America Trench, where the Cocos plate subducts eastward beneath the Caribbean plate at 72-81 mm/yr. Shallow crustal events occur on shore, with potential for extensive damage, as demonstrated in 1972 by a M6.2 earthquake 5 km beneath Managua. This seismotectonic setting is challenging for Earthquake Early Warning (EEW) because the target events derive both from offshore seismicity, with potentially large lead times but uncertain locations, and from shallow seismicity in close proximity to densely urbanized areas, where an early warning would be short if available at all. Nevertheless, EEW could reduce Nicaragua's earthquake exposure. The Swiss Development and Cooperation Fund and the Nicaraguan Government have funded a collaboration between the Swiss Seismological Service (SED) at ETH Zurich and the Nicaraguan Geosciences Institute (INETER) in Managua to investigate and build a prototype EEW system for Nicaragua and the wider region. In this contribution, we present the potential of EEW to effectively alert Nicaragua and the neighbouring regions. We model alert time delays using all available seismic stations (existing and planned) in the region, as well as communication and processing delays (observed and optimal), to estimate current and potential performance of EEW alerts. Theoretical results are verified with the output of the Virtual Seismologist in SeisComP3 (VS(SC3)). VS(SC3) is implemented in the INETER SeisComP3 system for real-time operation and as an offline instance that simulates real-time operation, to record processing delays of playback events. We compare our results with similar studies for Europe, California and New Zealand. We further highlight current capabilities and challenges for providing EEW alerts in Nicaragua. We also discuss how combining different algorithms, like e.g. VS
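The alert-delay modeling described above can be illustrated with a back-of-envelope warning-time estimate. The homogeneous wave speeds and fixed delay terms below are generic assumptions for the sketch, not INETER's actual network figures:

```python
def warning_time(epicentral_km_station: float,
                 epicentral_km_target: float,
                 vp: float = 6.5, vs: float = 3.5,
                 trigger_delay_s: float = 2.0,
                 processing_s: float = 4.0) -> float:
    """Seconds of warning at the target city: S-wave arrival time at
    the target minus (P-wave arrival at the triggering station plus
    telemetry/trigger and processing delays). Negative values mean
    the alert arrives after the strong shaking."""
    s_arrival = epicentral_km_target / vs
    alert_issued = epicentral_km_station / vp + trigger_delay_s + processing_s
    return s_arrival - alert_issued

# Offshore trench event 150 km from the target city, with the nearest
# coastal station 60 km from the epicenter (illustrative distances)
print(round(warning_time(60.0, 150.0), 1))
```

The same arithmetic shows the hard case the abstract raises: for a shallow event directly beneath a city, the S-wave outruns the alert and the warning time goes negative.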
NASA Astrophysics Data System (ADS)
Kafka, A.; Barnett, M.; Ebel, J.; Bellegarde, H.; Campbell, L.
2004-12-01
The occurrence of the 2004 Parkfield earthquake provided a unique "teachable moment" for students in our science course for teacher education majors. The course uses seismology as a medium for teaching a wide variety of science topics appropriate for future teachers. The 2004 Parkfield earthquake occurred just 15 minutes after our students completed a lab on earthquake processes and earthquake prediction. That lab included a discussion of the Parkfield Earthquake Prediction Experiment as a motivation for the exercises they were working on that day. Furthermore, this earthquake was recorded on an AS1 seismograph right in their lab, just minutes after the students left. About an hour after we recorded the earthquake, the students were able to see their own seismogram of the event in the lecture part of the course, which provided an excellent teachable moment for a lecture/discussion on how the occurrence of the 2004 Parkfield earthquake might affect seismologists' ideas about earthquake prediction. The specific lab exercise that the students were working on just before we recorded this earthquake was a "sliding block" experiment that simulates earthquakes in the classroom. The experimental apparatus includes a flat board on top of which are blocks of wood attached to a bungee cord and a string wrapped around a hand crank. Plate motion is modeled by slowly turning the crank, and earthquakes are modeled as events in which the block slips ("blockquakes"). We scaled the earthquake data and the blockquake data (using how much the string moved as a proxy for time) so that we could compare blockquakes and earthquakes. This provided an opportunity to use interevent-time histograms to teach about earthquake processes, probability, and earthquake prediction, and to compare earthquake sequences with blockquake sequences. We were able to show the students, using data obtained directly from their own lab, how global earthquake data fit a Poisson exponential distribution better
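The interevent-time comparison in that lab can be reproduced numerically. A sketch that generates a synthetic Poisson catalog and bins its interevent times into the histogram the students built; the rate and bin width are arbitrary classroom-scale choices:

```python
import math
import random

random.seed(42)

# Simulate a Poisson process: exponential interevent times, mean 10 units
rate = 0.1
times, t = [], 0.0
for _ in range(2000):
    t += random.expovariate(rate)
    times.append(t)

gaps = [b - a for a, b in zip(times, times[1:])]

# Histogram of interevent times; Poisson theory predicts bin counts
# proportional to exp(-rate * gap), i.e. a decaying exponential
width = 5.0
bins = [0] * 8
for g in gaps:
    i = int(g // width)
    if i < len(bins):
        bins[i] += 1

for i, n in enumerate(bins):
    expected = len(gaps) * (math.exp(-rate * i * width)
                            - math.exp(-rate * (i + 1) * width))
    print(f"{i * width:4.0f}-{(i + 1) * width:3.0f}: "
          f"observed {n:4d}  expected {expected:6.1f}")
```

Overlaying observed and expected counts this way is precisely the exercise of checking whether earthquake (or blockquake) interevent times follow the exponential distribution of a memoryless process.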
A global building inventory for earthquake loss estimation and risk management
Jaiswal, K.; Wald, D.; Porter, K.
2010-01-01
We develop a global database of building inventories using taxonomy of global building types for use in near-real-time post-earthquake loss estimation and pre-earthquake risk analysis, for the U.S. Geological Survey's Prompt Assessment of Global Earthquakes for Response (PAGER) program. The database is available for public use, subject to peer review, scrutiny, and open enhancement. On a country-by-country level, it contains estimates of the distribution of building types categorized by material, lateral force resisting system, and occupancy type (residential or nonresidential, urban or rural). The database draws on and harmonizes numerous sources: (1) UN statistics, (2) UN Habitat's demographic and health survey (DHS) database, (3) national housing censuses, (4) the World Housing Encyclopedia and (5) other literature. © 2010, Earthquake Engineering Research Institute.
Protein engineering and its applications in food industry.
Kapoor, Swati; Rafiq, Aasima; Sharma, Savita
2017-07-24
Protein engineering is a young discipline that has branched out from the field of genetic engineering. It builds on the available knowledge of protein structure and function, tools and instruments, software, bioinformatics databases, cloned genes, vectors, recombinant strains and other materials that can lead to changes in the protein backbone. A protein produced properly through a genetic engineering process is one that is able to fold correctly and perform its particular function(s) efficiently even after being subjected to engineering practices. A protein is modified through its gene or chemically; however, modification through the gene is easier. There is no specific limitation on protein engineering tools: any technique that can change the amino acid composition of a protein and result in modification of its structure or function falls within the frame of protein engineering, although some common tools are used to reach specific targets. More active industrial and pharmaceutical proteins have been produced by protein engineering to introduce new functions as well as to change their interactions with the surrounding environment. A variety of protein engineering applications have been reported in the literature, ranging from biocatalysis for food and industry to environmental, medical and nanobiotechnology applications. Successful combinations of various protein engineering methods have led to successful results in the food industry and have created scope for maintaining the quality of finished products after processing.
Charles Darwin's earthquake reports
NASA Astrophysics Data System (ADS)
Galiev, Shamil
2010-05-01
problems which began to be discussed only recently. Earthquakes often precede volcanic eruptions. According to Darwin, earthquake-induced shock may be a common mechanism of simultaneous eruptions of volcanoes separated by long distances. In particular, Darwin wrote that '… the elevation of many hundred square miles of territory near Concepcion is part of the same phenomenon, with that splashing up, if I may so call it, of volcanic matter through the orifices in the Cordillera at the moment of the shock;…'. According to Darwin, the crust is a system in which fractured zones and zones of seismic and volcanic activity interact. Darwin thus formulated the task of considering together the processes now studied as seismology and volcanology. However, the difficulties are such that the study of interactions between earthquakes and volcanoes began only recently, and his works on this had relatively little impact on the development of the geosciences. In this report, we discuss how the latest data on seismic and volcanic events support Darwin's observations and ideas about the 1835 Chilean earthquake. Material from researchspace.auckland.ac.nz/handle/2292/4474 is used. We show how modern mechanical tests from impact engineering and simple experiments with weakly cohesive materials also support his observations and ideas. On the other hand, we have developed a mathematical theory of earthquake-induced catastrophic wave phenomena. This theory allows us to explain the most important aspects of Darwin's earthquake reports. This is achieved by simplifying the fundamental governing equations of the problems considered to strongly nonlinear wave equations, whose solutions are constructed with the help of analytic and numerical techniques. The solutions can model different strongly nonlinear wave phenomena that arise in a variety of physical contexts. A comparison with relevant experimental observations is also presented.
Historical and recent large megathrust earthquakes in Chile
NASA Astrophysics Data System (ADS)
Ruiz, S.; Madariaga, R.
2018-05-01
Recent earthquakes in Chile - the 2014 Mw 8.2 Iquique, 2015 Mw 8.3 Illapel and 2016 Mw 7.6 Chiloé events - have exposed problems with the straightforward application of ideas about seismic gaps, earthquake periodicity and the general forecasting of large megathrust earthquakes. In northern Chile, before the 2014 Iquique earthquake, four large earthquakes were reported in written chronicles (1877, 1786, 1615 and 1543); in north-central Chile, before the 2015 Illapel event, three large earthquakes were reported (1943, 1880, 1730); and the 2016 Chiloé earthquake occurred in the southern zone of the 1960 Valdivia megathrust rupture, where other large earthquakes occurred in 1575, 1737 and 1837. The periodicity of these events has been proposed as a good basis for long-term forecasting. However, the seismological aspects of historical Chilean earthquakes were inferred mainly from old chronicles written before subduction in Chile was discovered. Here we use the original descriptions of earthquakes to re-analyze the historical archives. Our interpretation shows that a-priori ideas, like seismic gaps and characteristic earthquakes, influenced the estimation of the magnitude, location and rupture area of the older Chilean events. On the other hand, advances in the characterization of the rheological factors that control the contact between the Nazca and South American plates, together with studies of tsunami effects, provide better estimates of the location of historical earthquakes along the seismogenic plate interface. Our re-interpretation of historical earthquakes reveals a large diversity of earthquake types; in particular, there is a major difference between giant earthquakes that break the entire plate interface and Mw 8.0 events that break only a portion of it.
An earthquake happens when two blocks of the earth suddenly slip past one another. Earthquakes strike suddenly, violently, and without warning at any time of the day or night. If an earthquake occurs in a populated area, it may cause ...
Nanofibers and their applications in tissue engineering
Vasita, Rajesh; Katti, Dhirendra S
2006-01-01
Developing scaffolds that mimic the architecture of tissue at the nanoscale is one of the major challenges in the field of tissue engineering. The development of nanofibers has greatly enhanced the scope for fabricating scaffolds that can potentially meet this challenge. Currently, there are three techniques available for the synthesis of nanofibers: electrospinning, self-assembly, and phase separation. Of these techniques, electrospinning is the most widely studied technique and has also demonstrated the most promising results in terms of tissue engineering applications. The availability of a wide range of natural and synthetic biomaterials has broadened the scope for development of nanofibrous scaffolds, especially using the electrospinning technique. The three dimensional synthetic biodegradable scaffolds designed using nanofibers serve as an excellent framework for cell adhesion, proliferation, and differentiation. Therefore, nanofibers, irrespective of their method of synthesis, have been used as scaffolds for musculoskeletal tissue engineering (including bone, cartilage, ligament, and skeletal muscle), skin tissue engineering, vascular tissue engineering, neural tissue engineering, and as carriers for the controlled delivery of drugs, proteins, and DNA. This review summarizes the currently available techniques for nanofiber synthesis and discusses the use of nanofibers in tissue engineering and drug delivery applications. PMID:17722259
NASA Astrophysics Data System (ADS)
Castaldini, D.; Genevois, R.; Panizza, M.; Puccinelli, A.; Berti, M.; Simoni, A.
This paper illustrates research addressing earthquake-induced surface effects by means of a multidisciplinary approach: tectonics, neotectonics, seismology, geology, hydrogeology, geomorphology, and soil/rock mechanics have all been considered. The research aims to verify, in areas affected by earthquake-triggered landslides, a methodology for the identification of potentially unstable areas, and was organized into regional- and local-scale studies. In order to better capture the complexity of the relationships among all the parameters affecting the stability of rock slopes under static and dynamic conditions, a new integrated approach, Rock Engineering Systems (RES), was applied in the Northern Apennines. In the paper, the different phases of the research are described in detail and an example of the application of the RES method in a sample area is reported. A significant aspect of the study is its attempt to move beyond exclusively qualitative research into the relationship between earthquakes and induced surface effects, and to advance the idea of quantifying this interaction.
Defining "Acceptable Risk" for Earthquakes Worldwide
NASA Astrophysics Data System (ADS)
Tucker, B.
2001-05-01
The greatest and most rapidly growing earthquake mortality risk is in developing countries. Further, earthquake risk management actions of the last 50 years have reduced the average lethality of earthquakes in earthquake-threatened industrialized countries (this is separate from the trend of increasing fiscal cost of earthquakes there). Despite these clear trends, every new earthquake in a developing country is described in the media as a "wake-up" call announcing the risk these countries face. GeoHazards International (GHI) works at both the community and the policy levels to reduce earthquake risk. GHI reduces death and injury by helping vulnerable communities recognize and manage their risk: raising awareness, building local institutions to manage that risk, and strengthening schools to protect and train the community's future generations. At the policy level, GHI, in collaboration with research partners, is examining whether "acceptance" of these large risks, by people in these countries and by international aid and development organizations, explains the lack of activity in reducing them. The goal of this pilot project - the Global Earthquake Safety Initiative (GESI) - is to develop and evaluate a means of measuring the risk and the effectiveness of risk mitigation actions in the world's largest, most vulnerable cities: in short, to develop an earthquake risk index. One application of this index is to compare the risk and the risk mitigation effort of "comparable" cities. By this means, Lima, for example, can compare the risk of its citizens dying in earthquakes with the corresponding risk in Santiago and Guayaquil. The authorities of Delhi and Islamabad can compare the relative earthquake risk faced by their school children. The index can be used to measure the effectiveness of alternative mitigation projects, to set goals for mitigation projects, and to track progress toward those goals. The preliminary
Applications of the gambling score in evaluating earthquake predictions and forecasts
NASA Astrophysics Data System (ADS)
Zhuang, Jiancang; Zechar, Jeremy D.; Jiang, Changsheng; Console, Rodolfo; Murru, Maura; Falcone, Giuseppe
2010-05-01
This study presents a new method, the gambling score, for scoring the performance of earthquake forecasts or predictions. Unlike most other scoring procedures, which require a regular forecasting scheme and treat every earthquake equally regardless of its magnitude, this new scoring method compensates for the risk the forecaster has taken. Starting with a certain number of reputation points, a forecaster who makes a prediction or forecast is assumed to have bet some of those reputation points. The reference model, which plays the role of the house, determines according to a fair rule how many reputation points the forecaster gains on success, and takes away the points the forecaster bet on failure. The method is also extended to the continuous case of point-process models, where the reputation points bet by the forecaster become a continuous mass on the space-time-magnitude range of interest. For discrete predictions, we apply the method to evaluate the performance of Shebalin's predictions made with the Reverse Tracing of Precursors (RTP) algorithm and of the predictions issued by the Annual Consultation Meeting on Earthquake Tendency held by the China Earthquake Administration. For the continuous case, we use it to compare probability forecasts of seismicity in the Abruzzo region before and after the L'Aquila earthquake based on the ETAS model and the PPE model.
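The fair-bet rule described above can be sketched in a few lines. The following Python fragment is an illustrative simplification for the binary-alarm case (the function name and inputs are assumptions, not the paper's actual implementation): a forecaster who bets r reputation points against a reference model assigning probability p0 to the target event gains r*(1-p0)/p0 on success and loses r on failure, so the expected gain under the reference model is exactly zero.

```python
def gambling_score(bets, outcomes, ref_probs):
    """Net reputation change for a sequence of binary predictions.

    bets      : reputation points wagered on each prediction
    outcomes  : True if the predicted event occurred
    ref_probs : reference-model probability p0 of each event

    A successful bet of r points pays r * (1 - p0) / p0, so under the
    reference model the expected gain is p0*r*(1-p0)/p0 - (1-p0)*r = 0.
    """
    total = 0.0
    for r, hit, p0 in zip(bets, outcomes, ref_probs):
        total += r * (1.0 - p0) / p0 if hit else -r
    return total
```

Risking one point on a rare event (p0 = 0.1) thus pays nine points if the event occurs, rewarding forecasts that beat the house on hard targets while only the wagered point is lost otherwise.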
NASA Astrophysics Data System (ADS)
Dandoulaki, M.; Kourou, A.; Panoutsopoulou, M.
2009-04-01
It is widely accepted that earthquake education is the path to earthquake protection. Nonetheless, experience demonstrates that knowing what to do does not necessarily lead to better behaviour during a real earthquake. A research project titled "Seismopolis" - "Pilot Integrated System for Public Familiarization with Earthquakes and Information on Earthquake Protection" - aimed to improve people's behaviour through an appropriate combination of knowledge transfer and the virtual experience of an earthquake. Seismopolis combines well-established education media such as books and leaflets with new technologies such as earthquake simulation and virtual reality. It comprises a series of five main spaces that visitors pass through one by one. Space 1: Reception and introductory information. Visitors are given fundamental information on earthquakes and earthquake protection, as well as on appropriate behaviour during an earthquake. Space 2: Earthquake simulation room. Visitors experience an earthquake in a room: a typical kitchen is set on a shake-table area (3 m x 6 m planar triaxial shake table) and is shaken in both horizontal and vertical directions by playing back seismograms of real or simulated earthquakes. Space 3: Virtual reality room. Wearing stereoscopic glasses and using navigation tools, visitors can move virtually through the building or the city after an earthquake disaster and take action as in a real-life situation. Space 4: Information and resources library. Visitors are offered the opportunity to learn more about earthquake protection through a series of resources, some developed especially for Seismopolis (3 books, 2 CDs, a website and an interactive table game). Space 5: De-briefing area. Visitors may take part in a pedagogical and psychological evaluation at the end of their visit and be offered support if needed. For the evaluation of the Seismopolis Centre, a pilot application of the
Borcherdt, R.D.; Glassmoyer, Gary; Cranswick, Edward
1989-01-01
The earthquakes of December 7, 1988, near Spitak, Armenia SSR, serve as another grim reminder of the serious hazard that earthquakes pose throughout the world. We extend our heartfelt sympathies to the families of the earthquake victims and intend that our cooperative scientific endeavours will help reduce losses in future earthquakes. Only through a better understanding of earthquake hazards can earthquake losses be reduced for all peoples in seismically active regions of the world. The tragic consequences of these earthquakes remind scientists and public officials alike of their urgent responsibilities to understand and mitigate the effects of earthquakes. On behalf of the U.S. Geological Survey, I would like to express appreciation to our Soviet colleagues for their kind invitation to participate in joint scientific and engineering studies. Without their cooperation and generous assistance, the conduct of these studies would not have been possible. This report provides seismologic and geologic data collected during the period December 21, 1988, through February 2, 1989. These data are presented in their entirety to expedite analysis of the data set for inferences regarding hazard mitigation actions, applicable not only in Armenia but in other regions of the world exposed to high seismic risk.
46 CFR 113.35-15 - Mechanical engine order telegraph systems; application.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 46 Shipping 4 2012-10-01 2012-10-01 false Mechanical engine order telegraph systems; application...) ELECTRICAL ENGINEERING COMMUNICATION AND ALARM SYSTEMS AND EQUIPMENT Engine Order Telegraph Systems § 113.35-15 Mechanical engine order telegraph systems; application. If a mechanical engine order telegraph...
46 CFR 113.35-15 - Mechanical engine order telegraph systems; application.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 46 Shipping 4 2013-10-01 2013-10-01 false Mechanical engine order telegraph systems; application...) ELECTRICAL ENGINEERING COMMUNICATION AND ALARM SYSTEMS AND EQUIPMENT Engine Order Telegraph Systems § 113.35-15 Mechanical engine order telegraph systems; application. If a mechanical engine order telegraph...
46 CFR 113.35-15 - Mechanical engine order telegraph systems; application.
Code of Federal Regulations, 2014 CFR
2014-10-01
... 46 Shipping 4 2014-10-01 2014-10-01 false Mechanical engine order telegraph systems; application...) ELECTRICAL ENGINEERING COMMUNICATION AND ALARM SYSTEMS AND EQUIPMENT Engine Order Telegraph Systems § 113.35-15 Mechanical engine order telegraph systems; application. If a mechanical engine order telegraph...
46 CFR 113.35-15 - Mechanical engine order telegraph systems; application.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 46 Shipping 4 2011-10-01 2011-10-01 false Mechanical engine order telegraph systems; application...) ELECTRICAL ENGINEERING COMMUNICATION AND ALARM SYSTEMS AND EQUIPMENT Engine Order Telegraph Systems § 113.35-15 Mechanical engine order telegraph systems; application. If a mechanical engine order telegraph...
ERIC Educational Resources Information Center
Bautista, Nazan Uludag; Peters, Kari Nichole
2010-01-01
Can students build a house that is cost effective and strong enough to survive strong winds, heavy rains, and earthquakes? First graders in Ms. Peter's classroom worked like engineers to answer this question. They participated in a design challenge that required them to plan like engineers and build strong and cost-effective houses that would fit…
Supercomputing meets seismology in earthquake exhibit
Blackwell, Matt; Rodger, Arthur; Kennedy, Tom
2018-02-14
When the California Academy of Sciences created the "Earthquake: Evidence of a Restless Planet" exhibit, they called on Lawrence Livermore to help combine seismic research with the latest data-driven visualization techniques. The outcome is a series of striking visualizations of earthquakes, tsunamis and tectonic plate evolution. Seismic-wave research is a core competency at Livermore. While most often associated with earthquakes, the research has many other applications of national interest, such as nuclear explosion monitoring, explosion forensics, energy exploration, and seismic acoustics. For the Academy effort, Livermore researchers simulated the San Andreas and Hayward fault events at high resolutions. Such calculations require significant computational resources. To simulate the 1906 earthquake, for instance, visualizing 125 seconds of ground motion required over 1 billion grid points, 10,000 time steps, and 7.5 hours of processor time on 2,048 cores of Livermore's Sierra machine.
Exploring Earthquakes in Real-Time
NASA Astrophysics Data System (ADS)
Bravo, T. K.; Kafka, A. L.; Coleman, B.; Taber, J. J.
2013-12-01
Earthquakes capture the attention of students and inspire them to explore the Earth. The ability to view and explore recordings of significant and newsworthy earthquakes in real time makes the subject even more compelling. To address this opportunity, the Incorporated Research Institutions for Seismology (IRIS), in collaboration with Moravian College, developed 'jAmaSeis', a cross-platform application that enables students to access real-time earthquake waveform data. Students can watch as the seismic waves are recorded on their computer, and can be among the first to analyze the data from an earthquake. jAmaSeis facilitates student-centered investigations of seismological concepts using either a low-cost educational seismograph, streamed data from other educational seismographs, or data from any seismic station that sends data to the IRIS Data Management System. After an earthquake, students can analyze the seismograms to determine characteristics such as the time of occurrence, the distance from the epicenter to the station, the magnitude, and the location. The software provides graphical clues to guide students in the analysis and assist their interpretations. Since jAmaSeis can simultaneously record up to three stations from anywhere on the planet, there are numerous opportunities for student-driven investigations. For example, students can explore differences between seismograms recorded at different distances from an earthquake and compare waveforms from different azimuthal directions. Students can simultaneously monitor seismicity at a tectonic plate boundary and in the middle of the plate regardless of their school's location. This can help students discover for themselves the ideas underlying seismic wave propagation, regional earthquake hazards, magnitude-frequency relationships, and the details of plate tectonics. The real-time nature of the data keeps the investigations dynamic and offers students countless opportunities to explore.
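The epicenter-distance determination mentioned above is classically done from the S-minus-P arrival-time difference. A minimal Python sketch of the idea (the wave speeds are assumed typical crustal averages for illustration, not values used by jAmaSeis itself):

```python
VP = 6.0  # assumed average crustal P-wave speed, km/s
VS = 3.5  # assumed average crustal S-wave speed, km/s

def epicentral_distance_km(s_minus_p_s):
    """Distance from S-P time: since tP = d/VP and tS = d/VS,
    tS - tP = d * (1/VS - 1/VP), which we solve for d."""
    return s_minus_p_s / (1.0 / VS - 1.0 / VP)
```

With these speeds, a 10 s S-P interval corresponds to a distance of 84 km; circles of such radii drawn around three stations intersect at the epicenter.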
Geodetic Finite-Fault-based Earthquake Early Warning Performance for Great Earthquakes Worldwide
NASA Astrophysics Data System (ADS)
Ruhl, C. J.; Melgar, D.; Grapenthin, R.; Allen, R. M.
2017-12-01
GNSS-based earthquake early warning (EEW) algorithms estimate fault finiteness and unsaturated moment magnitude for the largest, most damaging earthquakes. Because large events are infrequent, these algorithms are not regularly exercised and are insufficiently tested on the few available datasets. The Geodetic Alarm System (G-larmS) is a GNSS-based finite-fault algorithm developed as part of the ShakeAlert EEW system in the western US. Performance evaluations using synthetic earthquakes offshore Cascadia showed that G-larmS satisfactorily recovers magnitude and fault length, providing useful alerts 30-40 s after origin time and timely warnings of ground motion for onshore urban areas. An end-to-end test of the ShakeAlert system demonstrated the need for GNSS data to accurately estimate ground motions in real time. We replay real data from several subduction-zone earthquakes worldwide to demonstrate the value of GNSS-based EEW for the largest, most damaging events. We compare predicted peak ground acceleration (PGA) from first-alert solutions with values recorded in major urban areas. In addition, where applicable, we compare observed tsunami heights to those predicted from the G-larmS solutions. We show that finite-fault inversion based on GNSS data is essential to achieving the goals of EEW.
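The "unsaturated moment magnitude" that geodetic finite-fault algorithms recover follows from the seismic moment of the rupture. As a hedged illustration (the rigidity value is a common crustal assumption and the function is not part of G-larmS), the Hanks-Kanamori relation can be written:

```python
from math import log10

MU = 3.0e10  # assumed crustal shear rigidity, Pa

def moment_magnitude(fault_area_m2, avg_slip_m, mu=MU):
    """Mw from seismic moment M0 = mu * A * slip (Hanks-Kanamori scale):
    Mw = (2/3) * (log10(M0) - 9.1), with M0 in N*m."""
    m0 = mu * fault_area_m2 * avg_slip_m
    return (2.0 / 3.0) * (log10(m0) - 9.1)
```

A 500 km x 200 km rupture with 10 m of average slip gives Mw of roughly 8.9, far beyond the range where local-magnitude scales saturate, which is why geodetic slip estimates matter for the largest events.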
Creating a Global Building Inventory for Earthquake Loss Assessment and Risk Management
Jaiswal, Kishor; Wald, David J.
2008-01-01
Earthquakes have claimed approximately 8 million lives over the last 2,000 years (Dunbar, Lockridge and others, 1992), and fatality rates are likely to continue to rise with increased population and the urbanization of global settlements, especially in developing countries. More than 75% of earthquake-related human casualties are caused by the collapse of buildings or structures (Coburn and Spence, 2002). It is disheartening to note that large fractions of the world's population still reside in informal, poorly constructed and non-engineered dwellings with high susceptibility to collapse during earthquakes. Moreover, with increasing urbanization half of the world's population now lives in urban areas (United Nations, 2001), and half of these urban centers are located in earthquake-prone regions (Bilham, 2004). The poor performance of most building stocks during earthquakes remains a primary societal concern. Yet despite this dark history and bleaker future trends, there are no comprehensive global building inventories of sufficient quality and coverage to adequately address and characterize future earthquake losses. Such an inventory is vital both for earthquake loss mitigation and for earthquake disaster response. While the latter purpose motivates this work, we hope that the global building inventory database described herein will find widespread use in other mitigation efforts as well. For a real-time earthquake impact alert system, such as the U.S. Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) (Wald, Earle and others, 2006), we seek to rapidly evaluate potential casualties associated with earthquake ground shaking for any region of the world. The casualty estimation is based primarily on (1) rapid estimation of the ground shaking hazard, (2) aggregating the population exposure within different building types, and (3) estimating the casualties from the collapse of vulnerable buildings. Thus, the
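The three-step casualty estimate above reduces to a weighted sum over building types. The sketch below is an illustrative simplification: the building types, fractions, collapse probabilities and fatality rates are invented placeholders, not PAGER's calibrated values.

```python
def expected_fatalities(exposed_population, building_inventory):
    """Sum over building types of:
    exposed population x fraction in type x P(collapse | shaking) x fatality rate.

    building_inventory maps type -> (fraction, collapse_prob, fatality_rate).
    """
    total = 0.0
    for fraction, p_collapse, fatality_rate in building_inventory.values():
        total += exposed_population * fraction * p_collapse * fatality_rate
    return total

# Hypothetical inventory for a town of 100,000 people exposed to strong shaking.
inventory = {
    "adobe":                (0.30, 0.20, 0.10),
    "unreinforced_masonry": (0.50, 0.10, 0.05),
    "engineered_rc":        (0.20, 0.01, 0.02),
}
```

With these made-up numbers the three types contribute 600, 250 and 4 expected fatalities respectively, illustrating why knowing the inventory mix, not just the total exposed population, dominates the loss estimate.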
Practical Application of Sociology in Systems Engineering
NASA Technical Reports Server (NTRS)
Watson, Michael D.; Andrews, James G.; Eckley, Jeri Cassel; Culver, Michael L.
2017-01-01
Systems engineering involves both the integration of the system and the integration of the disciplines that develop and operate it. Integrating the disciplines is a sociological effort to bring together different groups, which often use different terminology, to achieve a common goal: the system. The systems engineer's focus is information flow through the organization, between the disciplines, to ensure the system is developed and operated with all relevant information informing system decisions. The practical application of sociology in systems engineering draws on various organizational development concepts, including the principle of planned renegotiation and principles for addressing information barriers created by organizational culture. Concepts such as specification of ignorance, consistent terminology, opportunity structures, role-sets, and the reclama (reconsideration) process are all important sociological approaches that help address the organization's social structure (culture). In bringing the disciplines together, the systems engineer must also be wary of social ambivalence, social anomie, social dysfunction, and insider-outsider behavior; unintended consequences can result when these social issues are present. They can arise when localized subcultures drift from the overarching organizational culture, or when the organizational culture itself prevents achievement of system goals. These sociological principles give the systems engineer key approaches for managing information flow through the organization as the disciplines are integrated and share their information, and they identify the key sociological barriers to that flow. This paper discusses the practical application of sociological principles to systems engineering.
Evaluation of seismic performance of reinforced concrete (RC) buildings under near-field earthquakes
NASA Astrophysics Data System (ADS)
Moniri, Hassan
2017-03-01
Near-field ground motions affect the seismic response of structures far more severely than far-field ground motions, because near-source forward-directivity ground motions contain long-period pulses; by comparison, the cumulative effects of far-fault records are minor. The damage and collapse of engineering structures observed in the earthquakes of recent decades show the potential for damage in existing structures under near-field ground motions. One important subject studied by earthquake engineers as part of a performance-based approach is the determination of demand and collapse capacity under near-field earthquakes. Different methods for evaluating seismic structural performance have been suggested alongside, and as part of, the development of performance-based earthquake engineering. This study investigated the effects of the distinctive characteristics of near-fault ground motions on the seismic response of reinforced concrete (RC) structures using the Incremental Dynamic Analysis (IDA) method. Because different ground motions result in different intensity-versus-response plots, the analysis is repeated under various ground motions in order to obtain meaningful statistical averages. The OpenSees software was used to conduct the nonlinear structural evaluations. Numerical modelling showed that near-source effects cause most of the seismic energy from the rupture to arrive in a single coherent long-period pulse of motion, with permanent ground displacements. Finally, the vulnerability of RC buildings can be evaluated against the effects of pulse-like near-fault ground motions.
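An IDA run repeats a nonlinear time-history analysis at increasing intensity levels and records an engineering demand parameter at each level. The schematic Python loop below shows only the bookkeeping: the solver callback stands in for an actual OpenSees analysis, and the 5% collapse drift limit is a made-up threshold, not a value from the study.

```python
COLLAPSE_DRIFT = 0.05  # assumed collapse criterion: 5% peak interstory drift

def ida_curve(run_analysis, intensity_levels):
    """Collect (intensity, peak drift) pairs until the collapse drift is exceeded.

    run_analysis: callable mapping a scaled intensity measure (e.g. Sa in g)
    to peak interstory drift, as a nonlinear solver such as OpenSees would return.
    """
    curve = []
    for im in intensity_levels:
        drift = run_analysis(im)
        curve.append((im, drift))
        if drift > COLLAPSE_DRIFT:
            break  # structure deemed collapsed; stop scaling this record
    return curve
```

Running this loop once per ground-motion record and overlaying the resulting curves is what produces the family of intensity-versus-response plots that the study averages statistically.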
Biomedical applications engineering tasks
NASA Technical Reports Server (NTRS)
Laenger, C. J., Sr.
1976-01-01
The engineering tasks performed in response to needs articulated by clinicians are described. Initial contacts were made with these clinician-technology requestors by the Southwest Research Institute NASA Biomedical Applications Team. The basic purpose of the program was to effectively transfer aerospace technology into functional hardware to solve real biomedical problems.
Chapter B. The Loma Prieta, California, Earthquake of October 17, 1989 - Highway Systems
Yashinsky, Mark
1998-01-01
This paper summarizes the impact of the Loma Prieta earthquake on highway systems. City streets, urban freeways, county roads, state routes, and the national highway system were all affected. There was damage to bridges, roads, tunnels, and other highway structures. The most serious damage occurred in the cities of San Francisco and Oakland, 60 miles from the fault rupture. The cost to repair and replace highways damaged by this earthquake was $2 billion. About half of this cost was to replace the Cypress Viaduct, a long, elevated double-deck expressway that had a devastating collapse which resulted in 42 deaths and 108 injuries. The earthquake also resulted in some positive changes for highway systems. Research on bridges and earthquakes began to be funded at a much higher level. Retrofit programs were started to upgrade the seismic performance of the nation's highways. The Loma Prieta earthquake changed earthquake policy and engineering practice for highway departments not only in California, but all over the world.
NASA Astrophysics Data System (ADS)
Kaneda, Y.; Takahashi, N.; Hori, T.; Kawaguchi, K.; Isouchi, C.; Fujisawa, K.
2017-12-01
Destructive natural disasters such as earthquakes and tsunamis occur frequently around the world: the 2004 Sumatra earthquake in Indonesia, the 2008 Wenchuan earthquake in China, the 2010 Chile earthquake and the 2011 Tohoku earthquake in Japan all caused very severe damage. To reduce and mitigate damage from destructive natural disasters, early detection and rapid, proper evacuation are indispensable, and both hardware and software developments and preparations are essential. In Japan, DONET, a real-time monitoring system on the ocean floor, has been developed and deployed around the Nankai trough seismogenic zone in southwestern Japan, so early detection of earthquakes and tsunamis around the Nankai trough seismogenic zone is expected from DONET. The integration of real-time data with advanced simulation research will help reduce damage; in a resilient society, however, resilience methods are also required after disasters: methods for restoration and revival are necessary after natural disasters. We propose a natural disaster mitigation science for early detection, evacuation and restoration against destructive natural disasters - that is, for a resilient society. Natural disaster mitigation science spans many research fields, including natural science, engineering, medical treatment, social science and the literature/arts. Natural science, engineering and medical treatment are fundamental research fields for disaster mitigation, while social sciences such as sociology, geography and psychology are very important for restoration after natural disasters. Finally, to realize and advance disaster mitigation science, cultivating human resources is indispensable. We have already carried out disaster mitigation science under the `new disaster mitigation research project on Mega
NASA Astrophysics Data System (ADS)
Özyaşar, M.; Özlüdemir, M. T.
2011-06-01
Global Navigation Satellite Systems (GNSS) are space-based positioning techniques widely used in geodetic applications. Geodetic networking accomplished by engineering surveys is one such task. Geodetic networks serve as the base of all kinds of geodetic implementations, from cadastral plans to the surveying processes carried out during engineering applications, and consist of control points positioned in a defined reference frame. Such positional information can be useful for other studies as well. One such field is geodynamics, which uses the changes in the positions of control stations within a network over a certain time period to understand the characteristics of tectonic movements. In Turkey, which is located in tectonically active zones and struck by major earthquakes quite frequently, the positional information obtained in engineering surveys can be very useful for earthquake-related studies. For this purpose, a GPS (Global Positioning System) network of 650 stations distributed over Istanbul (the Istanbul GPS Triangulation Network, IGNA), covering the northern part of the North Anatolian Fault Zone (NAFZ), was established in 1997 and measured in 1999. From 1998 to 2004 the IGNA network was extended to 1888 stations covering an area of about 6000 km2, the whole administrative area of Istanbul, and all 1888 stations were remeasured in 2005. The two campaigns share 452 common points, and between them two major earthquakes took place, on 17 August and 12 November 1999, with Richter-scale magnitudes of 7.4 and 7.2, respectively. Several studies conducted to estimate the horizontal and vertical displacements caused by these earthquakes on the NAFZ are discussed in this paper. In geodynamic projects carried out before the 1999 earthquakes, an annual average velocity of 2-2.5 cm was estimated for the stations along the NAFZ
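The displacement estimate between two campaigns reduces to differencing the adjusted coordinates of common points and dividing by the elapsed time. A minimal sketch (the coordinates in the usage note are invented for illustration, not IGNA values):

```python
from math import hypot

def horizontal_velocity_cm_per_yr(e1_m, n1_m, e2_m, n2_m, years):
    """Average horizontal velocity between two campaign positions.

    Inputs are local easting/northing coordinates in metres for the
    first and second campaigns; output is velocity in cm/yr.
    """
    displacement_cm = hypot(e2_m - e1_m, n2_m - n1_m) * 100.0
    return displacement_cm / years
```

For example, a station that moves 12 cm east and 9 cm north between campaigns six years apart has displaced 15 cm, i.e. 2.5 cm/yr, at the upper end of the 2-2.5 cm annual velocities cited for stations along the NAFZ.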
Compound cycle engine for helicopter application
NASA Technical Reports Server (NTRS)
Castor, Jere; Martin, John; Bradley, Curtiss
1987-01-01
The compound cycle engine (CCE) is a highly turbocharged, power-compounded, ultra-high-power-density, lightweight diesel engine. The turbomachinery is similar to a moderate-pressure-ratio, free-power-turbine gas turbine engine and the diesel core is high speed with a low compression ratio. This engine is considered a potential candidate for future military helicopter applications. Cycle thermodynamic specific fuel consumption (SFC) and engine weight analyses performed to establish general engine operating parameters and configurations are presented. An extensive performance and weight analysis based on a typical 2-hour helicopter (+30 minute reserve) mission determined the final conceptual engine design. With this mission, CCE performance was compared to that of a contemporary gas turbine engine. The CCE had a 31 percent lower fuel consumption and resulted in a 16 percent reduction in engine plus fuel and fuel tank weight. Design SFC of the CCE is 0.33 lb/hp-hr and installed wet weight is 0.43 lb/hp. The major technology development areas required for the CCE are identified and briefly discussed.
Compound cycle engine for helicopter application
NASA Technical Reports Server (NTRS)
Castor, Jere G.
1986-01-01
The Compound Cycle Engine (CCE) is a highly turbocharged, power-compounded, ultra-high-power-density, lightweight diesel engine. The turbomachinery is similar to a moderate-pressure-ratio, free-power-turbine engine and the diesel core is high speed and a low compression ratio. This engine is considered a potential candidate for future military light helicopter applications. This executive summary presents cycle thermodynamic specific fuel consumption (SFC) and engine weight analyses performed to establish general engine operating parameters and configuration. An extensive performance and weight analysis based on a typical two-hour helicopter (+30 minute reserve) mission determined the final conceptual engine design. With this mission, CCE performance was compared to that of a T-800 class gas turbine engine. The CCE had 31% lower fuel consumption and resulted in a 16% reduction in engine plus fuel and fuel tank weight. Design SFC of the CCE is 0.33 lb/hp-hr and installed wet weight is 0.43 lb/hp. The major technology development areas required for the CCE are identified and briefly discussed.
NASA Astrophysics Data System (ADS)
Yuan, X.; Wang, X.; Dou, A.; Ding, X.
2014-12-01
As the UAV is widely used in earthquake disaster prevention and mitigation, the efficiency of UAV image processing determines the effectiveness of its application to pre-earthquake disaster prevention, post-earthquake emergency rescue, and disaster assessment. Because of the bad weather conditions after a destructive earthquake, wide-field cameras capture images with a serious vignetting phenomenon, which can significantly affect the speed and efficiency of image mosaicking, especially the extraction of pre-earthquake building and geological structure information, as well as the accuracy of post-earthquake quantitative damage extraction. In this paper, an improved radial gradient correction method (IRGCM) was developed to reduce the influence of the random distribution of land surface objects on the images, based on the radial gradient correction method (RGCM; Y. Zheng, 2008; 2013). First, a mean-value image is obtained by averaging a series of UAV images. It is used for calibration, instead of single images, to obtain a comprehensive vignetting function using RGCM. Each UAV image is then corrected by the comprehensive vignetting function. A case study was done to correct a UAV image sequence obtained in Lushan County after the Ms 7.0 Lushan, Sichuan, China earthquake of April 20, 2013. The results show that the comprehensive vignetting function generated by IRGCM is more robust and accurate in expressing the specific optical response of a camera in a particular setting. It is thus particularly useful for correcting a mass of UAV images with non-uniform illumination. The correction process is also simplified and faster than conventional methods. After correction, the images have better radial homogeneity and clearer details, which, to a certain extent, reduces the difficulty of image mosaicking and provides a better basis for further analysis and damage information extraction. Further tests also show that better results were obtained by taking
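The mean-image step of the correction can be illustrated with a minimal flat-field-style sketch: averaging many frames suppresses scene content and leaves the shared radial fall-off, which is then divided out of each frame. This is a simplified stand-in for RGCM/IRGCM, and the synthetic vignette, frame values, and function name are invented for illustration.

```python
import numpy as np

def vignetting_correct(frames):
    """Sketch of mean-image vignetting correction: estimate a comprehensive
    vignetting surface from the average of many frames, then divide it out.
    `frames` is a sequence of grayscale images with identical shape."""
    stack = np.asarray(frames, dtype=float)
    mean_img = stack.mean(axis=0)              # averaging suppresses scene content
    vignette = mean_img / mean_img.max()       # normalized radial fall-off surface
    corrected = stack / np.maximum(vignette, 1e-6)
    return np.clip(corrected, 0.0, 255.0)

# toy example: identical synthetic frames with a brighter centre and darker
# corners (real sequences vary scene content, which is what averaging exploits)
h = w = 64
yy, xx = np.mgrid[0:h, 0:w]
r2 = ((yy - h / 2) ** 2 + (xx - w / 2) ** 2) / (h / 2) ** 2
fall_off = 1.0 - 0.25 * r2                     # synthetic radial vignette
frames = [100.0 * fall_off for _ in range(8)]
out = vignetting_correct(frames)
print(round(float(out[0, h // 2, w // 2]), 1), round(float(out[0, 0, 0]), 1))
```

After correction the centre and corner pixels recover the same flat brightness, which is the radial homogeneity the abstract refers to.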
NASA Astrophysics Data System (ADS)
Brocher, T. M.; Garcia, S.; Aagaard, B. T.; Boatwright, J. J.; Dawson, T.; Hellweg, M.; Knudsen, K. L.; Perkins, J.; Schwartz, D. P.; Stoffer, P. W.; Zoback, M.
2008-12-01
Last October 21st marked the 140th anniversary of the M6.8 1868 Hayward Earthquake, the last damaging earthquake on the southern Hayward Fault. This anniversary was used to help publicize the seismic hazards associated with the fault because: (1) the past five such earthquakes on the Hayward Fault occurred about 140 years apart on average, and (2) the Hayward-Rodgers Creek Fault system is the most likely (with a 31 percent probability) fault in the Bay Area to produce a M6.7 or greater earthquake in the next 30 years. To promote earthquake awareness and preparedness, over 140 public and private agencies and companies and many individuals joined the public-private nonprofit 1868 Hayward Earthquake Alliance (1868alliance.org). The Alliance sponsored many activities, including a public commemoration at Mission San Jose in Fremont, which survived the 1868 earthquake. This event was followed by an earthquake drill at Bay Area schools involving more than 70,000 students. The anniversary prompted the Silver Sentinel, an earthquake response exercise based on the scenario of an earthquake on the Hayward Fault conducted by Bay Area County Offices of Emergency Services; 60 other public and private agencies also participated in this exercise. The California Seismic Safety Commission and KPIX (CBS affiliate) produced professional videos designed for school classrooms promoting Drop, Cover, and Hold On. Starting in October 2007, the Alliance and the U.S. Geological Survey held a sequence of press conferences to announce the release of new research on the Hayward Fault as well as new loss estimates for a Hayward Fault earthquake. These included: (1) a ShakeMap for the 1868 Hayward earthquake, (2) a report by the U.S. Bureau of Labor Statistics forecasting the number of employees, employers, and wages predicted to be within areas most strongly shaken by a Hayward Fault earthquake, (3) new estimates of the losses associated with a Hayward Fault earthquake, (4) new ground motion
NASA Applications and Lessons Learned in Reliability Engineering
NASA Technical Reports Server (NTRS)
Safie, Fayssal M.; Fuller, Raymond P.
2011-01-01
Since the Shuttle Challenger accident in 1986, communities across NASA have been developing and extensively using quantitative reliability and risk assessment methods in their decision making process. This paper discusses several reliability engineering applications that NASA has used over the years to support the design, development, and operation of critical space flight hardware. Specifically, the paper discusses reliability engineering applications used by NASA in areas such as risk management, inspection policies, component upgrades, reliability growth, integrated failure analysis, and physics-based probabilistic engineering analysis. In each of these areas, the paper provides a brief discussion of a case study to demonstrate the value added and the criticality of reliability engineering in supporting NASA project and program decisions to fly safely. Examples of the case studies discussed are reliability-based life-limit extension of Space Shuttle Main Engine (SSME) hardware, reliability-based inspection policies for the Auxiliary Power Unit (APU) turbine disc, probabilistic structural engineering analysis for reliability prediction of the SSME alternate turbo-pump development, the impact of ET foam reliability on Space Shuttle system risk, and reliability-based Space Shuttle upgrades for safety. Special attention is given to the physics-based probabilistic engineering analysis applications and their critical role in evaluating the reliability of NASA development hardware, including their potential use in a research and technology development environment.
Rapid earthquake hazard and loss assessment for Euro-Mediterranean region
NASA Astrophysics Data System (ADS)
Erdik, Mustafa; Sesetyan, Karin; Demircioglu, Mine; Hancilar, Ufuk; Zulfikar, Can; Cakti, Eser; Kamer, Yaver; Yenidogan, Cem; Tuzun, Cuneyt; Cagnan, Zehra; Harmandar, Ebru
2010-10-01
The almost-real-time estimation of ground shaking and losses after a major earthquake in the Euro-Mediterranean region was performed in the framework of the Joint Research Activity 3 (JRA-3) component of the EU FP6 project entitled "Network of Research Infrastructures for European Seismology, NERIES". The methodology consists of finding the most likely location of the earthquake source by estimating the fault rupture parameters on the basis of rapid inversion of data from on-line regional broadband stations. It also includes an estimation of the spatial distribution of selected site-specific ground motion parameters at engineering bedrock through region-specific ground motion prediction equations (GMPEs) or physical simulation of ground motion. By using the Earthquake Loss Estimation Routine (ELER) software, the multi-level methodology developed for real-time estimation of losses is capable of incorporating regional variability and sources of uncertainty stemming from GMPEs, fault finiteness, site modifications, the inventory of physical and social elements subjected to earthquake hazard, and the associated vulnerability relationships.
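The GMPE step mentioned above can be sketched with a toy attenuation relation. The functional form (magnitude scaling plus geometric spreading with a fictitious depth term) is the common one, but the coefficients a, b, c, h below are invented for illustration and are not from ELER or any published model.

```python
import math

def gmpe_ln_pga(mag, rjb_km, a=-1.0, b=0.9, c=1.3, h=6.0):
    """Illustrative ground-motion prediction equation of the common form
    ln(PGA) = a + b*M - c*ln(sqrt(R^2 + h^2)); all coefficients are
    placeholders, not a calibrated model."""
    r = math.sqrt(rjb_km ** 2 + h ** 2)   # distance with fictitious depth h
    return a + b * mag - c * math.log(r)

# predicted shaking decays with distance and grows with magnitude
near = math.exp(gmpe_ln_pga(7.0, 10.0))
far = math.exp(gmpe_ln_pga(7.0, 100.0))
```

A real system would evaluate a regional GMPE of this shape on a grid around the estimated rupture to produce the shaking map fed into the loss module.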
Applications of Computer Graphics in Engineering
NASA Technical Reports Server (NTRS)
1975-01-01
Various applications of interactive computer graphics to the following areas of science and engineering were described: design and analysis of structures, configuration geometry, animation, flutter analysis, design and manufacturing, aircraft design and integration, wind tunnel data analysis, architecture and construction, flight simulation, hydrodynamics, curve and surface fitting, gas turbine engine design, analysis, and manufacturing, packaging of printed circuit boards, spacecraft design.
Update earthquake risk assessment in Cairo, Egypt
NASA Astrophysics Data System (ADS)
Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan
2017-07-01
The Cairo earthquake (12 October 1992; mb = 5.8) remains, even after 25 years, one of the most painful events etched in Egyptians' memory. This is not due to the strength of the earthquake but to the accompanying losses and damage (561 dead; 10,000 injured; 3000 families left homeless). Nowadays, the most frequent and important question is "what if this earthquake were repeated today?" In this study, we simulate the ground-motion shaking of an earthquake of the same size as that of 12 October 1992 and the consequent socio-economic impacts in terms of losses and damage. Seismic hazard, earthquake catalogs, soil types, demographics, and building inventories were integrated into HAZUS-MH to produce a sound earthquake risk assessment for Cairo, including economic and social losses. Overall, the risk assessment clearly indicates that the losses and damage could double or triple in Cairo compared to the 1992 earthquake. The risk profile reveals that five districts (Al-Sahel, El Basateen, Dar El-Salam, Gharb, and Madinat Nasr sharq) lie at high seismic risk, while three districts (Manshiyat Naser, El-Waily, and Wassat (center)) are at a low seismic risk level. Moreover, the building damage estimates show that Gharb is the most vulnerable district. The analysis shows that the Cairo urban area faces high risk; deteriorating buildings and infrastructure make the city particularly vulnerable to earthquakes. For instance, more than 90% of the estimated building damage is concentrated within the most densely populated districts (El Basateen, Dar El-Salam, Gharb, and Madinat Nasr Gharb), and about 75% of casualties are in the same districts. An earthquake risk assessment for Cairo thus represents a crucial application of the HAZUS earthquake loss estimation model for risk management. Finally, for mitigation, risk reduction, and to improve the seismic performance of structures and assure life safety
Tissue engineering for clinical applications.
Bhatia, Sujata K
2010-12-01
Tissue engineering is increasingly being recognized as a beneficial means for lessening the global disease burden. One strategy of tissue engineering is to replace lost tissues or organs with polymeric scaffolds that contain specialized populations of living cells, with the goal of regenerating tissues to restore normal function. Typical constructs for tissue engineering employ biocompatible and degradable polymers, along with organ-specific and tissue-specific cells. Once implanted, the construct guides the growth and development of new tissues; the polymer scaffold degrades away to be replaced by healthy functioning tissue. The ideal biomaterial for tissue engineering not only defends against disease and supports weakened tissues or organs, it also provides the elements required for healing and repair, stimulates the body's intrinsic immunological and regenerative capacities, and seamlessly interacts with the living body. Tissue engineering has been investigated for virtually every organ system in the human body. This review describes the potential of tissue engineering to alleviate disease, as well as the latest advances in tissue regeneration. The discussion focuses on three specific clinical applications of tissue engineering: cardiac tissue regeneration for treatment of heart failure; nerve regeneration for treatment of stroke; and lung regeneration for treatment of chronic obstructive pulmonary disease. Copyright © 2010 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Çaktı, Eser; Ercan, Tülay; Dar, Emrullah
2017-04-01
Istanbul's vast historical and cultural heritage is under constant threat of earthquakes. Historical records report repeated damages to the city's landmark buildings. Our efforts towards earthquake protection of several buildings in Istanbul involve earthquake monitoring via structural health monitoring systems, linear and non-linear structural modelling and analysis in search of past and future earthquake performance, shake-table testing of scaled models and non-destructive testing. More recently we have been using laser technology in monitoring structural deformations and damage in five monumental buildings which are Hagia Sophia Museum and Fatih, Sultanahmet, Süleymaniye and Mihrimah Sultan Mosques. This presentation is about these efforts with special emphasis on the use of laser scanning in monitoring of edifices.
Computing Earthquake Probabilities on Global Scales
NASA Astrophysics Data System (ADS)
Holliday, James R.; Graves, William R.; Rundle, John B.; Turcotte, Donald L.
2016-03-01
Large events in systems such as earthquakes, typhoons, market crashes, electricity grid blackouts, floods, droughts, wars and conflicts, and landslides can be unexpected and devastating. Events in many of these systems display frequency-size statistics that are power laws. Previously, we presented a new method for calculating probabilities for large events in systems such as these. This method counts the number of small events since the last large event and then converts this count into a probability by using a Weibull probability law. We applied this method to the calculation of large earthquake probabilities in California-Nevada, USA. In that study, we considered a fixed geographic region and assumed that all earthquakes within that region, large magnitudes as well as small, were perfectly correlated. In the present article, we extend this model to systems in which the events have a finite correlation length. We modify our previous results by employing the correlation function for near mean field systems having long-range interactions, an example of which is earthquakes with elastic interactions. We then construct an application of the method and show examples of computed earthquake probabilities.
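The counting step described above can be sketched directly: the number of small events since the last large event is mapped to a conditional probability through a Weibull law. The shape and scale parameters below are illustrative placeholders, not values fitted in the study.

```python
import math

def prob_large_event(n_small, beta, tau):
    """Sketch of the counting method: convert the count n of small events
    observed since the last large event into a probability of the next
    large event via a Weibull law, P(n) = 1 - exp(-(n / tau)**beta).
    beta (shape) and tau (scale, in event counts) are illustrative."""
    return 1.0 - math.exp(-((n_small / tau) ** beta))

# the probability grows monotonically as small events accumulate
p_few = prob_large_event(50, beta=1.5, tau=500.0)
p_many = prob_large_event(450, beta=1.5, tau=500.0)
```

In practice beta and tau would be fitted from the regional catalog's frequency-size statistics before the conversion is applied.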
Earthquake Damage Assessment Using Very High Resolution Satelliteimagery
NASA Astrophysics Data System (ADS)
Chiroiu, L.; André, G.; Bahoken, F.; Guillande, R.
Various studies using satellite imagery have been carried out in recent years to assess natural hazard damage, most of them analyzing floods, hurricanes, or landslides. For earthquakes, the medium or small spatial resolution data available in the recent past did not allow reliable identification of damage, because the elements at risk (e.g., buildings or other structures) were too small compared with the pixel size. Recent progress in remote sensing, in terms of spatial resolution and data processing, makes reliable detection of damage to the elements at risk possible. Remote sensing techniques applied to IKONOS (1 meter resolution) and IRS (5 meter resolution) imagery were used to evaluate seismic vulnerability and post-earthquake damage. A fast estimation of losses was performed using a multidisciplinary approach based on earthquake engineering and geospatial analysis. The results, integrated into a GIS database, could be transferred via satellite networks to the rescue teams deployed in the affected zone, to better coordinate emergency operations. The methodology was applied to the cities of Bhuj and Anjar after the 2001 Gujarat (India) earthquake.
The Loma Prieta, California, Earthquake of October 17, 1989: Earthquake Occurrence
Coordinated by Bakun, William H.; Prescott, William H.
1993-01-01
Professional Paper 1550 seeks to understand the M6.9 Loma Prieta earthquake itself. It examines how the fault that generated the earthquake ruptured, searches for and evaluates precursors that may have indicated an earthquake was coming, reviews forecasts of the earthquake, and describes the geology of the earthquake area and the crustal forces that affect this geology. Some significant findings were:
* Slip during the earthquake occurred on 35 km of fault at depths ranging from 7 to 20 km. Maximum slip was approximately 2.3 m. The earthquake may not have released all of the strain stored in rocks next to the fault, suggesting that the potential for another damaging earthquake in the Santa Cruz Mountains in the near future may still exist.
* The earthquake involved a large amount of uplift on a dipping fault plane. Pre-earthquake conventional wisdom was that large earthquakes in the Bay area occurred as horizontal displacements on predominantly vertical faults.
* The fault segment that ruptured approximately coincided with a fault segment identified in 1988 as having a 30% probability of generating a M7 earthquake in the next 30 years. This was one of more than 20 relevant earthquake forecasts made in the 83 years before the earthquake.
* Calculations show that the Loma Prieta earthquake changed stresses on nearby faults in the Bay area. In particular, the earthquake reduced stresses on the Hayward Fault, which decreased the frequency of small earthquakes on it.
* Geological and geophysical mapping indicates that, although the San Andreas Fault can be mapped as a through-going fault in the epicentral region, the southwest-dipping Loma Prieta rupture surface is a separate fault strand and one of several along this part of the San Andreas that may be capable of generating earthquakes.
Using Smartphones to Detect Earthquakes
NASA Astrophysics Data System (ADS)
Kong, Q.; Allen, R. M.
2012-12-01
We are using the accelerometers in smartphones to record earthquakes. In the future, these smartphones may serve as a supplementary network to the current traditional networks for scientific research and real-time applications. Given the potential number of smartphones and the small separation of sensors, this new type of seismic dataset has significant potential, provided that the signal can be separated from the noise. We developed an application for Android phones to record acceleration in real time. These records can be saved on the phone or transmitted back to a server in real time. The accelerometers in the phones were evaluated by comparing their performance with a high-quality accelerometer on controlled shake tables in a variety of tests. The results show that the accelerometer in a smartphone can reproduce the characteristics of the shaking very well, even when the phone is left unattached on the shake table. The nature of these datasets also differs from traditional networks, because smartphones move around with their owners; therefore, we must distinguish earthquake signals from those of daily use. In addition to the shake table tests that accumulated earthquake records, we also recorded different human activities such as running, walking, and driving. An artificial neural network based approach was developed to distinguish these different records; it achieves a 99.7% success rate in distinguishing earthquakes from the other typical human activities in our database. We are now ready to develop the basic infrastructure for a smartphone seismic network.
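The discrimination step can be sketched with toy features and a single logistic unit standing in for the paper's neural network. The synthetic "earthquake" and "walking" records, the two features, and all training parameters below are invented for illustration; they only demonstrate that simple waveform features separate sustained shaking from impulsive daily motion.

```python
import numpy as np

rng = np.random.default_rng(0)

def features(rec):
    """Two toy features per 1-s record: interquartile range (amplitude)
    and zero-crossing rate (frequency content)."""
    iqr = np.subtract(*np.percentile(rec, [75, 25]))
    zcr = np.mean(np.abs(np.diff(np.sign(rec))) > 0)
    return np.array([iqr, zcr])

def synth(kind, n=100, fs=50):
    t = np.arange(fs) / fs
    out = []
    for _ in range(n):
        if kind == "quake":      # sustained low-frequency shaking
            rec = np.sin(2 * np.pi * 2 * t + rng.uniform(0, 6)) \
                  + 0.1 * rng.standard_normal(fs)
        else:                    # noise with step-like spikes (walking)
            rec = 0.1 * rng.standard_normal(fs)
            rec[::10] += 2.0
        out.append(features(rec))
    return np.array(out)

X = np.vstack([synth("quake"), synth("walk")])
y = np.array([1] * 100 + [0] * 100)

# stand-in for the neural network: one logistic unit, gradient descent
w, b = np.zeros(2), 0.0
for _ in range(2000):
    p = 1 / (1 + np.exp(-(X @ w + b)))
    g = p - y
    w -= 0.1 * X.T @ g / len(y)
    b -= 0.1 * g.mean()
p = 1 / (1 + np.exp(-(X @ w + b)))
acc = np.mean((p > 0.5) == y)
```

On these cleanly separable toy features the classifier approaches perfect accuracy; the paper's 99.7% figure reflects the same idea applied to real, messier records with a proper network.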
PAGER--Rapid assessment of an earthquake's impact
Wald, D.J.; Jaiswal, K.; Marano, K.D.; Bausch, D.; Hearne, M.
2010-01-01
PAGER (Prompt Assessment of Global Earthquakes for Response) is an automated system that produces content concerning the impact of significant earthquakes around the world, informing emergency responders, government and aid agencies, and the media of the scope of the potential disaster. PAGER rapidly assesses earthquake impacts by comparing the population exposed to each level of shaking intensity with models of economic and fatality losses based on past earthquakes in each country or region of the world. Earthquake alerts--which were formerly sent based only on event magnitude and location, or population exposure to shaking--now will also be generated based on the estimated range of fatalities and economic losses.
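The exposure-times-rate aggregation PAGER performs can be sketched as follows. The exposure counts and fatality rates below are made-up stand-ins (PAGER's real rates are empirical, country-specific models), while the 1/100/1000-fatality alert thresholds follow PAGER's green/yellow/orange/red convention.

```python
# Sketch of PAGER-style loss aggregation: population exposed at each
# shaking-intensity (MMI) level times a fatality rate for that level.
# All exposure counts and rates are illustrative.
exposure = {6: 2_000_000, 7: 500_000, 8: 80_000, 9: 5_000}   # people per MMI
fatality_rate = {6: 1e-6, 7: 5e-5, 8: 1e-3, 9: 1e-2}         # deaths per exposed

est_fatalities = sum(exposure[mmi] * fatality_rate[mmi] for mmi in exposure)
alert = ("red" if est_fatalities >= 1000 else
         "orange" if est_fatalities >= 100 else
         "yellow" if est_fatalities >= 1 else "green")
```

Summing over intensity levels rather than using magnitude alone is exactly why alerts can now reflect estimated fatality and economic-loss ranges.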
Seismic design and engineering research at the U.S. Geological Survey
1988-01-01
The Engineering Seismology Element of the USGS Earthquake Hazards Reduction Program is responsible for the coordination and operation of the National Strong Motion Network to collect, process, and disseminate earthquake strong-motion data; and, the development of improved methodologies to estimate and predict earthquake ground motion. Instrumental observations of strong ground shaking induced by damaging earthquakes and the corresponding response of man-made structures provide the basis for estimating the severity of shaking from future earthquakes, for earthquake-resistant design, and for understanding the physics of seismologic failure in the Earth's crust.
Extreme Magnitude Earthquakes and their Economical Consequences
NASA Astrophysics Data System (ADS)
Chavez, M.; Cabrera, E.; Ashworth, M.; Perea, N.; Emerson, D.; Salazar, A.; Moulinec, C.
2011-12-01
The frequency of occurrence of extreme magnitude earthquakes varies from tens to thousands of years, depending on the seismotectonic region of the world considered. However, the human and economic losses when their hypocenters are located in the neighborhood of heavily populated and/or industrialized regions can be very large, as recently observed for the 1985 Mw 8.01 Michoacan, Mexico and the 2011 Mw 9 Tohoku, Japan earthquakes. Here, a methodology is proposed to estimate the probabilities of exceedance of the intensities of extreme magnitude earthquakes (PEI) and of their direct economic consequences (PEDEC). The PEI are obtained by using supercomputing facilities to generate samples of the 3D propagation of plausible extreme earthquake scenarios, and by enlarging those samples by Monte Carlo simulation. The PEDEC are computed by combining appropriate vulnerability functions with the scenario intensity samples and Monte Carlo simulation. An example of the application of the methodology to the potential occurrence of extreme Mw 8.5 subduction earthquakes affecting Mexico City is presented.
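The Monte Carlo exceedance step can be sketched in a few lines: given a sample of scenario intensities, the PEI curve is the empirical fraction of samples exceeding each threshold. Here the intensity sample is synthetic (a lognormal stand-in for the 3D-propagation results), and all values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for simulated scenario intensities: peak ground
# accelerations (in g) with a median of 0.2 g.
pga_samples = rng.lognormal(mean=np.log(0.2), sigma=0.6, size=100_000)

# Empirical probability of exceedance at a set of intensity thresholds (PEI)
thresholds = np.array([0.1, 0.2, 0.4, 0.8])
pei = (pga_samples[None, :] >= thresholds[:, None]).mean(axis=1)
```

The PEDEC would then be obtained the same way after mapping each sampled intensity through a vulnerability function into a loss value.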
NASA Astrophysics Data System (ADS)
Nozu, A.
2013-12-01
A new simplified source model is proposed to explain strong ground motions from a mega-thrust earthquake. The proposed model is simpler, and involves fewer model parameters, than the conventional characterized source model, which is itself a simplified expression of the actual earthquake source. In the proposed model, the spatio-temporal distribution of slip within a subevent is not modeled. Instead, the source spectrum associated with the rupture of a subevent is modeled, and it is assumed to follow the omega-square model. By multiplying the source spectrum with the path effect and the site amplification factor, the Fourier amplitude at a target site can be obtained. Then, combining it with the Fourier phase characteristics of a smaller event, the time history of strong ground motions from the subevent can be calculated. Finally, by summing up the contributions from the subevents, strong ground motions from the entire rupture can be obtained. The source model consists of six parameters for each subevent, namely, the longitude, latitude, depth, rupture time, seismic moment, and corner frequency of the subevent. The finite size of the subevent can be taken into account in the model, because the corner frequency of the subevent, which is inversely proportional to the length of the subevent, is included. Thus, the proposed model is referred to as the 'pseudo point-source model'. To examine the applicability of the model, a pseudo point-source model was developed for the 2011 Tohoku earthquake. The model comprises nine subevents, located off Miyagi Prefecture through Ibaraki Prefecture. The velocity waveforms (0.2-1 Hz), the velocity envelopes (0.2-10 Hz), and the Fourier spectra (0.2-10 Hz) at 15 sites calculated with the pseudo point-source model agree well with the observed ones, indicating the applicability of the model. The results were then compared with the results of a super-asperity (SPGA) model of the same earthquake (Nozu, 2012, AGU), which can be considered as an
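The subevent spectrum assumed above can be written down directly: the omega-square (Brune-type) form is flat at the seismic moment below the corner frequency and falls off as f^-2 above it. The form is standard, but the constant factors are omitted here and the sample values of M0, fc, and the frequency band are illustrative.

```python
import numpy as np

def omega_square_spectrum(f, m0, fc):
    """Omega-square source displacement spectrum for one subevent of the
    pseudo point-source model: ~m0 below the corner frequency fc, ~f^-2
    above it (radiation-pattern and medium constants omitted)."""
    return m0 / (1.0 + (f / fc) ** 2)

f = np.logspace(-2, 1, 400)                       # 0.01-10 Hz
spec = omega_square_spectrum(f, m0=1.0e20, fc=0.1)

# The Fourier amplitude at a target site would then be this source spectrum
# multiplied by a path attenuation term and a site amplification factor,
# and subevent contributions would be summed with their rupture-time delays.
```

Because fc enters explicitly and scales inversely with subevent length, this single extra parameter is how the model carries finite subevent size without modeling slip distribution.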
Evaluation of heat engine for hybrid vehicle application
NASA Technical Reports Server (NTRS)
Schneider, H. W.
1984-01-01
The status of ongoing heat-engine developments, including spark-ignition, compression-ignition, internal-combustion, and external-combustion engines is presented. The potential of engine concepts under consideration for hybrid vehicle use is evaluated, using self-imposed criteria for selection. The deficiencies of the engines currently being evaluated in hybrid vehicles are discussed. Focus is on recent research with two-stroke, rotary, and free-piston engines. It is concluded that these engine concepts have the most promising potential for future application in hybrid vehicles. Recommendations are made for analysis and experimentation to evaluate stop-start and transient emission behavior of recommended engine concepts.
NASA Astrophysics Data System (ADS)
Schaefer, A. M.; Daniell, J. E.; Wenzel, F.
2014-12-01
Earthquake clustering is an increasingly important part of general earthquake research, especially in terms of seismic hazard assessment and earthquake forecasting and prediction approaches. The distinct identification and definition of foreshocks, aftershocks, mainshocks, and secondary mainshocks is handled using a point-based spatio-temporal clustering algorithm originating from classic machine learning. This can further be applied for declustering, to separate background seismicity from triggered seismicity. The results are interpreted and processed to assemble 3D (x, y, t) earthquake clustering maps based on smoothed seismicity records in space and time. In addition, multi-dimensional Gaussian functions are used to capture clustering parameters for spatial distribution and dominant orientations. Clusters are further processed using methodologies originating from geostatistics, which have mostly been applied and developed in mining projects during the last decades. A 2.5D variogram analysis is applied to identify spatio-temporal homogeneity in terms of earthquake density and energy output. The results are interpolated using Kriging to provide an accurate mapping solution for clustering features. As a case study, seismic data of New Zealand and the United States are used, covering events since the 1950s, from which an earthquake cluster catalogue is assembled for most of the major events, including a detailed analysis of the Landers and Christchurch sequences.
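A point-based spatio-temporal clustering algorithm of the kind described (a DBSCAN-style density scan) can be sketched minimally as follows. This is not the authors' implementation; the epsilon radius, minimum-point count, coordinate scaling, and toy catalogue are all invented for illustration.

```python
import numpy as np

def st_dbscan(xyt, eps, min_pts):
    """Minimal spatio-temporal DBSCAN sketch: events are points (x, y, t),
    pre-scaled so one Euclidean eps applies to all three coordinates.
    Label -1 marks background seismicity; labels >= 0 mark clusters
    (triggered sequences), which is the basis for declustering."""
    n = len(xyt)
    dist = np.linalg.norm(xyt[:, None, :] - xyt[None, :, :], axis=2)
    neighbors = [np.where(row <= eps)[0] for row in dist]
    labels = np.full(n, -1)
    cluster = 0
    for i in range(n):
        if labels[i] != -1 or len(neighbors[i]) < min_pts:
            continue                       # already assigned, or not a core point
        stack = [i]
        while stack:                       # grow the cluster from core points
            j = stack.pop()
            if labels[j] == -1:
                labels[j] = cluster
                if len(neighbors[j]) >= min_pts:
                    stack.extend(k for k in neighbors[j] if labels[k] == -1)
        cluster += 1
    return labels

# toy catalogue: one tight aftershock sequence plus scattered background events
rng = np.random.default_rng(2)
sequence = rng.normal([0.0, 0.0, 0.0], 0.05, size=(30, 3))
background = rng.uniform(-5.0, 5.0, size=(20, 3))
labels = st_dbscan(np.vstack([sequence, background]), eps=0.3, min_pts=5)
```

Removing the labeled clusters from the catalogue leaves the background seismicity, which is the declustering step the abstract refers to.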
Urban Earthquake Shaking and Loss Assessment
NASA Astrophysics Data System (ADS)
Hancilar, U.; Tuzun, C.; Yenidogan, C.; Zulfikar, C.; Durukal, E.; Erdik, M.
2009-04-01
of accelerometric and other macroseismic data (similar to the USGS ShakeMap System). The building inventory data for the Level 2 analysis will consist of grid (geo-cell) based urban building and demographic inventories. For building grouping, the European building typology developed within the EU-FP5 RISK-EU project is used. The building vulnerability/fragility relationships to be used can be user-selected from a list of applicable relationships developed on the basis of a comprehensive study. Both empirical and analytical relationships (based on the Coefficient Method, the Equivalent Linearization Method, and the Reduction Factor Method of analysis) can be employed. Casualties in the Level 2 analysis are estimated based on the number of buildings in different damage states and the casualty rates for each building type and damage level. Modifications to the casualty rates can be made if necessary. The ELER Level 2 analysis will include calculation of direct monetary losses as a result of building damage, allowing for repair-cost estimations and specific investigations associated with earthquake insurance applications (PML and AAL estimations). ELER Level 2 loss results obtained for Istanbul for a scenario earthquake using different techniques will be presented, with comparisons using different earthquake damage assessment software. The urban earthquake shaking and loss information is intended for dissemination in a timely manner to related agencies for the planning and coordination of post-earthquake emergency response. However, the same software can also be used for scenario earthquake loss estimation, related Monte Carlo type simulations, and earthquake insurance applications.
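The Level 2 casualty computation described above (buildings per damage state times a casualty rate per building type and damage level) reduces to an element-wise product and sum. The building counts, rates, and occupancy below are illustrative numbers, not ELER data.

```python
import numpy as np

# Rows: building types in a geo-cell; columns: damage states.
#                      none  slight  moderate  extensive  complete
buildings = np.array([[5000, 1200,   400,      120,       30],   # masonry
                      [8000,  900,   250,       60,       10]])  # RC frame

# Casualty rate per occupied building, by type and damage state (made up).
rate = np.array([[0.0, 0.0, 1e-4, 1e-3, 1e-1],
                 [0.0, 0.0, 5e-5, 5e-4, 5e-2]])

occupancy = 4.0                                  # persons per building
casualties = occupancy * (buildings * rate).sum()
```

Direct monetary loss follows the same pattern, with a repair-cost-per-building matrix in place of the casualty rates.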
Study of small turbofan engines applicable to single-engine light airplanes. Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Merrill, G.L.
1976-09-01
The design, efficiency, and cost factors are investigated for the application of turbofan propulsion engines to single-engine general aviation light airplanes. A companion study of a hypothetical engine family, covering a thrust range suitable to such aircraft and having a high degree of commonality of design features and parts, is presented. Future turbofan-powered light airplanes can have lower fuel consumption, lower weight, reduced airframe maintenance requirements, and improved engine overhaul periods compared to current piston-engine-powered airplanes. Achievement of compliance with noise and chemical emission regulations is expected without impairing performance, operating cost, or safety.
Monitoring the Earthquake source process in North America
Herrmann, Robert B.; Benz, H.; Ammon, C.J.
2011-01-01
With the implementation of the USGS National Earthquake Information Center Prompt Assessment of Global Earthquakes for Response system (PAGER), rapid determination of earthquake moment magnitude is essential, especially for earthquakes that are felt within the contiguous United States. We report an implementation of moment tensor processing for application to broad, seismically active areas of North America. This effort focuses on the selection of regional crustal velocity models, codification of data quality tests, and the development of procedures for rapid computation of the seismic moment tensor. We systematically apply these techniques to earthquakes with reported magnitude greater than 3.5 in continental North America that are not associated with a tectonic plate boundary. Using the 0.02-0.10 Hz passband, we can usually determine, with few exceptions, moment tensor solutions for earthquakes with Mw as small as 3.7. The threshold is significantly influenced by the density of stations, the location of the earthquake relative to the seismic stations and, of course, the signal-to-noise ratio. With the existing permanent broadband stations in North America operated for rapid earthquake response, the seismic moment tensor of most earthquakes that are Mw 4 or larger can be routinely computed. As expected, the nonuniform spatial pattern of these solutions reflects the seismicity pattern. However, the orientation of the direction of maximum compressive stress and the predominant style of faulting is spatially coherent across large regions of the continent.
Universal Recurrence Time Statistics of Characteristic Earthquakes
NASA Astrophysics Data System (ADS)
Goltz, C.; Turcotte, D. L.; Abaimov, S.; Nadeau, R. M.
2006-12-01
Characteristic earthquakes are defined to occur quasi-periodically on major faults. Do recurrence time statistics of such earthquakes follow a particular statistical distribution? If so, which one? The answer is fundamental and has important implications for hazard assessment. The problem cannot be solved by comparing the goodness of statistical fits, as the available sequences are too short. The Parkfield sequence of M ≍ 6 earthquakes, one of the most extensive reliable data sets available, had grown to merely seven events with the last earthquake in 2004, for example. Recently, however, advances in seismological monitoring and improved processing methods have unveiled so-called micro-repeaters: micro-earthquakes that recur at exactly the same location on a fault. It seems plausible to regard these earthquakes as a miniature version of the classic characteristic earthquakes. Micro-repeaters are much more frequent than major earthquakes, leading to longer sequences for analysis. Due to their recent discovery, however, available sequences contain fewer than 20 events at present. In this paper we present results for the analysis of recurrence times for several micro-repeater sequences from Parkfield and adjacent regions. To improve the statistical significance of our findings, we combine several sequences into one by rescaling the individual sets by their respective mean recurrence intervals and Weibull exponents. This novel approach of rescaled combination yields the most extensive data set possible. We find that the resulting statistics can be fitted well by an exponential distribution, confirming the universal applicability of the Weibull distribution to characteristic earthquakes. A similar result is, however, obtained from rescaled combination with regard to the lognormal distribution.
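The rescaled-combination step described above can be sketched in a few lines: each sequence is divided by its own mean recurrence interval, so sequences from different faults become directly comparable, and the pooled unit-mean sample can then be compared against an exponential CDF (a Weibull distribution with shape parameter 1). The function names and toy data are illustrative, not from the paper:

```python
import math

def rescale_and_pool(sequences):
    """Rescale each recurrence-time sequence by its own mean, then pool.

    After rescaling, every sequence has unit mean, so sequences with very
    different absolute recurrence intervals can be combined into one sample.
    """
    pooled = []
    for seq in sequences:
        mean = sum(seq) / len(seq)
        pooled.extend(t / mean for t in seq)
    return pooled

def exponential_cdf(t, rate=1.0):
    """CDF of the exponential distribution (Weibull with shape beta = 1)."""
    return 1.0 - math.exp(-rate * t)

# Two toy sequences with very different mean recurrence intervals:
pooled = rescale_and_pool([[10.0, 20.0, 30.0], [1.0, 2.0, 3.0]])
print(round(sum(pooled) / len(pooled), 6))  # the pooled sample has unit mean
```

A full analysis would additionally rescale by the fitted Weibull exponents, as the abstract notes; the sketch shows only the mean-interval normalization.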
Applications of smart materials in structural engineering.
DOT National Transportation Integrated Search
2003-10-01
With the development of materials and technology, many new materials find their applications in civil engineering to deal with the deteriorating infrastructure. Smart material is a promising example that deserves a wide focus, from research to applic...
Advances in polymeric systems for tissue engineering and biomedical applications.
Ravichandran, Rajeswari; Sundarrajan, Subramanian; Venugopal, Jayarama Reddy; Mukherjee, Shayanti; Ramakrishna, Seeram
2012-03-01
The characteristics of tissue engineered scaffolds are major concerns in the quest to fabricate ideal scaffolds for tissue engineering applications. The polymer scaffolds employed for tissue engineering applications should possess multifunctional properties such as biocompatibility, biodegradability and favorable mechanical properties, as they come into direct contact with body fluids in vivo. Additionally, the polymer system should also possess biomimetic architecture and should support stem cell adhesion, proliferation and differentiation. As progress in polymer technology continues, polymeric biomaterials have taken on characteristics more closely related to those desired for tissue engineering and clinical needs. Stimuli-responsive polymers, also termed smart biomaterials, respond to stimuli such as pH, temperature, enzymes, antigens, glucose and electrical stimuli that are inherently present in living systems. This review highlights the exciting advancements in these polymeric systems that relate to biological and tissue engineering applications. Additionally, several aspects of technology, namely scaffold fabrication methods and surface modifications to confer biological functionality to the polymers, have also been discussed. The ultimate objective is to emphasize these underutilized adaptive behaviors of the polymers so that novel applications and new generations of smart polymeric materials can be realized for biomedical and tissue engineering applications. Copyright © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Operational earthquake forecasting can enhance earthquake preparedness
Jordan, T.H.; Marzocchi, W.; Michael, A.J.; Gerstenberger, M.C.
2014-01-01
We cannot yet predict large earthquakes in the short term with much reliability and skill, but the strong clustering exhibited in seismic sequences tells us that earthquake probabilities are not constant in time; they generally rise and fall over periods of days to years in correlation with nearby seismic activity. Operational earthquake forecasting (OEF) is the dissemination of authoritative information about these time-dependent probabilities to help communities prepare for potentially destructive earthquakes. The goal of OEF is to inform the decisions that people and organizations must continually make to mitigate seismic risk on time scales from days to decades. To fulfill this role, OEF must provide a complete description of the seismic hazard—ground-motion exceedance probabilities as well as short-term rupture probabilities—in concert with the long-term forecasts of probabilistic seismic-hazard analysis (PSHA).
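A minimal sketch of how a time-dependent rate translates into the occurrence probabilities that OEF disseminates, assuming a Poisson occurrence model (an assumption of this illustration, not a claim from the abstract):

```python
import math

def exceedance_probability(annual_rate: float, years: float) -> float:
    """Probability of at least one event in a time window, assuming a
    Poisson occurrence model with the given (possibly time-updated) rate."""
    return 1.0 - math.exp(-annual_rate * years)

# A tenfold, clustering-driven rate increase raises the one-week probability:
print(exceedance_probability(0.01, 7 / 365.25))
print(exceedance_probability(0.10, 7 / 365.25))
```

When the rate is updated after nearby seismicity, the same formula converts the new rate into an updated short-term probability, which is the quantity OEF communicates.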
Characterization of intermittency in renewal processes: Application to earthquakes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Akimoto, Takuma; Hasumi, Tomohiro; Aizawa, Yoji
2010-03-15
We construct a one-dimensional piecewise linear intermittent map from the interevent time distribution for a given renewal process. Then, we characterize intermittency by the asymptotic behavior near the indifferent fixed point in the piecewise linear intermittent map. Thus, we provide a unified framework for characterizing intermittency and also present the Lyapunov exponent for renewal processes. This method is applied to the occurrence of earthquakes using the Japan Meteorological Agency and the National Earthquake Information Center catalogs. By analyzing the return map of interevent times, we find that interevent times are not independent and identically distributed random variables, but that the conditional probability distribution functions in the tail obey the Weibull distribution.
Incorporating unnatural amino acids to engineer biocatalysts for industrial bioprocess applications.
Ravikumar, Yuvaraj; Nadarajan, Saravanan Prabhu; Hyeon Yoo, Tae; Lee, Chong-Soon; Yun, Hyungdon
2015-12-01
Bioprocess engineering with biocatalysts broadly spans the development of enzymes and their actual application in an industrial context. Recently, both the use of bioprocess engineering and the development and employment of enzyme engineering techniques have been increasing rapidly. Importantly, engineering techniques that incorporate unnatural amino acids (UAAs) in vivo have begun to produce enzymes with greater stability and altered catalytic properties. Despite the growth of this technique, its potential value in bioprocess applications remains to be fully exploited. In this review, we explore the methodologies involved in UAA incorporation as well as ways to synthesize these UAAs. In addition, we summarize recent efforts to increase the yield of UAA-engineered proteins in Escherichia coli and also the application of this tool in enzyme engineering. Furthermore, this protein engineering tool based on the incorporation of UAAs can be used to develop immobilized enzymes that are ideal for bioprocess applications. Considering the potential of this tool and by exploiting these engineered enzymes, we expect the field of bioprocess engineering to open up new opportunities for biocatalysis in the near future. Copyright © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
A Hybrid Ground-Motion Prediction Equation for Earthquakes in Western Alberta
NASA Astrophysics Data System (ADS)
Spriggs, N.; Yenier, E.; Law, A.; Moores, A. O.
2015-12-01
Estimation of ground-motion amplitudes that may be produced by future earthquakes constitutes the foundation of seismic hazard assessment and earthquake-resistant structural design. This is typically done by using a prediction equation that quantifies amplitudes as a function of key seismological variables such as magnitude, distance and site condition. In this study, we develop a hybrid empirical prediction equation for earthquakes in western Alberta, where evaluation of seismic hazard associated with induced seismicity is of particular interest. We use peak ground motions and response spectra from recorded seismic events to model the regional source and attenuation attributes. The available empirical data are limited in the magnitude range of engineering interest (M>4). Therefore, we combine empirical data with a simulation-based model in order to obtain seismologically informed predictions for moderate-to-large magnitude events. The methodology is two-fold. First, we investigate the shape of geometrical spreading in Alberta. We supplement the seismic data with ground motions obtained from mining/quarry blasts, in order to gain insights into the regional attenuation over a wide distance range. A comparison of ground-motion amplitudes for earthquakes and mining/quarry blasts shows that both event types decay at similar rates with distance and demonstrate a significant Moho-bounce effect. In the second stage, we calibrate the source and attenuation parameters of a simulation-based prediction equation to match the available amplitude data from seismic events. We model the geometrical spreading using a trilinear function with attenuation rates obtained from the first stage, and calculate coefficients of anelastic attenuation and site amplification via regression analysis. This provides a hybrid ground-motion prediction equation that is calibrated for observed motions in western Alberta and is applicable to moderate-to-large magnitude events.
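The trilinear geometrical-spreading function mentioned above can be sketched as a piecewise-linear term in log10(distance), with a flatter middle segment representing the Moho-bounce effect. The hinge distances and slopes below are illustrative assumptions, not the calibrated Alberta values:

```python
import math

def trilinear_spreading(r_km, b=(-1.3, 0.2, -0.5), r1=70.0, r2=140.0):
    """Trilinear geometrical-spreading term in log10 amplitude units.

    Piecewise-linear in log10(distance): decay at rate b[0] out to r1, a
    flatter (here slightly positive, 'Moho bounce') segment to r2, and
    decay at rate b[2] beyond. Hinge distances and slopes are
    illustrative assumptions only.
    """
    if r_km <= r1:
        return b[0] * math.log10(r_km)
    if r_km <= r2:
        return b[0] * math.log10(r1) + b[1] * math.log10(r_km / r1)
    return (b[0] * math.log10(r1) + b[1] * math.log10(r2 / r1)
            + b[2] * math.log10(r_km / r2))
```

Writing each segment relative to the previous hinge keeps the function continuous, which is required before regressing the remaining anelastic-attenuation and site terms.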
Object-oriented microcomputer software for earthquake seismology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kroeger, G.C.
1993-02-01
A suite of graphically interactive applications for the retrieval, editing, and modeling of earthquake seismograms has been developed using object-oriented programming methodology and the C++ language. Retriever is an application which allows the user to search for, browse, and extract seismic data from CD-ROMs produced by the National Earthquake Information Center (NEIC). The user can restrict the date, size, location and depth of desired earthquakes and extract selected data into a variety of common seismic file formats. Reformer is an application that allows the user to edit seismic data and data headers, and perform a variety of signal processing operations on those data. Synthesizer is a program for the generation and analysis of teleseismic P and SH synthetic seismograms. The program provides graphical manipulation of source parameters, crustal structures and seismograms, as well as near real-time response in generating synthetics for arbitrary flat-layered crustal structures. All three applications use class libraries developed for implementing geologic and seismic objects and views. Standard seismogram view objects and objects that encapsulate the reading and writing of different seismic data file formats are shared by all three applications. The focal mechanism views in Synthesizer are based on a generic stereonet view object. Interaction with the native graphical user interface is encapsulated in a class library in order to simplify the porting of the software to different operating systems and application programming interfaces. The software was developed on the Apple Macintosh and is being ported to UNIX/X-Window platforms.
New ideas about the physics of earthquakes
NASA Astrophysics Data System (ADS)
Rundle, John B.; Klein, William
1995-07-01
It may be no exaggeration to claim that this most recent quadrennium has seen more controversy, and thus more progress, in understanding the physics of earthquakes than any in recent memory. The most interesting development has clearly been the emergence of a large community of condensed matter physicists around the world who have begun working on the problem of earthquake physics. These scientists bring to the study of earthquakes an entirely new viewpoint, grounded in the physics of nucleation and critical phenomena in thermal, magnetic, and other systems. Moreover, a surprising technology transfer from geophysics to other fields has been made possible by the realization that models originally proposed to explain self-organization in earthquakes can also be used to explain similar processes in problems as disparate as brain dynamics in neurobiology (Hopfield, 1994) and charge density waves in solids (Brown and Gruner, 1994). An entirely new sub-discipline is emerging that is focused around the development and analysis of large scale numerical simulations of the dynamics of faults. At the same time, intriguing new laboratory and field data, together with insightful physical reasoning, have led to significant advances in our understanding of earthquake source physics. As a consequence, we can anticipate substantial improvement in our ability to understand the nature of earthquake occurrence. Moreover, while much research in the area of earthquake physics is fundamental in character, the results have many potential applications (Cornell et al., 1993) in the areas of earthquake risk and hazard analysis, and seismic zonation.
46 CFR 113.35-15 - Mechanical engine order telegraph systems; application.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 46 Shipping 4 2010-10-01 2010-10-01 false Mechanical engine order telegraph systems; application...-15 Mechanical engine order telegraph systems; application. If a mechanical engine order telegraph... cables or other mechanical limitations must not prevent the efficient operation of the system. ...
Report on the 2010 Chilean earthquake and tsunami response
2011-01-01
In July 2010, in an effort to reduce future catastrophic natural disaster losses for California, the American Red Cross coordinated and sent a delegation of 20 multidisciplinary experts on earthquake response and recovery to Chile. The primary goal was to understand how the Chilean society and relevant organizations responded to the magnitude 8.8 Maule earthquake that struck the region on February 27, 2010, as well as how an application of these lessons could better prepare California communities, response partners and state emergency partners for a comparable situation. Similarities in building codes, socioeconomic conditions, and broad extent of the strong shaking make the Chilean earthquake a very close analog to the impact of future great earthquakes on California. To withstand and recover from natural and human-caused disasters, it is essential for citizens and communities to work together to anticipate threats, limit effects, and rapidly restore functionality after a crisis. The delegation was hosted by the Chilean Red Cross and received extensive briefings from both national and local Red Cross officials. During nine days in Chile, the delegation also met with officials at the national, regional, and local government levels. Technical briefings were received from the President’s Emergency Committee, emergency managers from ONEMI (comparable to FEMA), structural engineers, a seismologist, hospital administrators, firefighters, and the United Nations team in Chile. Cities visited include Santiago, Talca, Constitución, Concepción, Talcahuano, Tumbes, and Cauquenes. The American Red Cross Multidisciplinary Team consisted of subject matter experts, who carried out special investigations in five Teams on the (1) science and engineering findings, (2) medical services, (3) emergency services, (4) volunteer management, and (5) executive and management issues (see appendix A for a full list of participants and their titles and teams). While developing this
Applications of yeast surface display for protein engineering
Cherf, Gerald M.; Cochran, Jennifer R.
2015-01-01
The method of displaying recombinant proteins on the surface of Saccharomyces cerevisiae via genetic fusion to an abundant cell wall protein, a technology known as yeast surface display, or simply, yeast display, has become a valuable protein engineering tool for a broad spectrum of biotechnology and biomedical applications. This review focuses on the use of yeast display for engineering protein affinity, stability, and enzymatic activity. Strategies and examples for each protein engineering goal are discussed. Additional applications of yeast display are also briefly presented, including protein epitope mapping, identification of protein-protein interactions, and uses of displayed proteins in industry and medicine. PMID:26060074
Engineering noble metal nanomaterials for environmental applications
NASA Astrophysics Data System (ADS)
Li, Jingguo; Zhao, Tingting; Chen, Tiankai; Liu, Yanbiao; Ong, Choon Nam; Xie, Jianping
2015-04-01
Besides being valuable assets in our daily lives, noble metals (namely, gold, silver, and platinum) also feature many intriguing physical and chemical properties when their sizes are reduced to the nano- or even subnano-scale; such assets may significantly increase the value of noble metals as functional materials for tackling important societal issues related to human health and the environment. Among these, the design and engineering of noble metal nanomaterials (NMNs) to address challenging issues in the environment has attracted recent interest in the community. In general, the use of NMNs for environmental applications is highly dependent on the physical and chemical properties of NMNs. Such properties can be readily controlled by tailoring the attributes of NMNs, including their size, shape, composition, and surface. In this feature article, we discuss recent progress in the rational design and engineering of NMNs with particular focus on their applications in the field of environmental sensing and catalysis. The development of functional NMNs for environmental applications is highly interdisciplinary, requiring concerted efforts from the communities of materials science, chemistry, engineering, and environmental science.
NASA Astrophysics Data System (ADS)
Sadeghi, H.
2015-12-01
Bridges are major elements of infrastructure in all societies. Their safety and continued serviceability guarantee transportation and emergency access in urban and rural areas. However, these important structures are subject to earthquake-induced damage in their structures and foundations. The basic approaches to the proper support of foundations are (a) distribution of imposed loads to the foundation in a way that it can resist those loads without excessive settlement or failure; (b) modification of the foundation ground with various available methods; and (c) a combination of (a) and (b). Engineers face the task of designing foundations that meet all safety and serviceability criteria, but when there are numerous environmental and financial constraints, the use of some traditional methods becomes inevitable. This paper explains the application of timber piles to improve ground resistance to liquefaction and to secure the abutments of short-to-medium-length bridges in an earthquake- and liquefaction-prone area on Bohol Island, Philippines. The limitations of the common ground improvement methods (i.e., injection, dynamic compaction), arising from either environmental or financial concerns, along with the abundance of timber in the area, led the engineers to use a network of timber piles behind the backwalls of the bridge abutments. The suggested timber pile network is simulated by numerical methods and its safety is examined. The results show that the compaction caused by driving of the piles and the bearing capacity provided by the timbers reduce the settlement and lateral movements due to service and earthquake-induced loads.
NASA Astrophysics Data System (ADS)
Ambroglini, Filippo; Jerome Burger, William; Battiston, Roberto; Vitale, Vincenzo; Zhang, Yu
2014-05-01
During recent decades, a few space experiments have revealed anomalous bursts of charged particles, mainly electrons with energies larger than a few MeV. A possible source of these bursts is low-frequency seismo-electromagnetic emissions, which can cause the precipitation of electrons from the lower boundary of the inner radiation belt. Studies of these bursts have also reported a short-term pre-seismic excess. Starting from simulation tools traditionally used in high energy physics, we developed a dedicated application, SEPS (Space Perturbation Earthquake Simulation), based on the Geant4 toolkit and the PLANETOCOSMICS program, able to model and simulate the electromagnetic interaction between an earthquake and the particles trapped in the inner Van Allen belt. With SEPS one can study the transport of particles trapped in the Van Allen belts through the Earth's magnetic field, also taking into account possible interactions with the Earth's atmosphere. SEPS provides the possibility of testing different models of interaction between electromagnetic waves and trapped particles, defining the mechanism of interaction as well as shaping the area in which it takes place, assessing the effects of perturbations in the magnetic field on the particle paths, performing back-tracking analysis, and modelling the interaction with electric fields. SEPS is at an advanced development stage, so it can already be exploited to test in detail the results of correlation analyses between particle bursts and earthquakes based on NOAA and SAMPEX data. The test was performed both with a full simulation analysis (tracing from the position of the earthquake to check whether there were paths compatible with the detected burst) and with a back-tracking analysis (tracing from the burst detection point and checking the compatibility with the position of the associated earthquake).
Photonics Applications and Web Engineering: WILGA 2017
NASA Astrophysics Data System (ADS)
Romaniuk, Ryszard S.
2017-08-01
The XLth Wilga Summer 2017 Symposium on Photonics Applications and Web Engineering was held on 28 May-4 June 2017. The Symposium gathered over 350 participants, mainly young researchers active in optics, optoelectronics, photonics, modern optics, mechatronics, applied physics, electronics technologies and applications. Around 300 oral and poster papers were presented in a few main topical tracks, which are traditional for Wilga, including: bio-photonics, optical sensory networks, photonics-electronics-mechatronics co-design and integration, large functional system design and maintenance, Internet of Things, measurement systems for astronomy, high energy physics experiments, and others. The paper is a traditional introduction to the 2017 WILGA Summer Symposium Proceedings, and digests some of the Symposium's chosen key presentations. This year's Symposium was divided into the following topical sessions/conferences: Optics, Optoelectronics and Photonics; Computational and Artificial Intelligence; Biomedical Applications; Astronomical and High Energy Physics Experiments Applications; Material Research and Engineering; and Advanced Photonics and Electronics Applications in Research and Industry.
Microbiome engineering: Current applications and its future.
Foo, Jee Loon; Ling, Hua; Lee, Yung Seng; Chang, Matthew Wook
2017-03-01
Microbiomes exist in all ecosystems and are composed of diverse microbial communities. Perturbation to microbiomes brings about undesirable phenotypes in the hosts, resulting in diseases and disorders, and disturbs the balance of the associated ecosystems. Engineering of microbiomes can be used to modify the structure of the microbiota and restore ecological balance. Consequently, microbiome engineering has been employed for improving human health and agricultural productivity. The importance and current applications of microbiome engineering, particularly in humans, animals, plants and soil, are reviewed. Furthermore, we explore the challenges in engineering microbiomes and the future of this field, thus providing perspectives and an outlook on microbiome engineering. Copyright © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
THE GREAT SOUTHERN CALIFORNIA SHAKEOUT: Earthquake Science for 22 Million People
NASA Astrophysics Data System (ADS)
Jones, L.; Cox, D.; Perry, S.; Hudnut, K.; Benthien, M.; Bwarie, J.; Vinci, M.; Buchanan, M.; Long, K.; Sinha, S.; Collins, L.
2008-12-01
Earthquake science is being communicated to and used by the 22 million residents of southern California to improve resiliency to future earthquakes through the Great Southern California ShakeOut. The ShakeOut began when the USGS partnered with the California Geological Survey, the Southern California Earthquake Center, and many other organizations to bring 300 scientists and engineers together to formulate a comprehensive description of a plausible major earthquake, released in May 2008 as the ShakeOut Scenario, a description of the impacts and consequences of a M7.8 earthquake on the southern San Andreas Fault (USGS OFR2008-1150). The Great Southern California ShakeOut was a week of special events featuring the largest earthquake drill in United States history. The ShakeOut drill occurred in houses, businesses, and public spaces throughout southern California at 10AM on November 13, 2008, when southern Californians were asked to pretend that the M7.8 scenario earthquake had occurred and to practice actions that could reduce the impact on their lives. Residents, organizations, schools and businesses registered to participate in the drill through www.shakeout.org, where they could get accessible information about the scenario earthquake and share ideas for better preparation. As of September 8, 2008, over 2.7 million confirmed participants had been registered. The primary message of the ShakeOut is that what we do now, before a big earthquake, will determine what our lives will be like after. The goal of the ShakeOut has been to change the culture of earthquake preparedness in southern California, making earthquakes a regularly discussed reality. This implements the sociological finding that 'milling,' discussing a problem with loved ones, is a prerequisite to taking action. ShakeOut milling is taking place at all levels, from individuals and families to corporations and governments. Actions taken as a result of the ShakeOut include the adoption of earthquake
Composite engines for application to a single-stage-to-orbit vehicle
NASA Technical Reports Server (NTRS)
Bendot, J. G.; Brown, P. N.; Piercy, T. G.
1975-01-01
Seven composite engines were designed for application to a reusable single-stage-to-orbit vehicle. The engine designs were variations of the supercharged ejector ramjet engine. The resulting performance, weight, and drawings of each engine form a database for establishing the potential of this class of composite engine for various missions, including the single-stage-to-orbit application. The impact of advanced technology on the design of the critical fan turbine was established.
NASA Astrophysics Data System (ADS)
Li, Shuang; Yu, Xiaohui; Zhang, Yanjuan; Zhai, Changhai
2018-01-01
Casualty prediction in a building during earthquakes supports economic loss estimation within the performance-based earthquake engineering methodology. Although post-earthquake observations reveal that evacuation affects the number of occupant casualties during earthquakes, few current studies consider occupant movements within the building in casualty prediction procedures. To bridge this knowledge gap, a numerical simulation method using a refined cellular automata model is presented, which can describe various occupant dynamic behaviors and building dimensions. The simulation of occupant evacuation is verified against a recorded evacuation process from a school classroom in the real-life 2013 Ya'an earthquake in China. Occupant casualties in a building under earthquakes are evaluated by coupling the building collapse simulation by the finite element method, the occupant evacuation simulation, and casualty occurrence criteria, with time and space synchronization. A case study of casualty prediction in a building during an earthquake is provided to demonstrate the effect of occupant movements on casualty prediction.
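As a toy illustration of the cellular-automata idea, the following one-dimensional sketch moves occupants one cell per time step toward an exit, with at most one occupant per cell. It is a minimal stand-in, not the refined model of the paper:

```python
def evacuate(width, occupants, exit_col=0, max_steps=100):
    """Minimal 1-D cellular-automaton sketch of evacuation toward an exit.

    Each cell holds at most one occupant; each time step, an occupant
    moves one cell toward exit_col if that cell is free, and an occupant
    at exit_col leaves the corridor. Returns the number of steps until
    the corridor is empty.
    """
    cells = [False] * width
    for pos in occupants:
        cells[pos] = True
    steps = 0
    while any(cells) and steps < max_steps:
        # Sweep from the exit outward, so each occupant moves at most once.
        for pos in range(width):
            if cells[pos]:
                if pos == exit_col:
                    cells[pos] = False  # occupant exits the corridor
                elif not cells[pos - 1]:
                    cells[pos], cells[pos - 1] = False, True
        steps += 1
    return steps
```

A refined model would add a 2-D floor plan, heterogeneous movement speeds, and coupling to the collapse simulation, as the abstract describes; the sketch only conveys the update rule.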
Seismic gaps and source zones of recent large earthquakes in coastal Peru
Dewey, J.W.; Spence, W.
1979-01-01
The earthquakes of central coastal Peru occur principally in two distinct zones of shallow earthquake activity that are inland of and parallel to the axis of the Peru Trench. The interface-thrust (IT) zone includes the great thrust-fault earthquakes of 17 October 1966 and 3 October 1974. The coastal-plate interior (CPI) zone includes the great earthquake of 31 May 1970, and is located about 50 km inland of and 30 km deeper than the interface thrust zone. The occurrence of a large earthquake in one zone may not relieve elastic strain in the adjoining zone, thus complicating the application of the seismic gap concept to central coastal Peru. However, recognition of two seismic zones may facilitate detection of seismicity precursory to a large earthquake in a given zone; removal of probable CPI-zone earthquakes from plots of seismicity prior to the 1974 main shock dramatically emphasizes the high seismic activity near the rupture zone of that earthquake in the five years preceding the main shock. Other conclusions on the seismicity of coastal Peru that affect the application of the seismic gap concept to this region are: (1) Aftershocks of the great earthquakes of 1966, 1970, and 1974 occurred in spatially separated clusters. Some clusters may represent distinct small source regions triggered by the main shock rather than delimiting the total extent of main-shock rupture. The uncertainty in the interpretation of aftershock clusters results in corresponding uncertainties in estimates of stress drop and estimates of the dimensions of the seismic gap that has been filled by a major earthquake. (2) Aftershocks of the great thrust-fault earthquakes of 1966 and 1974 generally did not extend seaward as far as the Peru Trench. (3) None of the three great earthquakes produced significant teleseismic activity in the following month in the source regions of the other two earthquakes. The earthquake hypocenters that form the basis of this study were relocated using station
Boxberger, Tobias; Fleming, Kevin; Pittore, Massimiliano; Parolai, Stefano; Pilz, Marco; Mikulla, Stefan
2017-10-20
The Multi-Parameter Wireless Sensing (MPwise) system is an innovative instrumental design that allows different sensor types to be combined with relatively high-performance computing and communications components. These units, which incorporate off-the-shelf components, can undertake complex information integration and processing tasks at the individual unit or node level (when used in a network), allowing the establishment of networks that are linked by advanced, robust and rapid communications routing and network topologies. The system (and its predecessors) was originally designed for earthquake risk mitigation, including earthquake early warning (EEW), rapid response actions, structural health monitoring, and site-effect characterization. For EEW, MPwise units are capable of on-site, decentralized, independent analysis of the recorded ground motion and based on this, may issue an appropriate warning, either by the unit itself or transmitted throughout a network by dedicated alarming procedures. The multi-sensor capabilities of the system allow it to be instrumented with standard strong- and weak-motion sensors, broadband sensors, MEMS (namely accelerometers), cameras, temperature and humidity sensors, and GNSS receivers. In this work, the MPwise hardware, software and communications schema are described, as well as an overview of its possible applications. While focusing on earthquake risk mitigation actions, the aim in the future is to expand its capabilities towards a more multi-hazard and risk mitigation role. Overall, MPwise offers considerable flexibility and has great potential in contributing to natural hazard risk mitigation.
ENGINEERING APPLICATIONS OF ANALOG COMPUTERS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bryant, L.T.; Janicke, M.J.; Just, L.C.
1963-10-31
Seven experiments from the fields of reactor engineering, heat transfer, and dynamics are presented to illustrate the engineering applications of analog computers. The steps required for producing the analog solution are shown, as well as complete information for duplicating the solution. Graphical results are provided. The experiments include: deceleration of a reactor control rod, pressure variations through a packed bed, reactor kinetics over many decades with thermal feedback, a vibrating system with two degrees of freedom, temperature distribution in a radiating fin, temperature distribution in an infinite slab considering variable thermal properties, and iodine-xenon buildup in a reactor. (M.C.G.)
Real-time earthquake shake, damage, and loss mapping for Istanbul metropolitan area
NASA Astrophysics Data System (ADS)
Zülfikar, A. Can; Fercan, N. Özge Zülfikar; Tunç, Süleyman; Erdik, Mustafa
2017-01-01
The past devastating earthquakes in densely populated urban centers, such as the 1994 Northridge; 1995 Kobe; 1999 series of Kocaeli, Düzce, and Athens; and 2011 Van-Erciş events, showed that substantial social and economic losses can be expected. Previous studies indicate that inadequate emergency response can increase the number of casualties by a maximum factor of 10, which suggests the need for research on rapid earthquake shaking damage and loss estimation. The reduction in casualties in urban areas immediately following an earthquake can be improved if the location and severity of damages can be rapidly assessed by information from rapid response systems. In this context, a research project (TUBITAK-109M734) titled "Real-time Information of Earthquake Shaking, Damage, and Losses for Target Cities of Thessaloniki and Istanbul" was conducted during 2011-2014 to establish the rapid estimation of ground motion shaking and related earthquake damages and casualties for the target cities. In the present study, application to Istanbul metropolitan area is presented. In order to fulfill this objective, earthquake hazard and risk assessment methodology known as Earthquake Loss Estimation Routine, which was developed for the Euro-Mediterranean region within the Network of Research Infrastructures for European Seismology EC-FP6 project, was used. The current application to the Istanbul metropolitan area provides real-time ground motion information obtained by strong motion stations distributed throughout the densely populated areas of the city. According to this ground motion information, building damage estimation is computed by using grid-based building inventory, and the related loss is then estimated. Through this application, the rapidly estimated information enables public and private emergency management authorities to take action and allocate and prioritize resources to minimize the casualties in urban areas during immediate post-earthquake periods. Moreover, it
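The estimation chain described above (station ground motion → grid-based building inventory → damage → loss) can be sketched with a minimal fragility calculation. This is an illustration only, not the ELER implementation; the lognormal fragility parameters, grid PGA values, and building counts below are invented for the example:

```python
import math

def lognormal_fragility(pga_g, median_g, beta):
    """P(reaching the damage state | PGA), standard lognormal fragility curve."""
    return 0.5 * (1.0 + math.erf(math.log(pga_g / median_g) / (beta * math.sqrt(2.0))))

# Hypothetical grid cells: (PGA in g, number of buildings of one model type).
cells = [(0.10, 500), (0.25, 800), (0.40, 300)]

# Hypothetical fragility for an "extensive damage" state: median 0.35 g, log-std 0.6.
MEDIAN_G, BETA = 0.35, 0.6

expected_damaged = sum(n * lognormal_fragility(pga, MEDIAN_G, BETA) for pga, n in cells)
print(round(expected_damaged, 1))
```

A real-time system would refresh the PGA values from the strong-motion network after each event and re-run the same loop over the full inventory.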
Assessment of earthquake effects - contribution from online communication
NASA Astrophysics Data System (ADS)
D'Amico, Sebastiano; Agius, Matthew; Galea, Pauline
2014-05-01
The rapid growth of social media and online newspapers in recent years has made possible a national investigation of macroseismic effects on the Maltese Islands based on felt earthquake reports. A magnitude 4.1 earthquake struck close to Malta on Sunday, 24 April 2011, at 13:02 GMT. The earthquake was preceded and followed by a series of smaller-magnitude quakes throughout the day, most of which were felt by locals on the island. The continuous news media coverage during the day and the extensive sharing of the news item on social media resulted in a strong public response to fill in the 'Did you feel it?' online form on the website of the Seismic Monitoring and Research Unit (SMRU) at the University of Malta (http://seismic.research.um.edu.mt/). The results yield interesting information about the demographics of the island and the different felt experiences, possibly relating to geological settings and to buildings of diverse construction and age. Based on this case study, the SMRU is in the process of developing a mobile phone application dedicated to sharing earthquake information with the local community. The application will automatically prompt users to fill in a simplified 'Did you feel it?' report for potentially felt earthquakes. Automatic location using Global Positioning Systems can be incorporated to provide a 'real-time' intensity map for use by the Civil Protection Department.
Earthquake insurance pricing: a risk-based approach.
Lin, Jeng-Hsiang
2018-04-01
Flat earthquake premiums are 'uniformly' set for a variety of buildings in many countries, neglecting the fact that the risk of earthquake damage to a building depends on a wide range of factors. How these factors influence insurance premiums is worth studying further. Proposed herein is a risk-based approach to estimating the earthquake insurance rates of buildings. The approach was applied to buildings located in Taipei City, Taiwan, and the earthquake insurance rates for the buildings investigated were calculated and tabulated. For insurance rating, the buildings were classified into 15 model building types according to their construction materials and height. Seismic design levels were also considered in the rating, reflecting the seismic zone and construction year of each building. This paper may be of interest to insurers, actuaries, and the private and public sectors of insurance. © 2018 The Author(s). Disasters © Overseas Development Institute, 2018.
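The core of a risk-based rate of this kind is an expected-annual-loss calculation per building class. The sketch below is a minimal illustration, not the paper's calibrated model; the scenario rates, damage ratios, and loading factor are hypothetical:

```python
# Hypothetical hazard-vulnerability pairs for one model building type:
# (annual rate of the ground-motion scenario, mean damage ratio given that scenario).
scenarios = [(0.01, 0.05), (0.002, 0.30), (0.0004, 0.70)]

LOADING = 1.35  # hypothetical expense/profit loading multiplier

# Pure premium rate = expected annual loss per unit of insured value.
pure_rate = sum(rate * dr for rate, dr in scenarios)
gross_rate = pure_rate * LOADING

print(f"pure rate: {pure_rate:.5f}, gross rate: {gross_rate:.5f}")
```

Varying the scenario table by building type and seismic design level is what differentiates the resulting rates, in contrast to a flat premium.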
Prospective testing of Coulomb short-term earthquake forecasts
NASA Astrophysics Data System (ADS)
Jackson, D. D.; Kagan, Y. Y.; Schorlemmer, D.; Zechar, J. D.; Wang, Q.; Wong, K.
2009-12-01
Earthquake-induced Coulomb stresses, whether static or dynamic, suddenly change the probability of future earthquakes. Models to estimate stress and the resulting seismicity changes could help to illuminate earthquake physics and guide appropriate precautionary response. But do these models have improved forecasting power compared to empirical statistical models? The best answer lies in prospective testing, in which a fully specified model, with no subsequent parameter adjustments, is evaluated against future earthquakes. The Collaboratory for the Study of Earthquake Predictability (CSEP) facilitates such prospective testing of earthquake forecasts, including several short-term forecasts. Formulating Coulomb stress models for formal testing involves several practical problems, mostly shared with other short-term models. First, earthquake probabilities must be calculated after each “perpetrator” earthquake but before the triggered earthquakes, or “victims”. The time interval between a perpetrator and its victims may be very short, as characterized by the Omori law for aftershocks. CSEP evaluates short-term models daily and allows daily updates of the models. However, much can happen in a day. An alternative is to test and update models on the occurrence of each earthquake over a certain magnitude. To make such updates rapidly enough, and for them to qualify as prospective, earthquake focal mechanisms, slip distributions, stress patterns, and earthquake probabilities would have to be computed automatically, without human intervention. This scheme would be more appropriate for evaluating scientific ideas, but it may be less useful for practical applications than daily updates. Second, triggered earthquakes are imperfectly recorded following larger events because their seismic waves are buried in the coda of the earlier event. To solve this problem, testing methods need to allow for “censoring” of early aftershock data, along with a quantitative model for detection threshold as a function of
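The daily scoring of competing forecasts described above is commonly done with a gridded Poisson likelihood. The following is a toy sketch of that comparison, not CSEP's actual test suite; the per-bin expected counts and observations are hypothetical:

```python
import math

def poisson_loglik(rates, counts):
    """Joint log-likelihood of observed bin counts under independent Poisson forecasts."""
    return sum(-lam + n * math.log(lam) - math.lgamma(n + 1)
               for lam, n in zip(rates, counts))

# Hypothetical expected daily counts per spatial bin for two competing models,
# and the counts actually observed that day.
coulomb_model = [0.20, 0.05, 0.01]
empirical_model = [0.10, 0.10, 0.02]
observed = [1, 0, 0]

ll_coulomb = poisson_loglik(coulomb_model, observed)
ll_empirical = poisson_loglik(empirical_model, observed)
# Higher joint log-likelihood = better score for this testing day.
print(ll_coulomb > ll_empirical)  # → True
```

Accumulating such daily scores over months of prospective testing is what lets one model be declared significantly better than another.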
Recent Advances in Application of Biosensors in Tissue Engineering
Hasan, Anwarul; Nurunnabi, Md; Morshed, Mahboob; Paul, Arghya; Polini, Alessandro; Kuila, Tapas; Al Hariri, Moustafa; Lee, Yong-kyu; Jaffa, Ayad A.
2014-01-01
Biosensor research is a fast-growing field in which tens of thousands of papers have been published over the years, and the industry is now worth billions of dollars. Biosensor products have found applications in numerous industries, including food and beverage, agriculture, environmental monitoring, medical diagnostics, and pharmaceuticals. Even though numerous biosensors have been developed for the detection of proteins, peptides, enzymes, and many other biomolecules for diverse applications, their applications in tissue engineering have remained limited. In recent years, there has been growing interest in the application of novel biosensors in cell culture and tissue engineering, for example, for the real-time detection of small molecules such as glucose, lactose, and H2O2, as well as serum proteins of large molecular size, such as albumin and alpha-fetoprotein, and inflammatory cytokines, such as IFN-γ and TNF-α. In this review, we provide an overview of the recent advancements in biosensors for tissue engineering applications. PMID:25165697
Study of small turbofan engines applicable to general-aviation aircraft
NASA Technical Reports Server (NTRS)
Merrill, G. L.; Burnett, G. A.; Alsworth, C. C.
1973-01-01
The applicability of small turbofan engines to general aviation aircraft is discussed. The engine and engine/airplane performance, weight, size, and cost interrelationships are examined. The effects of specific engine noise constraints are evaluated. The factors inhibiting the use of turbofan engines in general aviation aircraft are identified.
Chapter C. The Loma Prieta, California, Earthquake of October 17, 1989 - Building Structures
Çelebi, Mehmet
1998-01-01
Several approaches are used to assess the performance of the built environment following an earthquake -- preliminary damage surveys conducted by professionals, detailed studies of individual structures, and statistical analyses of groups of structures. Reports of damage that are issued by many organizations immediately following an earthquake play a key role in directing subsequent detailed investigations. Detailed studies of individual structures and statistical analyses of groups of structures may be motivated by particularly good or bad performance during an earthquake. Beyond this, practicing engineers typically perform stress analyses to assess the performance of a particular structure to vibrational levels experienced during an earthquake. The levels may be determined from recorded or estimated ground motions; actual levels usually differ from design levels. If a structure has seismic instrumentation to record response data, the estimated and recorded response and behavior of the structure can be compared.
Hough, Susan E.
2013-01-01
The occurrence of three earthquakes with moment magnitude (Mw) greater than 8.8, and six earthquakes larger than Mw 8.5, since 2004 has raised interest in the long-term global rate of great earthquakes. Past studies have focused on the analysis of earthquakes since 1900, which roughly marks the start of the instrumental era in seismology. Before this time, the catalog is less complete and magnitude estimates are more uncertain. Yet substantial information is available for earthquakes before 1900, and the catalog of historical events is being used increasingly to improve hazard assessment. Here I consider the catalog of historical earthquakes and show that approximately half of all Mw ≥ 8.5 earthquakes are likely missing or underestimated in the 19th century. I further present a reconsideration of the felt effects of the 8 February 1843 Lesser Antilles earthquake, including a first thorough assessment of felt reports from the United States, and show it is an example of a known historical earthquake that was significantly larger than initially estimated. The results suggest that incorporation of the best available catalogs of historical earthquakes will likely lead to a significant increase in estimates of seismic hazard and/or the maximum possible magnitude in many regions, including parts of the Caribbean.
2013-02-01
[Report documentation page; recoverable details follow.] Kathmandu, Nepal. Contract number: W911NF-12-1-0282. In the past, big earthquakes in Nepal (see Figure 1.1) have caused a huge number of casualties and damage to structures. The Great Nepal-Bihar... UBC Earthquake Engineering Research Facility, 2235 East Mall, Vancouver, BC, Canada V6T 1Z4. Phone: 604 822-6203; Fax: 604 822-6901.
Losses to single-family housing from ground motions in the 1994 Northridge, California, earthquake
Wesson, R.L.; Perkins, D.M.; Leyendecker, E.V.; Roth, R.J.; Petersen, M.D.
2004-01-01
The distributions of insured losses to single-family housing following the 1994 Northridge, California, earthquake for 234 ZIP codes can be satisfactorily modeled with gamma distributions. Regressions of the parameters in the gamma distribution on estimates of ground motion, derived from ShakeMap estimates or from interpolated observations, provide a basis for developing curves of conditional probability of loss given a ground motion. Comparison of the resulting estimates of aggregate loss with the actual aggregate loss gives satisfactory agreement for several different ground-motion parameters. Estimates of loss based on a deterministic spatial model of the earthquake ground motion, using standard attenuation relationships and NEHRP soil factors, give satisfactory results for some ground-motion parameters if the input ground motions are increased about one and one-half standard deviations above the median, reflecting the fact that the ground motions for the Northridge earthquake tended to be higher than the median ground motion for other earthquakes with similar magnitude. The results give promise for making estimates of insured losses to a similar building stock under future earthquake loading. © 2004, Earthquake Engineering Research Institute.
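The gamma modeling described above can be illustrated with a toy method-of-moments fit. The loss ratios below are invented, and the study itself regresses the fitted gamma parameters on ground motion per ZIP code rather than fitting one small sample:

```python
# Hypothetical per-policy loss ratios (loss / insured value) in one ZIP code.
losses = [0.02, 0.05, 0.08, 0.03, 0.12, 0.06, 0.04, 0.09]

n = len(losses)
mean = sum(losses) / n
var = sum((x - mean) ** 2 for x in losses) / (n - 1)

# Gamma(k, theta) parameterization: mean = k * theta, variance = k * theta**2,
# so moment matching gives k = mean**2 / var and theta = var / mean.
shape = mean ** 2 / var   # k
scale = var / mean        # theta

print(f"shape k = {shape:.2f}, scale theta = {scale:.4f}")
```

Repeating the fit across ZIP codes and regressing `shape` and `scale` on the local ground-motion estimate is the step that yields the conditional loss curves mentioned in the abstract.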
Surface-Wave Relocation of Remote Continental Earthquakes
NASA Astrophysics Data System (ADS)
Kintner, J. A.; Ammon, C. J.; Cleveland, M.
2017-12-01
Accurate hypocenter locations are essential for seismic event analysis. Single-event location estimation methods provide relatively imprecise results in remote regions with few nearby seismic stations. Previous work has demonstrated that improved relative epicentroid precision in oceanic environments is obtainable using surface-wave cross correlation measurements. We use intermediate-period regional and teleseismic Rayleigh and Love waves to estimate relative epicentroid locations of moderately-sized seismic events in regions around Iran. Variations in faulting geometry, depth, and intermediate-period dispersion make surface-wave based event relocation challenging across this broad continental region. We compare and integrate surface-wave based relative locations with InSAR centroid location estimates. However, mapping an earthquake sequence mainshock to an InSAR fault deformation model centroid is not always a simple process, since the InSAR observations are sensitive to post-seismic deformation. We explore these ideas using earthquake sequences in western Iran. We also apply surface-wave relocation to smaller magnitude earthquakes (3.5 < M < 5.0). Inclusion of smaller-magnitude seismic events in a relocation effort requires a shift in bandwidth to shorter periods, which increases the sensitivity of relocations to surface-wave dispersion. Frequency-domain inter-event phase observations are used to understand the time-domain cross-correlation information, and to choose the appropriate band for applications using shorter periods. Over short inter-event distances, the changing group velocity does not strongly degrade the relative locations. For small-magnitude seismic events in continental regions, surface-wave relocation does not appear simple enough to allow broad routine application, but using this method to analyze individual earthquake sequences can provide valuable insight into earthquake and faulting processes.
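The basic cross-correlation measurement behind such relative relocation can be sketched in the time domain. Real processing works on instrument-corrected, band-passed records, typically with frequency-domain methods and sub-sample precision; the waveform here is synthetic and the implementation is a deliberately naive sketch:

```python
import math

def xcorr_lag(a, b):
    """Lag (in samples) at which shifting b best aligns it with a."""
    n = len(a)
    best_lag, best_val = 0, float("-inf")
    for lag in range(-n + 1, n):
        val = sum(a[i] * b[i - lag] for i in range(n) if 0 <= i - lag < n)
        if val > best_val:
            best_val, best_lag = val, lag
    return best_lag

# Synthetic surface-wave packet, and a copy arriving 3 samples earlier
# (as if the second event were slightly closer to the station).
wave = [math.sin(2 * math.pi * i / 20) * math.exp(-(((i - 50) / 15) ** 2)) for i in range(100)]
shifted = wave[3:] + [0.0, 0.0, 0.0]

print(xcorr_lag(wave, shifted))  # → 3
```

Collecting such inter-event delays at many azimuths and inverting them for a relative epicentroid is the relocation step; the bandwidth shift discussed in the abstract matters because shorter-period delays are more sensitive to dispersion along the two paths.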
Liu, Ya-hua; Yang, Hui-ning; Liu, Hui-liang; Wang, Fan; Hu, Li-bin; Zheng, Jing-chen
2013-05-01
To summarize and analyze the medical mission of the China National Earthquake Disaster Emergency Search and Rescue Team (CNESAR) in the Lushan earthquake, and to promote the effectiveness of medical rescue integrated with search and rescue. Retrospective analysis of the medical work data collected by CNESAR from April 21 to April 27, 2013, during the Lushan earthquake rescue, including medical staff dispatch and the wounded cases treated. The medical corps was composed of 22 members: 2 administrators; 11 doctors covering 11 specialties [emergency medicine, orthopedics (joints and limbs, spinal), obstetrics and gynecology, gastroenterology, cardiology, ophthalmology, anesthesiology, medical rescue, health epidemic prevention, and clinical laboratory]; 1 ultrasound technician; 5 nurses; 1 pharmacist; 1 medical instrument engineer; and 1 office worker for publicity. Two members held psychological counseling qualifications. The medical work was carried out in seven areas: medical care for the CNESAR members themselves, first-aid cooperation with search and rescue on site, clinical work in the refugee camp, medical rounds for scattered villagers, evacuation of the wounded, mental-health intervention, and sanitary and anti-epidemic work. The medical work covered 24 small towns, and the medical staff established 3 clinics, at Taiping Town and Shuangshi Town of Lushan County and at Baoxing County. Medical rescue, mental-health intervention for the elderly and children, and sanitary and anti-epidemic work were performed at these sites. The medical corps successfully evacuated 2 severely wounded patients and treated over a thousand wounded. Most of the wounded had soft-tissue injuries, external injuries, respiratory tract infections, diarrhea, or heat stroke. Compared with the rescue action in the 2008 Wenchuan earthquake, the assembly and departure of the rescue team in the Lushan earthquake, the traffic control order in the disaster area, the self-aid and buddy aid
The common engine concept for ALS application - A cost reduction approach
NASA Technical Reports Server (NTRS)
Bair, E. K.; Schindler, C. M.
1989-01-01
Future launch systems require propulsion systems designed and developed to meet mission model needs while providing high reliability and cost effectiveness. Vehicle configurations that use different propellant combinations for the booster and core stages can benefit from a common engine approach, in which a single engine design can be configured to operate on either set of propellants and thus serve as either a booster or a core engine. Engine design concepts and mission applications for a vehicle employing a common engine are discussed. Engine program cost estimates were made, and the cost savings over the design and development of two unique engines were estimated.
Engineering aspects of seismological studies in Peru
Ocola, L.
1982-01-01
In retrospect, the Peruvian national long-range earthquake-study program began after the catastrophic earthquake of May 31, 1970. This earthquake triggered a large snow avalanche from Huascaran mountain, killing over 60,000 people and covering small cities and tens of villages with mud in the Andean valley of Callejon de Huaylas, Huaraz. Since then, great efforts have been made to learn about the natural seismic environment and its engineering and social aspects. The Organization of American States (OAS) has been one of the most important agencies in the development of the program.
Reduction of earthquake risk in the united states: Bridging the gap between research and practice
Hays, W.W.
1998-01-01
Continuing efforts under the auspices of the National Earthquake Hazards Reduction Program are under way to improve earthquake risk assessment and risk management in earthquake-prone regions of Alaska, California, Nevada, Washington, Oregon, Arizona, Utah, Wyoming, and Idaho, the New Madrid and Wabash Valley seismic zones in the central United States, the southeastern and northeastern United States, Puerto Rico, Virgin Islands, Guam, and Hawaii. Geologists, geophysicists, seismologists, architects, engineers, urban planners, emergency managers, health care specialists, and policymakers are having to work at the margins of their disciplines to bridge the gap between research and practice and to provide a social, technical, administrative, political, legal, and economic basis for changing public policies and professional practices in communities where the earthquake risk is unacceptable. © 1998 IEEE.
Making the Handoff from Earthquake Hazard Assessments to Effective Mitigation Measures (Invited)
NASA Astrophysics Data System (ADS)
Applegate, D.
2010-12-01
This year has witnessed a barrage of large earthquakes worldwide with the resulting damages ranging from inconsequential to truly catastrophic. We cannot predict when earthquakes will strike, but we can build communities that are resilient to strong shaking as well as to secondary hazards such as landslides and liquefaction. The contrasting impacts of the magnitude-7 earthquake that struck Haiti in January and the magnitude-8.8 event that struck Chile in April underscore the difference that mitigation and preparedness can make. In both cases, millions of people were exposed to severe shaking, but deaths in Chile were measured in the hundreds rather than the hundreds of thousands that perished in Haiti. Numerous factors contributed to these disparate outcomes, but the most significant is the presence of strong building codes in Chile and their total absence in Haiti. The financial cost of the Chilean earthquake still represents an unacceptably high percentage of that nation’s gross domestic product, a reminder that life safety is the paramount, but not the only, goal of disaster risk reduction measures. For building codes to be effective, both in terms of lives saved and economic cost, they need to reflect the hazard as accurately as possible. As one of four federal agencies that make up the congressionally mandated National Earthquake Hazards Reduction Program (NEHRP), the U.S. Geological Survey (USGS) develops national seismic hazard maps that form the basis for seismic provisions in model building codes through the Federal Emergency Management Agency and private-sector practitioners. This cooperation is central to NEHRP, which both fosters earthquake research and establishes pathways to translate research results into implementation measures. That translation depends on the ability of hazard-focused scientists to interact and develop mutual trust with risk-focused engineers and planners. Strengthening that interaction is an opportunity for the next generation
Earthquake and ambient vibration monitoring of the steel-frame UCLA factor building
Kohler, M.D.; Davis, P.M.; Safak, E.
2005-01-01
Dynamic property measurements of the moment-resisting steel-frame University of California, Los Angeles, Factor building are being made to assess how forces are distributed over the building. Fourier amplitude spectra have been calculated from several intervals of ambient vibrations, a 24-hour period of strong winds, and from the 28 March 2003 Encino, California (ML = 2.9), the 3 September 2002 Yorba Linda, California (ML = 4.7), and the 3 November 2002 Central Alaska (Mw = 7.9) earthquakes. Measurements made from the ambient vibration records show that the first-mode frequency of horizontal vibration is between 0.55 and 0.6 Hz. The second horizontal mode has a frequency between 1.6 and 1.9 Hz. In contrast, the first-mode frequencies measured from earthquake data are about 0.05 to 0.1 Hz lower than those corresponding to ambient vibration recordings, indicating softening of the soil-structure system as amplitudes become larger. The frequencies revert to pre-earthquake levels within five minutes of the Yorba Linda earthquake. Shaking due to strong winds that occurred during the Encino earthquake dominates the frequency decrease, which correlates in time with the duration of the strong winds. The first shear wave recorded from the Encino and Yorba Linda earthquakes takes about 0.4 sec to travel up the 17-story building. © 2005, Earthquake Engineering Research Institute.
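Reading a first-mode frequency off a Fourier amplitude spectrum, as described above, can be illustrated with a naive DFT on a synthetic two-mode record. The record below is invented, with frequencies merely chosen to mimic the reported first and second modes; real analyses use windowed FFTs of the actual sensor channels:

```python
import cmath, math

def dominant_frequency(x, dt):
    """Frequency (Hz) with the largest Fourier amplitude (naive DFT, DC excluded)."""
    n = len(x)
    best_k, best_amp = 1, 0.0
    for k in range(1, n // 2):  # positive frequencies only
        coeff = sum(x[i] * cmath.exp(-2j * math.pi * k * i / n) for i in range(n))
        if abs(coeff) > best_amp:
            best_amp, best_k = abs(coeff), k
    return best_k / (n * dt)

# Synthetic 20-s roof record sampled at 20 Hz: a 0.6 Hz first mode
# plus a weaker 1.8 Hz second mode.
dt = 0.05
record = [math.sin(2 * math.pi * 0.6 * i * dt) + 0.3 * math.sin(2 * math.pi * 1.8 * i * dt)
          for i in range(400)]

print(dominant_frequency(record, dt))  # → 0.6
```

Tracking how this spectral peak shifts between ambient, wind, and earthquake windows is exactly the comparison the study uses to detect soil-structure softening.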
Engineered Proteins: Redox Properties and Their Applications
Prabhulkar, Shradha; Tian, Hui; Wang, Xiaotang; Zhu, Jun-Jie
2012-01-01
Oxidoreductases and metalloproteins, representing more than one third of all known proteins, serve as significant catalysts for numerous biological processes that involve electron transfers such as photosynthesis, respiration, metabolism, and molecular signaling. The functional properties of the oxidoreductases/metalloproteins are determined by the nature of their redox centers. Protein engineering is a powerful approach that is used to incorporate biological and abiological redox cofactors as well as novel enzymes and redox proteins with predictable structures and desirable functions for important biological and chemical applications. The methods of protein engineering, mainly rational design, directed evolution, protein surface modifications, and domain shuffling, have allowed the creation and study of a number of redox proteins. This review presents a selection of engineered redox proteins achieved through these methods, resulting in a manipulation in redox potentials, an increase in electron-transfer efficiency, and an expansion of native proteins by de novo design. Such engineered/modified redox proteins with desired properties have led to a broad spectrum of practical applications, ranging from biosensors, biofuel cells, to pharmaceuticals and hybrid catalysis. Glucose biosensors are one of the most successful products in enzyme electrochemistry, with reconstituted glucose oxidase achieving effective electrical communication with the sensor electrode; direct electron-transfer-type biofuel cells are developed to avoid thermodynamic loss and mediator leakage; and fusion proteins of P450s and redox partners make the biocatalytic generation of drug metabolites possible. In summary, this review includes the properties and applications of the engineered redox proteins as well as their significance and great potential in the exploration of bioelectrochemical sensing devices. Antioxid. Redox Signal. 17, 1796–1822. PMID:22435347
Response and recovery lessons from the 2010-2011 earthquake sequence in Canterbury, New Zealand
Pierepiekarz, Mark; Johnston, David; Berryman, Kelvin; Hare, John; Gomberg, Joan S.; Williams, Robert A.; Weaver, Craig S.
2014-01-01
The impacts and opportunities that result when low-probability moderate earthquakes strike an urban area similar to many throughout the US were vividly conveyed in a one-day workshop in which social and Earth scientists, public officials, engineers, and an emergency manager shared their experiences of the earthquake sequence that struck the city of Christchurch and surrounding Canterbury region of New Zealand in 2010-2011. Without question, the earthquake sequence has had unprecedented impacts in all spheres on New Zealand society, locally to nationally--10% of the country's population was directly impacted and losses total 8-10% of their GDP. The following paragraphs present a few lessons from Christchurch.
A Virtual Tour of the 1868 Hayward Earthquake in Google Earth™
NASA Astrophysics Data System (ADS)
Lackey, H. G.; Blair, J. L.; Boatwright, J.; Brocher, T.
2007-12-01
The 1868 Hayward earthquake has been overshadowed by the subsequent 1906 San Francisco earthquake that destroyed much of San Francisco. Nonetheless, a modern recurrence of the 1868 earthquake would cause widespread damage to the densely populated Bay Area, particularly in the east Bay communities that have grown up virtually on top of the Hayward fault. Our concern is heightened by paleoseismic studies suggesting that the recurrence interval for the past five earthquakes on the southern Hayward fault is 140 to 170 years. Our objective is to build an educational web site that illustrates the cause and effect of the 1868 earthquake drawing on scientific and historic information. We will use Google Earth™ software to visually illustrate complex scientific concepts in a way that is understandable to a non-scientific audience. This web site will lead the viewer from a regional summary of the plate tectonics and faulting system of western North America, to more specific information about the 1868 Hayward earthquake itself. Text and Google Earth™ layers will include modeled shaking of the earthquake, relocations of historic photographs, reconstruction of damaged buildings as 3-D models, and additional scientific data that may come from the many scientific studies conducted for the 140th anniversary of the event. Earthquake engineering concerns will be stressed, including population density, vulnerable infrastructure, and lifelines. We will also present detailed maps of the Hayward fault, measurements of fault creep, and geologic evidence of its recurrence. Understanding the science behind earthquake hazards is an important step in preparing for the next significant earthquake. We hope to communicate to the public and students of all ages, through visualizations, not only the cause and effect of the 1868 earthquake, but also modern seismic hazards of the San Francisco Bay region.
Turkish Compulsory Earthquake Insurance and "Istanbul Earthquake
NASA Astrophysics Data System (ADS)
Durukal, E.; Sesetyan, K.; Erdik, M.
2009-04-01
The city of Istanbul will likely experience substantial direct and indirect losses as a result of a future large (M=7+) earthquake with an annual probability of occurrence of about 2%. This paper dwells on the expected building losses in terms of probable maximum and average annualized losses and discusses the results from the perspective of the compulsory earthquake insurance scheme operational in the country. The Turkish Catastrophe Insurance Pool (TCIP) system is essentially designed to operate in Turkey with sufficient penetration to enable the accumulation of funds in the pool. Today, with only 20% national penetration, and approximately one-half of all policies in highly earthquake-prone areas (one-third in Istanbul), the system exhibits signs of adverse selection, an inadequate premium structure, and insufficient funding. Our findings indicate that the national compulsory earthquake insurance pool in Turkey will face difficulties in covering incurred building losses in Istanbul in the event of a large earthquake. The annualized earthquake losses in Istanbul are between 140-300 million. Even if we assume that the deductible is raised to 15%, the earthquake losses that need to be paid after a large earthquake in Istanbul will be about 2.5 billion, somewhat above the current capacity of the TCIP. Thus, a modification to the system for the insured in Istanbul (or the Marmara region) is necessary. This may mean an increase in the premium and deductible rates, purchase of larger re-insurance covers, and development of a claim processing system. Also, to avoid adverse selection, the penetration rates elsewhere in Turkey need to be increased substantially. A better model would be the introduction of parametric insurance for Istanbul. Under such a model, losses would not be indemnified but would instead be calculated directly on the basis of indexed ground motion levels and damages. The immediate improvement of a parametric insurance model over the existing one will be the elimination of the claim processing
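The parametric scheme proposed at the end of the abstract, with payouts indexed to measured ground motion rather than adjusted claims, can be sketched in a few lines. The PGA bands and payout fractions below are illustrative assumptions only, not actual TCIP terms:

```python
# Toy parametric payout indexed to peak ground acceleration (PGA).
# Band thresholds and payout fractions are invented for illustration.
BANDS = [(0.4, 1.0), (0.3, 0.6), (0.2, 0.3)]  # (PGA in g, fraction of sum insured)

def parametric_payout(pga_g, sum_insured):
    """Return the payout for a measured PGA: no claim adjustment is needed."""
    for threshold, fraction in BANDS:  # bands ordered from most to least severe
        if pga_g >= threshold:
            return fraction * sum_insured
    return 0.0

print(parametric_payout(0.35, 100000))  # 60000.0
print(parametric_payout(0.10, 100000))  # 0.0
```

Because the trigger is an observed ground-motion index, payouts can be computed immediately after an event, which is precisely the claim-processing advantage the abstract points to.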
GEM - The Global Earthquake Model
NASA Astrophysics Data System (ADS)
Smolka, A.
2009-04-01
Over 500,000 people died in the last decade due to earthquakes and tsunamis, mostly in the developing world, where the risk is increasing due to rapid population growth. In many seismic regions, no hazard and risk models exist, and even where models do exist, they are intelligible only to experts, or available only for commercial purposes. The Global Earthquake Model (GEM) answers the need for an openly accessible risk management tool. GEM is an internationally sanctioned public-private partnership initiated by the Organisation for Economic Co-operation and Development (OECD) which will establish an authoritative standard for calculating and communicating earthquake hazard and risk, and will be designed to serve as the critical instrument to support decisions and actions that reduce earthquake losses worldwide. GEM will integrate developments at the forefront of scientific and engineering knowledge of earthquakes, at global, regional, and local scales. The work is organized in three modules: hazard, risk, and socio-economic impact. The hazard module calculates probabilities of earthquake occurrence and resulting shaking at any given location. The risk module calculates fatalities, injuries, and damage based on expected shaking, building vulnerability, and the distribution of population and of exposed values and facilities. The socio-economic impact module delivers tools for making educated decisions to mitigate and manage risk. GEM will be a versatile online tool, with open source code and a map-based graphical interface. The underlying data will be open wherever possible, and its modular input and output will be adapted to multiple user groups: scientists and engineers, risk managers and decision makers in the public and private sectors, and the public-at-large. GEM will be the first global model for seismic risk assessment at a national and regional scale, and aims to achieve broad scientific participation and independence. Its development will occur in a
Earthquake potential revealed by tidal influence on earthquake size-frequency statistics
NASA Astrophysics Data System (ADS)
Ide, Satoshi; Yabe, Suguru; Tanaka, Yoshiyuki
2016-11-01
The possibility that tidal stress can trigger earthquakes has long been debated. In particular, a clear causal relationship between small earthquakes and the phase of tidal stress is elusive. However, tectonic tremors deep within subduction zones are highly sensitive to tidal stress levels, with tremor rates increasing exponentially with rising tidal stress. Thus, slow deformation and the possibility of earthquakes at subduction plate boundaries may be enhanced during periods of large tidal stress. Here we calculate the tidal stress history, and specifically the amplitude of tidal stress, on a fault plane in the two weeks before large earthquakes globally, based on data from the global, Japanese, and Californian earthquake catalogues. We find that very large earthquakes, including the 2004 Sumatra earthquake, the 2010 Maule earthquake in Chile, and the 2011 Tohoku-Oki earthquake in Japan, tend to occur near the time of maximum tidal stress amplitude. This tendency is not obvious for small earthquakes. However, we also find that the fraction of large earthquakes increases (the b-value of the Gutenberg-Richter relation decreases) as the amplitude of tidal shear stress increases. This is consistent with the well-known relationship between stress and the b-value, and suggests that the probability of a tiny rock failure expanding into a gigantic rupture increases with increasing tidal stress levels. We conclude that large earthquakes are more probable during periods of high tidal stress.
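The b-value of the Gutenberg-Richter relation, central to the argument above, is commonly estimated with Aki's maximum-likelihood formula, b = log10(e) / (mean(M) - Mc). A minimal sketch follows; the synthetic catalogue and completeness magnitude are illustrative assumptions, not the authors' data:

```python
import math
import random

def b_value_mle(mags, mc):
    """Aki's maximum-likelihood b-value for magnitudes at or above the
    completeness magnitude mc (continuous, unbinned magnitudes assumed)."""
    above = [m for m in mags if m >= mc]
    return math.log10(math.e) / (sum(above) / len(above) - mc)

# Synthetic catalogue drawn from a Gutenberg-Richter law with b = 1:
# M - Mc is exponential with rate b*ln(10), i.e. M = Mc - log10(U).
random.seed(0)
mags = [2.0 - math.log10(random.random()) for _ in range(20000)]
print(b_value_mle(mags, 2.0))  # close to 1.0
```

A decrease of this estimate with rising tidal shear stress is the signature the study reports; a half-bin correction to mc would be added for real, binned catalogue magnitudes.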
NASA Astrophysics Data System (ADS)
Lapusta, N.
2011-12-01
Studying earthquake source processes is a multidisciplinary endeavor involving a number of subjects, from geophysics to engineering. As a solid mechanician interested in understanding earthquakes through physics-based computational modeling and comparison with observations, I need to educate and attract students from diverse areas. My CAREER award has provided the crucial support for the initiation of this effort. Applying for the award made me go through careful initial planning in consultation with my colleagues and administration from two divisions, an important component of the eventual success of my path to tenure. Then, the long-term support directed at my program as a whole, and not a specific year-long task or subject area, allowed for the flexibility required for the start-up of a multidisciplinary undertaking. My research is directed towards formulating realistic fault models that incorporate state-of-the-art experimental studies, field observations, and analytical models. The goal is to compare the model response, in terms of long-term fault behavior that includes both sequences of simulated earthquakes and aseismic phenomena, with observations, to identify appropriate constitutive laws and parameter ranges. CAREER funding has enabled my group to develop a sophisticated 3D modeling approach that we have used to understand patterns of seismic and aseismic fault slip on the Sunda megathrust in Sumatra, investigate the effect of variable hydraulic properties on fault behavior, with application to the Chi-Chi and Tohoku earthquakes, create a model of the Parkfield segment of the San Andreas fault that reproduces both long-term and short-term features of the M6 earthquake sequence there, and design experiments with laboratory earthquakes, among several other studies. A critical ingredient in this research program has been the fully integrated educational component that allowed me, on the one hand, to expose students from different backgrounds to the
Using remote sensing to predict earthquake impacts
NASA Astrophysics Data System (ADS)
Fylaktos, Asimakis; Yfantidou, Anastasia
2017-09-01
Natural hazards like earthquakes can result in enormous property damage and human casualties in mountainous areas. Italy has always been exposed to numerous earthquakes, mostly concentrated in its central and southern regions. Last year, two seismic events occurred near Norcia (central Italy), leading to substantial loss of life and extensive damage to property, infrastructure, and cultural heritage. This research utilizes remote sensing products and GIS software to provide a database of information. We used both SAR images from Sentinel-1A and optical imagery from Landsat 8 to examine differences in topography with the aid of the multi-temporal monitoring technique. This technique is well suited to observing surface deformation. The database is a cluster of information on the consequences of the earthquakes, grouped into categories such as property and infrastructure damage, regional rifts, cultivation loss, landslides, and surface deformation, all mapped in GIS software. Relevant organizations can use these data to calculate the financial impact of such earthquakes. In the future, we can enrich this database with more regions and enhance the variety of its applications. For instance, we could predict the future impacts of any type of earthquake in several areas and design a preliminary emergency model for immediate evacuation and quick recovery response. It is important to know how the surface moves, particularly in geographical regions like Italy, Cyprus, and Greece, where earthquakes are so frequent. We are not able to predict earthquakes, but using data from this research, we may assess the damage that could be caused in the future.
An earthquake strength scale for the media and the public
Johnston, A.C.
1990-01-01
A local engineer, E.P. Hailey, pointed this problem out to me shortly after the Loma Prieta earthquake. He felt that three problems limited the usefulness of magnitude in describing an earthquake to the public: (1) most people don't understand that it is not a linear scale; (2) of those who do realize the scale is not linear, very few understand the difference of a factor of ten in ground motion and 32 in energy release between points on the scale; and (3) even those who understand the first two points have trouble putting a given magnitude value into terms they can relate to. In summary, Mr. Hailey wondered why seismologists can't come up with an earthquake scale that doesn't confuse everyone and that conveys a sense of true relative size. Here, then, is my attempt to construct such a scale.
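Hailey's second point, the factor of ten in ground motion and roughly 32 in energy per magnitude unit, follows directly from the scale's logarithmic definitions (recorded amplitude scaling as 10^M and radiated energy as 10^1.5M). A small sketch of the arithmetic:

```python
def ground_motion_ratio(delta_m):
    # Recorded amplitude scales as 10**M, so one magnitude unit = 10x motion
    return 10 ** delta_m

def energy_ratio(delta_m):
    # Radiated energy scales as 10**(1.5*M), so one unit = ~32x energy
    return 10 ** (1.5 * delta_m)

print(ground_motion_ratio(1.0))  # 10.0
print(energy_ratio(1.0))         # ~31.6
```

So the difference between, say, M5 and M7 is a factor of 100 in ground motion but about 1000 in energy, which is exactly the non-linearity the public finds hard to grasp.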
Geotechnical aspects of the January 2003 Tecomán, Mexico, earthquake
Wartman, Joseph; Rodriguez-Marek, Adrian; Macari, Emir J.; Deaton, Scott; Ramirez-Reynaga, Martín; Ochoa, Carlos N.; Callan, Sean; Keefer, David; Repetto, Pedro; Ovando-Shelley, Efraín
2005-01-01
Ground failure was the most prominent geotechnical engineering feature of the 21 January 2003 Mw 7.6 Tecomán earthquake. Ground failure impacted structures, industrial facilities, roads, water supply canals, and other critical infrastructure in the state of Colima and in parts of the neighboring states of Jalisco and Michoacán. Landslides and soil liquefaction were the most common types of ground failure, followed by seismic compression of unsaturated materials. Reinforced earth structures generally performed well during the earthquake, though some structures experienced permanent lateral deformations of up to 10 cm. Different ground improvement techniques had been used to enhance the liquefaction resistance of several sites in the region, all of which performed well and exhibited no signs of damage or significant ground deformation. Earth dams in the region experienced some degree of permanent deformation but remained fully functional after the earthquake.
NASA Astrophysics Data System (ADS)
Lu, Kunquan; Hou, Meiying; Jiang, Zehui; Wang, Qiang; Sun, Gang; Liu, Jixing
2018-03-01
We treat the Earth's crust and mantle as large-scale discrete matter based on the principles of granular physics and existing experimental observations. The main outcomes are: a granular model of the structure and movement of the Earth's crust and mantle is established; the formation mechanism of the tectonic forces that cause earthquakes and a model for the propagation of precursory information are proposed; properties of seismic precursory information and its relevance to earthquake occurrence are illustrated, and principles for detecting effective seismic precursors are elaborated; and the mechanism of deep-focus earthquakes is explained by the jamming-unjamming transition of granular flow. Some earthquake phenomena that were previously difficult to understand are explained, and the predictability of earthquakes is discussed. Due to the discrete nature of the Earth's crust and mantle, continuum theory no longer applies during the quasi-static seismological process. In this paper, based on the principles of granular physics, we study the causes of earthquakes, earthquake precursors, and prediction, and obtain a new understanding that differs from the traditional seismological viewpoint.
Turbocharging of Small Internal Combustion Engines as a Means of Improving Engine/Application System Fuel Economy
1979-01-01
Final report for contract DAAK70-78-C-0031, prepared by Aerodyne Dallas.
Thermal and Environmental Barrier Coatings for Advanced Turbine Engine Applications
NASA Technical Reports Server (NTRS)
Zhu, Dong-Ming; Miller, Robert A.
2005-01-01
Ceramic thermal and environmental barrier coatings (T/EBCs) will play a crucial role in advanced gas turbine engine systems because of their ability to significantly increase engine operating temperatures and reduce cooling requirements, thus helping to achieve engine low-emission and high-efficiency goals. Advanced T/EBCs are being developed for low-emission SiC/SiC ceramic matrix composite (CMC) combustor applications by extending the CMC liner and vane temperature capability to 1650 C (3000 F) in oxidizing and water-vapor-containing combustion environments. Low-conductivity thermal barrier coatings (TBCs) are also being developed for metallic turbine airfoil and combustor applications, providing component temperature capability up to 1650 C (3000 F). In this paper, ceramic coating development considerations and requirements for both the ceramic and metallic components will be described for engine high-temperature and high-heat-flux applications. The underlying coating failure mechanisms and life prediction approaches will be discussed based on the simulated engine tests and fracture mechanics modeling results.
Hotspots, Lifelines, and the SAFRR Haywired Earthquake Sequence
NASA Astrophysics Data System (ADS)
Ratliff, J. L.; Porter, K.
2014-12-01
Though California has experienced many large earthquakes (San Francisco, 1906; Loma Prieta, 1989; Northridge, 1994), the San Francisco Bay Area has not had a damaging earthquake for 25 years. Earthquake risk and surging reliance on smartphones and the Internet to handle everyday tasks raise the question: is an increasingly technology-reliant Bay Area prepared for potential infrastructure impacts caused by a major earthquake? How will a major earthquake on the Hayward Fault affect lifelines (roads, power, water, communication, etc.)? The U.S. Geological Survey Science Application for Risk Reduction (SAFRR) program's Haywired disaster scenario, a hypothetical two-year earthquake sequence triggered by a M7.05 mainshock on the Hayward Fault, addresses these and other questions. We explore four geographic aspects of lifeline damage from earthquakes: (1) geographic lifeline concentrations, (2) areas where lifelines pass through high shaking or potential ground-failure zones, (3) areas with diminished lifeline service demand due to severe building damage, and (4) areas with increased lifeline service demand due to displaced residents and businesses. Potential mainshock lifeline vulnerability and spatial demand changes will be discerned by superimposing earthquake shaking, liquefaction probability, and landslide probability damage thresholds with lifeline concentrations and with large-capacity shelters. Intersecting high hazard levels and lifeline clusters represent potential lifeline susceptibility hotspots. We will also analyze possible temporal vulnerability and demand changes using an aftershock shaking threshold. The results of this analysis will inform regional lifeline resilience initiatives and response and recovery planning, as well as reveal potential redundancies and weaknesses for Bay Area lifelines. Identified spatial and temporal hotspots can provide stakeholders with a reference for possible systemic vulnerability resulting from an earthquake sequence.
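The hotspot-identification step described above, intersecting high hazard levels with lifeline concentrations, amounts to a boolean overlay of gridded layers. A toy sketch follows; the grids and thresholds are invented for illustration and are not SAFRR scenario values:

```python
import numpy as np

# Toy 3x3 grids over the same study area (all values illustrative):
mmi = np.array([[6, 8, 9],
                [7, 9, 8],
                [5, 6, 7]])          # shaking intensity (MMI)
liq = np.array([[0.1, 0.4, 0.2],
                [0.0, 0.3, 0.1],
                [0.2, 0.1, 0.0]])    # liquefaction probability
lifelines = np.array([[0, 3, 1],
                      [2, 4, 0],
                      [1, 0, 2]])    # lifeline segments per cell

# A cell is high-hazard if shaking or ground-failure potential is elevated;
# a hotspot is a high-hazard cell that also concentrates lifelines.
high_hazard = (mmi >= 8) | (liq >= 0.3)
hotspots = high_hazard & (lifelines >= 2)
print(int(hotspots.sum()))  # number of hotspot cells: 2
```

The same overlay logic extends to aftershock shaking thresholds for the temporal analysis, and to shelter locations for the demand-change maps.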
Redefining Earthquakes and the Earthquake Machine
ERIC Educational Resources Information Center
Hubenthal, Michael; Braile, Larry; Taber, John
2008-01-01
The Earthquake Machine (EML), a mechanical model of stick-slip fault systems, can increase student engagement and facilitate opportunities to participate in the scientific process. This article introduces the EML model and an activity that challenges ninth-grade students' misconceptions about earthquakes. The activity emphasizes the role of models…
Brush Plating of Nickel-Tungsten Alloy for Engineering Application
2012-08-01
Presented at ASETS Defense 2012 by Zhimin Zhong and Sid Clouser. Reported surface morphology: a smooth, fine-grained, micro-cracked surface, assessed by visual appearance and by scanning electron and optical microscope images.
A Collaborative Cyberinfrastructure for Earthquake Seismology
NASA Astrophysics Data System (ADS)
Bossu, R.; Roussel, F.; Mazet-Roux, G.; Lefebvre, S.; Steed, R.
2013-12-01
One of the challenges in real-time seismology is the prediction of an earthquake's impact. This is particularly true for moderate earthquakes (around magnitude 6) located close to urbanised areas, where the slightest uncertainty in event location, depth, or magnitude estimates, and/or misevaluation of propagation characteristics, site effects, and building vulnerability, can dramatically change the impact scenario. The Euro-Med Seismological Centre (EMSC) has developed a cyberinfrastructure to collect observations from eyewitnesses in order to provide in-situ constraints on actual damage. This cyberinfrastructure takes advantage of the natural convergence of earthquake eyewitnesses on the EMSC website (www.emsc-csem.org), the second most visited global earthquake information website, within tens of seconds of the occurrence of a felt event. It includes classical crowdsourcing tools, such as online questionnaires available in 39 languages, and tools to collect geolocated pictures. It also comprises information derived from real-time analysis of the traffic on the EMSC website, a method named flashsourcing: in the case of a felt earthquake, eyewitnesses reach the EMSC website within tens of seconds to find out the cause of the shaking they have just been through. By analysing their geographical origin through their IP addresses, we automatically detect felt earthquakes and, in some cases, map the damaged areas through the loss of Internet visitors. We recently implemented a Quake Catcher Network (QCN) server in collaboration with Stanford University and the USGS to collect ground motion records made by volunteers, and we are also involved in a project to detect earthquakes from ground motion sensors in smartphones. Strategies have been developed for several social media (Facebook, Twitter...), not only to distribute earthquake information but also to engage with citizens and optimise data collection. A smartphone application is currently under development. We will present an overview of this
NASA Astrophysics Data System (ADS)
Van Daele, Maarten; Araya-Cornejo, Cristian; Pille, Thomas; Meyer, Inka; Kempf, Philipp; Moernaut, Jasper; Cisternas, Marco
2017-04-01
triggered by megathrust earthquakes. These findings are an important step forward in the interpretation of lacustrine turbidites in subduction settings, and will eventually improve hazard assessments based on such paleoseismic records in the study area, and in other subduction zones. References Howarth et al., 2014. Lake sediments record high intensity shaking that provides insight into the location and rupture length of large earthquakes on the Alpine Fault, New Zealand. Earth and Planetary Science Letters 403, 340-351. Lomnitz, 1960. A study of the Maipo Valley earthquakes of September 4, 1958, Second World Conference on Earthquake Engineering, Tokyo and Kyoto, Japan, pp. 501-520. Sepulveda et al., 2008. New Findings on the 1958 Las Melosas Earthquake Sequence, Central Chile: Implications for Seismic Hazard Related to Shallow Crustal Earthquakes in Subduction Zones. Journal of Earthquake Engineering 12, 432-455. Van Daele et al., 2015. A comparison of the sedimentary records of the 1960 and 2010 great Chilean earthquakes in 17 lakes: Implications for quantitative lacustrine palaeoseismology. Sedimentology 62, 1466-1496.
USGS Training in Afghanistan: Modern Earthquake Hazards Assessments
NASA Astrophysics Data System (ADS)
Medlin, J. D.; Garthwaite, M.; Holzer, T.; McGarr, A.; Bohannon, R.; Bergen, K.; Vincent, T.
2007-05-01
Afghanistan is located in a tectonically active region where ongoing deformation has generated rugged mountainous terrain and where large earthquakes occur frequently. These earthquakes can present a significant hazard, not only from strong ground shaking but also from liquefaction and extensive landsliding. The magnitude 6.1 earthquake of March 25, 2002 highlighted the vulnerability of Afghanistan to such hazards and resulted in over 1000 fatalities. The USGS has provided the first of a series of Earth Science training courses to the Afghan Geological Survey (AGS). This course was concerned with modern earthquake hazard assessments and is an integral part of a larger USGS effort to provide a comprehensive seismic-hazard assessment for Afghanistan. Funding for these courses is provided by the US Agency for International Development Afghanistan Reconstruction Program. The particular focus of this training course, held December 2-6, 2006, in Kabul, was on providing a background in the seismological and geological methods relevant to preparing for future earthquakes. Topics included identifying active faults, modern tectonic theory, geotechnical measurements of near-surface materials, and strong-motion seismology. With this background, participants may now be expected to educate other members of the community and be actively involved in earthquake hazard assessments themselves. The December 2006 training course was taught by four lecturers, with all lectures and slides presented in English and translated into Dari. Copies of the lectures were provided to the students in both hardcopy and digital formats. Class participants included many of the section leaders from within the AGS, who have backgrounds in geology, geophysics, and engineering. Two additional training sessions are planned for 2007, the first entitled "Modern Concepts in Geology and Mineral Resource Assessments," and the second entitled "Applied Geophysics for Mineral Resource Assessments."
Quantification of social contributions to earthquake mortality
NASA Astrophysics Data System (ADS)
Main, I. G.; NicBhloscaidh, M.; McCloskey, J.; Pelling, M.; Naylor, M.
2013-12-01
Earthquake death tolls, which continue to grow rapidly, are the result of complex interactions between physical effects, such as strong shaking, and the resilience of exposed populations and supporting critical infrastructures and institutions. While it is clear that the social context in which an earthquake occurs has a strong effect on the outcome, the influence of this context can only be exposed if we first decouple, as much as we can, the physical causes of mortality from our consideration. (Our modelling assumes that building resilience to shaking is a social factor governed by national wealth, legislation and enforcement, and governance leading to reduced levels of corruption.) Here we attempt to remove these causes by statistically modelling published mortality, shaking intensity, and population exposure data; unexplained variance from this physical model illuminates the contribution of socio-economic factors to increasing earthquake mortality. We find that this variance partitions countries in terms of basic socio-economic measures and allows the definition of a national vulnerability index identifying both anomalously resilient and anomalously vulnerable countries. In many cases resilience is well correlated with GDP; people in the richest countries are, unsurprisingly, safe from even the worst shaking. However, some low-GDP countries rival even the richest in resilience, showing that relatively low-cost interventions can have a positive impact on earthquake resilience and that social learning between these countries might facilitate resilience building in the absence of expensive engineering interventions.
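The decoupling step can be illustrated with an ordinary least-squares fit of mortality against physical predictors; the residuals then play the role of the socio-economic vulnerability signal. The records below are invented for illustration, and the two-predictor form is an assumption, not the authors' actual model:

```python
import numpy as np

# Hypothetical per-event records: (log10 exposed population, mean shaking
# intensity MMI, log10 deaths). Values are illustrative only.
events = np.array([
    [5.2, 7.0, 2.1],
    [6.0, 8.5, 3.9],
    [4.8, 6.0, 1.0],
    [5.5, 7.5, 2.6],
    [6.3, 9.0, 4.8],
    [5.0, 6.5, 1.2],
])

# Fit log10(deaths) ~ intercept + exposure + intensity by least squares.
X = np.column_stack([np.ones(len(events)), events[:, 0], events[:, 1]])
y = events[:, 2]
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

# Residuals: mortality unexplained by exposure and shaking, read here as a
# (toy) per-event vulnerability index; positive = more deaths than expected.
residuals = y - X @ coef
```

Aggregating such residuals by country, rather than by event, is what would yield the national vulnerability index the abstract describes.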
OMG Earthquake! Can Twitter improve earthquake response?
NASA Astrophysics Data System (ADS)
Earle, P. S.; Guy, M.; Ostrum, C.; Horvath, S.; Buckmaster, R. A.
2009-12-01
The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment its earthquake response products and the delivery of hazard information. The goal is to gather near real-time, earthquake-related messages (tweets) and provide geo-located earthquake detections and rough maps of the corresponding felt areas. Twitter and other social Internet technologies are providing the general public with anecdotal earthquake hazard information before scientific information has been published from authoritative sources. People local to an event often publish information within seconds via these technologies. In contrast, depending on the location of the earthquake, scientific alerts take between 2 and 20 minutes. Examining the tweets following the March 30, 2009, M4.3 Morgan Hill earthquake shows it is possible (in some cases) to rapidly detect and map the felt area of an earthquake using Twitter responses. Within a minute of the earthquake, the frequency of “earthquake” tweets rose above the background level of less than 1 per hour to about 150 per minute. Using the tweets submitted in the first minute, a rough map of the felt area can be obtained by plotting the tweet locations. Mapping the tweets from the first six minutes shows observations extending from Monterey to Sacramento, similar to the perceived shaking region mapped by the USGS “Did You Feel It” system. The tweets submitted after the earthquake also provided (very) short first-impression narratives from people who experienced the shaking. Accurately assessing the potential and robustness of a Twitter-based system is difficult because only tweets spanning the previous seven days can be searched, making a historical study impossible. We have, however, been archiving tweets for several months, and it is clear that significant limitations do exist. The main drawback is the lack of quantitative information
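A detector of the kind described, flagging when keyword-tweet frequency rises well above background, can be sketched with a sliding time window. The window length and threshold below are illustrative guesses, not USGS settings:

```python
from collections import deque

def detect(timestamps, window=60.0, threshold=20):
    """Return the time at which the count of keyword tweets within `window`
    seconds first reaches `threshold`, or None if it never does."""
    recent = deque()
    for t in timestamps:        # timestamps in seconds, ascending
        recent.append(t)
        while recent and recent[0] < t - window:
            recent.popleft()    # drop tweets older than the window
        if len(recent) >= threshold:
            return t            # detection time
    return None

# Background of ~1 tweet/hour, then a Morgan-Hill-like burst of 150 in a minute
quiet = [i * 3600.0 for i in range(5)]
burst = [18000.0 + i * 0.4 for i in range(150)]
print(detect(quiet + burst))    # fires a few seconds into the burst
print(detect(quiet))            # None: background never trips the threshold
```

The geo-located tweets accumulated by the time of detection are then what would be plotted to sketch the felt area.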
NASA Astrophysics Data System (ADS)
Nealy, J. L.; Benz, H.; Hayes, G. P.; Bergman, E.; Barnhart, W. D.
2016-12-01
On February 21, 2008, at 14:16:02 (UTC), Wells, Nevada, experienced a Mw 6.0 earthquake, the largest earthquake in the state within the past 50 years. Here, we re-analyze in detail the spatiotemporal variations of the foreshock and aftershock sequence and compare the distribution of seismicity to a recent slip model based on inversion of InSAR observations. A catalog of earthquakes for the time period of February 1, 2008 through August 31, 2008 was derived from a combination of arrival-time picks using a kurtosis detector (primarily P arrival times) and a subspace detector (primarily S arrival times), association of the combined pick dataset, and application of multiple-event relocation techniques using the 19 closest USArray Transportable Array stations, permanent regional seismic monitoring stations in Nevada and Utah, and temporary stations deployed for an aftershock study. We were able to detect several thousand earthquakes in the months following the mainshock, as well as several foreshocks in the days leading up to the event. We reviewed the picks for the largest 986 earthquakes and relocated them using the Hypocentroidal Decomposition (HD) method. The HD technique provides both relative locations for the individual earthquakes and an absolute location for the earthquake cluster, resulting in absolute locations of the events in the cluster having minimal bias from unknown Earth structure. A subset of these "calibrated" earthquake locations that spanned the duration of the sequence and had small location uncertainties was used as prior constraints within a second relocation effort using the entire dataset and the Bayesloc approach. Accurate locations (to within 2 km) were obtained using Bayesloc for 1,952 of the 2,157 events associated over the seven-month period of the study. The final catalog of earthquake hypocenters indicates that the aftershocks extend for about 20 km along the strike of the ruptured fault. The aftershocks occur primarily updip and along the
Tailored Carbon Nanotubes for Tissue Engineering Applications
Veetil, Jithesh V.; Ye, Kaiming
2008-01-01
A decade of aggressive research on carbon nanotubes (CNTs) has paved the way for extending these unique nanomaterials into a wide range of applications. In the relatively new arena of nanobiotechnology, a vast majority of applications are based on CNTs, ranging from miniaturized biosensors to organ regeneration. Nevertheless, the complexity of biological systems poses a significant challenge in developing CNT-based tissue engineering applications. This review focuses on recent developments in CNT-based tissue engineering, where the interaction between living cells/tissues and the nanotubes has been transformed into a variety of novel techniques. This integration has already resulted in a re-evaluation of tissue engineering and organ regeneration techniques, and some treatments that were previously impossible are now within reach. Owing to advances in surface chemistry, the biocompatibility of CNTs has been significantly improved, making it possible for them to serve as tissue scaffolding materials that enhance organ regeneration. Their superior mechanical strength and chemical inertness also make them ideal for blood-compatible applications, especially cardiopulmonary bypass surgery. The application of CNTs in these cardiovascular surgeries has led to a remarkable improvement in the mechanical strength of implanted catheters and reduced thrombogenicity after surgery. Moreover, functionalized CNTs have been extensively explored for in vivo targeted drug or gene delivery, which could potentially improve the efficiency of many cancer treatments. However, as with other nanomaterials, the cytotoxicity of CNTs has not been well established. Hence, more extensive cytotoxicity studies are warranted as hydrophobic CNTs are converted into biocompatible nanomaterials. PMID:19496152
Stirling engine alternatives for the terrestrial solar application
NASA Technical Reports Server (NTRS)
Stearns, J.
1985-01-01
The first phase of the present study of Stirling engine alternatives for solar thermal-electric generation has been completed. Development risk levels are considered to be high for all engines evaluated. Free-piston type and Ringbom-type Stirling engine-alternators are not yet developed for the 25 to 50-kW electrical power range, although smaller machines have demonstrated the inherent robustness of the machines. Kinematic-type Stirling engines are presently achieving a 3500 hr lifetime or longer on critical components, and lifetime must still be further extended for the solar application. Operational and technical characteristics of all types of Stirling engines have been reviewed with engine developers. Technical work of merit in progress in each engine development organization should be recognized and supported in an appropriate manner.
NASA Astrophysics Data System (ADS)
Bhloscaidh, Mairead Nic; McCloskey, John; Pelling, Mark; Naylor, Mark
2013-04-01
Until expensive engineering solutions become more universally available, the objective targeting of resources at demonstrably effective, low-cost interventions might help reverse the trend of increasing mortality in earthquakes. Death tolls in earthquakes are the result of complex interactions between physical effects, such as the exposure of the population to strong shaking, and the resilience of the exposed population along with supporting critical infrastructures and institutions. The identification of socio-economic factors that contribute to earthquake mortality is crucial to identifying and developing successful risk management strategies. Here we develop a quantitative methodology to assess more objectively the ability of communities to withstand earthquake shaking, focusing in particular on those cases where risk management performance appears to exceed or fall below expectations based on economic status. Using only published estimates of the shaking intensity and population exposure for each earthquake, data that are available for earthquakes in countries irrespective of their level of economic development, we develop a model for mortality based solely on the contribution of population exposure to shaking. This represents an attempt to remove, as far as possible, the physical causes of mortality from our analysis (we consider earthquake engineering to reduce building collapse among the socio-economic influences). The systematic part of the variance with respect to this model can therefore be expected to be dominated by socio-economic factors. We find, as expected, that this purely physical analysis partitions countries in terms of basic socio-economic measures, for example GDP, focusing analytical attention on the power of economic measures to explain variance in observed distributions of earthquake risk. The model allows the definition of a vulnerability index which, although broadly it demonstrates the expected income-dependence of vulnerability to
Statistical validation of earthquake related observations
NASA Astrophysics Data System (ADS)
Kossobokov, V. G.
2011-12-01
optional "antipodal strategy", one can make the predictions efficient, so that the wins will systematically outscore the losses. This sounds easy; however, many precursor phenomena lack a rigorous control and, in many cases, even the necessary precondition of any scientific study, i.e., an unambiguous definition of the "precursor/signal". On the other hand, understanding the complexity of the seismic process, along with its non-stationary, hierarchically organized behavior, has already led to a reproducible intermediate-term, middle-range earthquake prediction technique that has passed control tests in forward real-time application over at least the last two decades. In particular, the place and time of each of the mega earthquakes of 27 February 2010 in Chile and 11 March 2011 in Japan were recognized as being in a state of increased probability of such events in advance of their occurrence in the Global Test of the algorithms M8 and MSc, ongoing since 1992. This evidence, in conjunction with a retrospective analysis of seismic activity preceding the 26 December 2004 event in the Indian Ocean and other mega earthquakes of the 20th century, gives grounds for assuming that the algorithms of validated effectiveness in the magnitude ranges M7.5+ and M8.0+ are applicable to predicting mega-earthquakes as well.
Tissue engineering applications of therapeutic cloning.
Atala, Anthony; Koh, Chester J
2004-01-01
Few treatment options are available for patients suffering from diseased and injured organs because of a severe shortage of donor organs available for transplantation. Therapeutic cloning, where the nucleus from a donor cell is transferred into an enucleated oocyte in order to extract pluripotent embryonic stem cells, offers a potentially limitless source of cells for replacement therapy. Scientists in the field of tissue engineering apply the principles of cell transplantation, material science, and engineering to construct biological substitutes that will restore and maintain normal function in diseased and injured tissues. The present chapter reviews recent advances that have occurred in therapeutic cloning and tissue engineering and describes applications of these new technologies that may offer novel therapies for patients with end-stage organ failure.
Applications of CRISPR Genome Engineering in Cell Biology
Wang, Fangyuan; Qi, Lei S.
2016-01-01
Recent advances in genome engineering are starting a revolution in biological research and translational applications. The CRISPR-associated RNA-guided endonuclease Cas9 and its variants enable diverse manipulations of genome function. In this review, we describe the development of Cas9 tools for a variety of applications in cell biology research, including the study of functional genomics, the creation of transgenic animal models, and genomic imaging. Novel genome engineering methods offer a new avenue to understand the causality between genome and phenotype, thus promising a fuller understanding of cell biology. PMID:27599850
46 CFR 11.504 - Application of deck service for limited engineer endorsements.
Code of Federal Regulations, 2011 CFR
2011-10-01
... 46 Shipping 1 2011-10-01 2011-10-01 false Application of deck service for limited engineer... OFFICERS AND SEAMEN REQUIREMENTS FOR OFFICER ENDORSEMENTS Professional Requirements for Engineer Officer § 11.504 Application of deck service for limited engineer endorsements. Service gained in the deck...
46 CFR 11.504 - Application of deck service for limited engineer endorsements.
Code of Federal Regulations, 2012 CFR
2012-10-01
... 46 Shipping 1 2012-10-01 2012-10-01 false Application of deck service for limited engineer... OFFICERS AND SEAMEN REQUIREMENTS FOR OFFICER ENDORSEMENTS Professional Requirements for Engineer Officer § 11.504 Application of deck service for limited engineer endorsements. Service gained in the deck...
46 CFR 11.504 - Application of deck service for limited engineer endorsements.
Code of Federal Regulations, 2013 CFR
2013-10-01
... 46 Shipping 1 2013-10-01 2013-10-01 false Application of deck service for limited engineer... OFFICERS AND SEAMEN REQUIREMENTS FOR OFFICER ENDORSEMENTS Professional Requirements for Engineer Officer § 11.504 Application of deck service for limited engineer endorsements. Service gained in the deck...
Twitter earthquake detection: Earthquake monitoring in a social world
Earle, Paul S.; Bowden, Daniel C.; Guy, Michelle R.
2011-01-01
The U.S. Geological Survey (USGS) is investigating how the social networking site Twitter, a popular service for sending and receiving short, public text messages, can augment USGS earthquake response products and the delivery of hazard information. Rapid detection and qualitative assessment of shaking events are possible because people begin sending public Twitter messages (tweets) within tens of seconds of feeling shaking. Here we present and evaluate an earthquake detection procedure that relies solely on Twitter data. A tweet-frequency time series constructed from tweets containing the word "earthquake" clearly shows large peaks correlated with the origin times of widely felt events. To identify possible earthquakes, we use a short-term-average, long-term-average algorithm. When tuned to a moderate sensitivity, the detector finds 48 globally distributed earthquakes with only two false triggers in five months of data. The number of detections is small compared to the 5,175 earthquakes in the USGS global earthquake catalog for the same five-month period, and no accurate location or magnitude can be assigned based on tweet data alone. However, Twitter earthquake detections are not without merit. The detections are generally caused by widely felt events that are of more immediate interest than those with no human impact. The detections are also fast; about 75% occur within two minutes of the origin time. This is considerably faster than seismographic detections in poorly instrumented regions of the world. The tweets triggering the detections also provide very short first-impression narratives from people who experienced the shaking.
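The short-term-average/long-term-average trigger described above can be sketched directly on a tweet-frequency series. The window lengths, threshold, and synthetic data below are illustrative assumptions for this sketch, not the USGS operational settings.

```python
# Minimal STA/LTA detector on a per-minute tweet-count series.
# sta_len, lta_len, and threshold are illustrative, not operational values.

def sta_lta_triggers(counts, sta_len=3, lta_len=30, threshold=5.0):
    """Return indices where the STA/LTA ratio of the count series exceeds threshold."""
    triggers = []
    for i in range(lta_len, len(counts)):
        sta = sum(counts[i - sta_len:i]) / sta_len   # short window: recent burst
        lta = sum(counts[i - lta_len:i]) / lta_len   # long window: background chatter
        if lta > 0 and sta / lta >= threshold:
            triggers.append(i)
    return triggers

# Synthetic example: low background chatter with a burst of "earthquake" tweets.
series = [1, 0, 2, 1, 1] * 8 + [40, 55, 30] + [2, 1] * 5
print(sta_lta_triggers(series))
```

The detector fires shortly after the burst begins; a real deployment would also need keyword filtering and de-duplication of retweets before counting.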
First Results of the Regional Earthquake Likelihood Models Experiment
Schorlemmer, D.; Zechar, J.D.; Werner, M.J.; Field, E.H.; Jackson, D.D.; Jordan, T.H.
2010-01-01
The ability to successfully predict the future behavior of a system is a strong indication that the system is well understood. Certainly many details of the earthquake system remain obscure, but several hypotheses related to earthquake occurrence and seismic hazard have been proffered, and predicting earthquake behavior is a worthy goal and demanded by society. Along these lines, one of the primary objectives of the Regional Earthquake Likelihood Models (RELM) working group was to formalize earthquake occurrence hypotheses in the form of prospective earthquake rate forecasts in California. RELM members, working in small research groups, developed more than a dozen 5-year forecasts; they also outlined a performance evaluation method and provided a conceptual description of a Testing Center in which to perform predictability experiments. Subsequently, researchers working within the Collaboratory for the Study of Earthquake Predictability (CSEP) have begun implementing Testing Centers in different locations worldwide, and the RELM predictability experiment, a truly prospective earthquake prediction effort, is underway within the U.S. branch of CSEP. The experiment, designed to compare time-invariant 5-year earthquake rate forecasts, is now approximately halfway to its completion. In this paper, we describe the models under evaluation and present, for the first time, preliminary results of this unique experiment. While these results are preliminary (the forecasts were meant for an application of 5 years), we find interesting results: most of the models are consistent with the observation and one model forecasts the distribution of earthquakes best. We discuss the observed sample of target earthquakes in the context of historical seismicity within the testing region, highlight potential pitfalls of the current tests, and suggest plans for future revisions to experiments such as this one. © 2010 The Author(s).
Statistical distributions of earthquake numbers: consequence of branching process
NASA Astrophysics Data System (ADS)
Kagan, Yan Y.
2010-03-01
We discuss various statistical distributions of earthquake numbers. Previously, we derived several discrete distributions to describe earthquake numbers for the branching model of earthquake occurrence: these distributions are the Poisson, geometric, logarithmic and the negative binomial (NBD). The theoretical model is the `birth and immigration' population process. The first three distributions above can be considered special cases of the NBD. In particular, a point branching process along the magnitude (or log seismic moment) axis with independent events (immigrants) explains the magnitude/moment-frequency relation and the NBD of earthquake counts in large time/space windows, as well as the dependence of the NBD parameters on the magnitude threshold (the completeness magnitude of an earthquake catalogue). We discuss applying these distributions, especially the NBD, to approximate event numbers in earthquake catalogues. There are many different representations of the NBD. Most can be traced either to the Pascal distribution or to the mixture of the Poisson distribution with the gamma law. We discuss advantages and drawbacks of both representations for statistical analysis of earthquake catalogues. We also consider applying the NBD to earthquake forecasts and describe the limits of the application for the given equations. In contrast to the one-parameter Poisson distribution so widely used to describe earthquake occurrence, the NBD has two parameters. The second parameter can be used to characterize clustering or overdispersion of a process. We determine the parameter values and their uncertainties for several local and global catalogues, and their subdivisions in various time intervals, magnitude thresholds, spatial windows, and tectonic categories. The theoretical model of how the clustering parameter depends on the corner (maximum) magnitude can be used to predict future earthquake number distribution in regions where very large earthquakes have not yet occurred.
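As a rough illustration of how the NBD's second parameter captures the clustering (overdispersion) discussed above, here is a minimal method-of-moments fit to windowed event counts. The parameterization and the sample counts are assumptions for this sketch, not values from the paper: for NB(tau, p), mean m = tau(1-p)/p and variance v = m/p, so v/m = 1/p > 1 distinguishes the NBD from the Poisson (v = m).

```python
# Method-of-moments estimates for the negative binomial distribution (NBD)
# fitted to earthquake counts in fixed time windows.

def nbd_moments(counts):
    """Return (tau, p) estimated from the sample mean and variance."""
    n = len(counts)
    m = sum(counts) / n
    v = sum((c - m) ** 2 for c in counts) / (n - 1)  # unbiased sample variance
    if v <= m:
        raise ValueError("no overdispersion: a Poisson model may suffice")
    p = m / v                 # 0 < p < 1; smaller p means stronger clustering
    tau = m * p / (1 - p)     # shape parameter of the NBD
    return tau, p

# Example: overdispersed annual counts, as for clustered seismicity.
tau, p = nbd_moments([3, 0, 1, 12, 2, 0, 5, 1, 0, 9])
print(round(tau, 3), round(p, 3))
```

The guard for v <= m mirrors the paper's point: only when counts are overdispersed does the NBD's extra parameter carry information beyond the Poisson rate.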
NASA Astrophysics Data System (ADS)
Schaefer, Andreas; Daniell, James; Wenzel, Friedemann
2015-04-01
Earthquake forecasting and prediction have been among the key challenges of modern geosciences for the last few decades. A large number of approaches for various time periods have been developed for different locations around the world. A categorization and review of more than 20 new and old methods was undertaken to develop a state-of-the-art catalogue of forecasting algorithms and methodologies. The methods have been categorized into time-independent, time-dependent and hybrid methods, the last group representing methods that use data beyond historical earthquake statistics. It is necessary to distinguish in this way between purely statistical approaches, in which historical earthquake data are the only direct data source, and algorithms that incorporate further information, e.g. spatial data on fault distributions, or physical models such as static triggering, to indicate future earthquakes. Furthermore, the location of application has been taken into account to identify methods that can be applied, e.g., in active tectonic regions like California or in less active continental regions. In general, most of the methods cover well-known high-seismicity regions like Italy, Japan or California. Many more elements have been reviewed, including the application of established theories and methods, e.g. for the determination of the completeness magnitude, or whether the modified Omori law was used. Target temporal scales are identified, as is the publication history. All these aspects have been reviewed and catalogued to provide an easy-to-use tool for the development of earthquake forecasting algorithms and an overview of the state of the art.
Awareness and understanding of earthquake hazards at school
NASA Astrophysics Data System (ADS)
Saraò, Angela; Peruzza, Laura; Barnaba, Carla; Bragato, Pier Luigi
2014-05-01
Schools have a fundamental role in broadening the understanding of natural hazards and risks and in building awareness in the community. Recent earthquakes in Italy and worldwide have clearly demonstrated that poor perception of seismic hazards diminishes the effectiveness of mitigation countermeasures. For years the Seismology department of OGS has been involved in education projects and public activities to raise awareness about earthquakes. Working together with teachers, we aim to develop age-appropriate curricula to improve students' knowledge about earthquakes, seismic safety, and seismic risk reduction. Some examples of education activities we performed in recent years are presented here. We describe our experience with primary and intermediate schools where, through hands-on activities, we explain the earthquake phenomenon and its effects to kids, and we also illustrate some teaching interventions for high school students. During the past years we have lectured to classes, led laboratory and field activities, and organized summer internships for selected students. In the current year we are leading a project aimed at training high school students on seismic safety through a multidisciplinary approach that involves seismologists, engineers and experts in safety procedures. To combine the objective of disseminating earthquake culture, including knowledge of past seismicity, with that of a safety culture, we use innovative educational techniques and multimedia resources. Students and teachers, under the guidance of an expert seismologist, organize a combination of hands-on activities for understanding earthquakes in the lab using inexpensive tools and instrumentation. At selected schools we provided the low-cost seismometers of the QuakeCatcher network (http://qcn.stanford.edu) for recording earthquakes, and we trained teachers to use such instruments in the lab and to analyze the recorded data. Within the same project we are going to train
Earthquake detection through computationally efficient similarity search
Yoon, Clara E.; O’Reilly, Ossian; Bergen, Karianne J.; Beroza, Gregory C.
2015-01-01
Seismology is experiencing rapid growth in the quantity of data, which has outpaced the development of processing algorithms. Earthquake detection—identification of seismic events in continuous data—is a fundamental operation for observational seismology. We developed an efficient method to detect earthquakes using waveform similarity that overcomes the disadvantages of existing detection methods. Our method, called Fingerprint And Similarity Thresholding (FAST), can analyze a week of continuous seismic waveform data in less than 2 hours, or 140 times faster than autocorrelation. FAST adapts a data mining algorithm, originally designed to identify similar audio clips within large databases; it first creates compact “fingerprints” of waveforms by extracting key discriminative features, then groups similar fingerprints together within a database to facilitate fast, scalable search for similar fingerprint pairs, and finally generates a list of earthquake detections. FAST detected most (21 of 24) cataloged earthquakes and 68 uncataloged earthquakes in 1 week of continuous data from a station located near the Calaveras Fault in central California, achieving detection performance comparable to that of autocorrelation, with some additional false detections. FAST is expected to realize its full potential when applied to extremely long duration data sets over a distributed network of seismic stations. The widespread application of FAST has the potential to aid in the discovery of unexpected seismic signals, improve seismic monitoring, and promote a greater understanding of a variety of earthquake processes. PMID:26665176
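The fingerprint-then-group pipeline behind FAST can be caricatured in a few lines. The sign-of-differences feature and banded bucketing below are simplified stand-ins for FAST's actual spectrogram-based fingerprints and locality-sensitive hashing, chosen only to make the grouping step concrete; they are assumptions of this sketch, not the published method.

```python
# Toy FAST-style similarity search: compact binary fingerprints of waveform
# windows, bucketed in bands so near-duplicates meet without an O(n^2) scan.
from collections import defaultdict
from itertools import combinations

def fingerprint(window):
    """Binary fingerprint: sign pattern of successive sample differences."""
    return tuple(1 if b > a else 0 for a, b in zip(window, window[1:]))

def candidate_pairs(windows, bands=4):
    """Windows sharing any fingerprint band in the same bucket become
    candidate similar-event pairs (to be verified by a real similarity test)."""
    buckets = defaultdict(set)
    for idx, w in enumerate(windows):
        fp = fingerprint(w)
        step = max(1, len(fp) // bands)
        for b in range(bands):
            buckets[(b, fp[b * step:(b + 1) * step])].add(idx)
    pairs = set()
    for members in buckets.values():
        pairs.update(combinations(sorted(members), 2))
    return pairs

# Windows 0 and 2 are repeating "events"; window 1 is dissimilar noise.
w0 = [0, 3, 1, 4, 2, 5, 1, 0, 2]
w1 = [5, 4, 4, 3, 2, 2, 1, 1, 0]
w2 = [0, 3, 1, 4, 2, 5, 1, 0, 2]
print(candidate_pairs([w0, w1, w2]))
```

Only candidate pairs that collide in some band are ever compared, which is the source of the near-linear scaling that lets FAST outrun autocorrelation.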
Engineering application of anaerobic ammonium oxidation process in wastewater treatment.
Mao, Nianjia; Ren, Hongqiang; Geng, Jinju; Ding, Lili; Xu, Ke
2017-08-01
Anaerobic ammonium oxidation (Anammox), a promising biological nitrogen removal process, has been verified as an efficient, sustainable and cost-effective alternative to conventional nitrification and denitrification processes. To date, more than 110 full-scale anammox plants have been installed and are in operation worldwide, treating industrial NH4+-rich wastewater, and anammox-based technologies are flourishing. This review describes the current state of the art for engineering applications of the anammox process, including various anammox-based technologies, reactor selection and attempts to apply the process at different wastewater plants. Process control and implementation for stable performance are discussed, as are some remaining issues concerning engineering application, including the start-up period, process disturbances, greenhouse gas emissions and, especially, mainstream anammox applications. Finally, further development of the anammox engineering application is proposed in this review.
Biomaterial based cardiac tissue engineering and its applications
Huyer, Locke Davenport; Montgomery, Miles; Zhao, Yimu; Xiao, Yun; Conant, Genevieve; Korolj, Anastasia; Radisic, Milica
2015-01-01
Cardiovascular disease is a leading cause of death worldwide, necessitating the development of effective treatment strategies. A myocardial infarction involves the blockage of a coronary artery leading to depletion of nutrient and oxygen supply to cardiomyocytes and massive cell death in a region of the myocardium. Cardiac tissue engineering is the growth of functional cardiac tissue in vitro on biomaterial scaffolds for regenerative medicine application. This strategy relies on the optimization of the complex relationship between cell networks and biomaterial properties. In this review, we discuss important biomaterial properties for cardiac tissue engineering applications, such as elasticity, degradation, and induced host response, and their relationship to engineered cardiac cell environments. With these properties in mind, we also emphasize in vitro use of cardiac tissues for high-throughput drug screening and disease modelling. PMID:25989939
Crustal earthquake triggering by pre-historic great earthquakes on subduction zone thrusts
Sherrod, Brian; Gomberg, Joan
2014-01-01
Triggering of earthquakes on upper plate faults during and shortly after recent great (M>8.0) subduction thrust earthquakes raises concerns about earthquake triggering following Cascadia subduction zone earthquakes. Of particular regard to Cascadia was the previously noted, but only qualitatively identified, clustering of M>~6.5 crustal earthquakes in the Puget Sound region between about 1200–900 cal yr B.P., and the possibility that this was triggered by a great Cascadia subduction thrust earthquake and therefore portends future such clusters. We confirm quantitatively the extraordinary nature of the Puget Sound region crustal earthquake clustering between 1200–900 cal yr B.P., at least over the last 16,000 years. We conclude that this cluster was not triggered by the penultimate, and possibly full-margin, great Cascadia subduction thrust earthquake. However, we also show that the paleoseismic record for Cascadia is consistent with the conclusions of our companion study of the global modern record outside Cascadia, that M>8.6 subduction thrust events have a high probability of triggering one or more M>~6.5 crustal earthquakes.
Development of Maximum Considered Earthquake Ground Motion Maps
Leyendecker, E.V.; Hunt, R.J.; Frankel, A.D.; Rukstales, K.S.
2000-01-01
The 1997 NEHRP Recommended Provisions for Seismic Regulations for New Buildings use a design procedure that is based on spectral response acceleration rather than the traditional peak ground acceleration, peak ground velocity, or zone factors. The spectral response accelerations are obtained from maps prepared following the recommendations of the Building Seismic Safety Council's (BSSC) Seismic Design Procedures Group (SDPG). The SDPG-recommended maps, the Maximum Considered Earthquake (MCE) Ground Motion Maps, are based on the U.S. Geological Survey (USGS) probabilistic hazard maps with additional modifications incorporating deterministic ground motions in selected areas and the application of engineering judgement. The MCE ground motion maps included with the 1997 NEHRP Provisions also serve as the basis for the ground motion maps used in the seismic design portions of the 2000 International Building Code and the 2000 International Residential Code. Additionally the design maps prepared for the 1997 NEHRP Provisions, combined with selected USGS probabilistic maps, are used with the 1997 NEHRP Guidelines for the Seismic Rehabilitation of Buildings.
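The mapping from mapped MCE spectral values to design values in the NEHRP-style procedure described above follows a simple scaling: the mapped short-period (Ss) and 1-second (S1) spectral accelerations are adjusted by site coefficients and the design values are taken as two-thirds of the site-adjusted MCE values. A minimal sketch, assuming placeholder site coefficients rather than values from the code tables:

```python
# Sketch of MCE-to-design spectral acceleration scaling (NEHRP-style).
# Fa and Fv defaults below are placeholders, not tabulated site coefficients.

def design_spectral_accels(Ss, S1, Fa=1.0, Fv=1.5):
    """Return (S_DS, S_D1): design spectral accelerations in g."""
    S_MS = Fa * Ss          # site-adjusted MCE, short period
    S_M1 = Fv * S1          # site-adjusted MCE, 1-second period
    return (2.0 / 3.0) * S_MS, (2.0 / 3.0) * S_M1

# Mapped values for a hypothetical site (in g).
S_DS, S_D1 = design_spectral_accels(Ss=1.5, S1=0.6)
print(round(S_DS, 3), round(S_D1, 3))
```

In practice Fa and Fv depend on the site class and on the mapped accelerations themselves, and are read from the code's tables rather than fixed as here.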
Real-time earthquake monitoring: Early warning and rapid response
NASA Technical Reports Server (NTRS)
1991-01-01
A panel was established to investigate the subject of real-time earthquake monitoring (RTEM) and suggest recommendations on the feasibility of using a real-time earthquake warning system to mitigate earthquake damage in regions of the United States. The findings of the investigation and the related recommendations are described in this report. A brief review of existing real-time seismic systems is presented with particular emphasis given to the current California seismic networks. Specific applications of a real-time monitoring system are discussed along with issues related to system deployment and technical feasibility. In addition, several non-technical considerations are addressed including cost-benefit analysis, public perceptions, safety, and liability.
Zorlutuna, Pinar; Vrana, Nihal Engin; Khademhosseini, Ali
2013-01-01
The field of tissue engineering has been growing in recent years as more products have made it to the market and new uses for engineered tissues have emerged, motivating many researchers to engage in this multidisciplinary field of research. Engineered tissues are now not only considered end products for regenerative medicine, but have also emerged as enabling technologies for other fields of research ranging from drug discovery to biorobotics. This widespread use necessitates a variety of methodologies for the production of tissue-engineered constructs. In this review, these methods, together with their non-clinical applications, are described. First, we focus on novel materials used in tissue engineering scaffolds, such as recombinant proteins and synthetic, self-assembling polypeptides, and discuss recent advances in the modular tissue engineering area. Then scaffold-free production methods, based on either cell sheets or cell aggregates, are described. Cell sources used in tissue engineering and new methods that provide improved control over cell behavior, such as pathway engineering and biomimetic microenvironments for directing cell differentiation, are discussed. Finally, we summarize the emerging uses of engineered constructs such as model tissues for drug discovery, cancer research and biorobotics applications. PMID:23268388
Gas and Dust Phenomena of Mega-earthquakes and the Cause
NASA Astrophysics Data System (ADS)
Yue, Z.
2013-12-01
dense natural (methane) gas suddenly escaped from deep crust traps along deep fault zones. References Yue, ZQ, 2009. The source of energy power directly causing the May 12 Wenchuan Earthquake: Huge extremely pressurized natural gases trapped in deep Longmen Shan faults. News Journal of China Society of Rock Mechanics and Engineering, 86 (2009 (2)), 45-50. Yue, ZQ, 2010. Features and mechanism of coseismic surface ruptures by Wenchuan Earthquake. In Rock Stress and Earthquake, edited by Furen Xie, Taylor & Francis Group, London, ISBN 978-0-415-60165-8, 761-768. Yue, ZQ, 2013a. Natural gas eruption mechanism for earthquake landslides: illustrated with comparison between Donghekou and Papandayan Rockslide-debris flows. In Earthquake-induced Landslides, K. Ugai et al. (eds.), Springer-Verlag Berlin, Chapter 51: pp. 485-494. Yue ZQ, 2013b. On incorrectness in elastic rebound theory for cause of earthquakes. Paper No. S20-003 of Session S20, Proceedings of the 13th International Conference on Fracture, June 16-21, Beijing. Yue ZQ, 2013c. On nature of earthquakes with cause of compressed methane gas expansion and migration in crustal rocks. In Proceedings of Fifth Biot Conference on Poromechanics in Memory of Karl von Terzaghi (1883-1963), July 10-12, Vienna, edited by C. Hellmich et al., © ASCE, pp. 507-516.
NASA Astrophysics Data System (ADS)
Kaneda, Y.; Kawaguchi, K.; Araki, E.; Matsumoto, H.; Nakamura, T.; Nakano, M.; Kamiya, S.; Ariyoshi, K.; Baba, T.; Ohori, M.; Hori, T.; Takahashi, N.; Kaneko, S.; Donet Research; Development Group
2010-12-01
Yoshiyuki Kaneda, Katsuyoshi Kawaguchi*, Eiichiro Araki*, Shou Kaneko*, Hiroyuki Matsumoto*, Takeshi Nakamura*, Masaru Nakano*, Shinichirou Kamiya*, Keisuke Ariyoshi*, Toshitaka Baba*, Michihiro Ohori*, Narumi Takahashi*, and Takane Hori** (* Earthquake and Tsunami Research Project for Disaster Prevention, Leading Project, Japan Agency for Marine-Earth Science and Technology (JAMSTEC); ** Institute for Research on Earth Evolution, JAMSTEC). DONET (Dense Ocean Floor Network for Earthquakes and Tsunamis) is a real-time monitoring system for the Tonankai seismogenic zone around the Nankai trough, southwestern Japan. We developed DONET to perform real-time monitoring of crustal activity there and to support an advanced early warning system. DONET will provide important and useful data for understanding the Nankai trough megathrust seismogenic zones and for improving the accuracy of earthquake recurrence cycle simulations. The DONET concept has the following key features. 1) Redundancy, extendability and advanced maintenance, using a looped cable system, junction boxes and ROVs/AUVs: DONET has 20 observatories and incorporates a double land-station concept. We also developed an ROV for 10 km cable extensions and heavy-weight operations. 2) Multiple kinds of sensors to observe broadband phenomena such as long-period tremors, very low frequency earthquakes and strong motions of megathrust earthquakes over M8: each DONET observatory is therefore equipped with a broadband seismometer, an accelerometer, a hydrophone, a precise pressure gauge, a differential pressure gauge and a thermometer. 3) Speedy detection, evaluation and notification of earthquakes and tsunamis: the DONET system will be deployed around the Tonankai seismogenic zone. 4) Provision of ocean-floor crustal deformation data derived from pressure sensors: simultaneously, the development of data
Space shuttle main engine computed tomography applications
NASA Technical Reports Server (NTRS)
Sporny, Richard F.
1990-01-01
For the past two years the potential applications of computed tomography to the fabrication and overhaul of the Space Shuttle Main Engine were evaluated. Application tests were performed at various government and manufacturer facilities with equipment produced by four different manufacturers. The hardware scanned varied in size and complexity from a small temperature sensor and turbine blades to an assembled heat exchanger and main injector oxidizer inlet manifold. The evaluation of capabilities included the ability to identify and locate internal flaws, measure the depth of surface cracks, measure wall thickness, compare manifold design contours to actual part contours, perform automatic dimensional inspections, generate 3D computer models of actual parts, and image the relationship of the details in a complex assembly. The capabilities evaluated, with the exception of measuring the depth of surface flaws, demonstrated the existing and potential ability to perform many beneficial Space Shuttle Main Engine applications.
Earthquake recurrence and risk assessment in circum-Pacific seismic gaps
Thatcher, W.
1989-01-01
THE development of the concept of seismic gaps, regions of low earthquake activity where large events are expected, has been one of the notable achievements of seismology and plate tectonics. Its application to long-term earthquake hazard assessment continues to be an active field of seismological research. Here I have surveyed well documented case histories of repeated rupture of the same segment of circum-Pacific plate boundary and characterized their general features. I find that variability in fault slip and spatial extent of great earthquakes rupturing the same plate boundary segment is typical rather than exceptional, but sequences of major events fill identified seismic gaps with remarkable order. Earthquakes are concentrated late in the seismic cycle and occur with increasing size and magnitude. Furthermore, earthquake rupture starts near zones of concentrated moment release, suggesting that high-slip regions control the timing of recurrent events. The absence of major earthquakes early in the seismic cycle indicates a more complex behaviour for lower-slip regions, which may explain the observed cycle-to-cycle diversity of gap-filling sequences. © 1989 Nature Publishing Group.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Türker, Tuğba, E-mail: tturker@ktu.edu.tr; Bayrak, Yusuf, E-mail: ybayrak@agri.edu.tr
The North Anatolian Fault (NAF) is one of the most important strike-slip fault zones in the world and is located in a region of very high seismic activity. The NAFZ has experienced very large earthquakes from the past to the present. The aim of this study is to estimate the important parameters of the Gutenberg-Richter relationship (a and b values) and, taking these parameters into account, to examine earthquakes between 1900 and 2015 for 10 different seismic source regions in the NAFZ. Occurrence probabilities and return periods of earthquakes in the fault zone in the coming years are then estimated, and the earthquake hazard of the NAFZ is assessed with the Poisson method. Region 2 experienced its largest earthquakes only in the historical period; no large earthquake has been observed there in the instrumental period. Two historical earthquakes (1766, MS=7.3 and 1897, MS=7.0) are included for Region 2 (Marmara Region), where a large earthquake is expected in the coming years. For the 10 seismic source regions, the a and b parameters are estimated from the cumulative number-magnitude relationship of Gutenberg-Richter, LogN = a - bM. A homogeneous earthquake catalog for MS magnitudes equal to or larger than 4.0 is used for the period between 1900 and 2015. The catalog used in the study was compiled from the International Seismological Centre (ISC) and the Boğaziçi University Kandilli Observatory and Earthquake Research Institute (KOERI): earthquake data from 1900 to 1974 were obtained from KOERI and ISC, and from 1974 to 2015 from KOERI. The probabilities of earthquake occurrence are estimated for the next 10, 20, 30, 40, 50, 60, 70, 80, 90 and 100 years in the 10 seismic source regions. The highest earthquake occurrence probabilities in the coming years are estimated for the region Tokat-Erzincan (Region 9
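The hazard computation this abstract describes (a Gutenberg-Richter fit feeding a Poisson occurrence model) can be sketched in a few lines. The a, b values and catalog span below are illustrative placeholders, not the study's estimates for any NAFZ source region:

```python
import math

def annual_rate(a, b, m, catalog_years):
    """Annual rate of events with magnitude >= m, from the Gutenberg-Richter
    relation log10(N) = a - b*m, where N counts events over the catalog span."""
    return 10 ** (a - b * m) / catalog_years

def poisson_probability(rate, t_years):
    """Probability of at least one event in t_years, assuming Poisson arrivals."""
    return 1.0 - math.exp(-rate * t_years)

# Illustrative (hypothetical) parameters for a single source region:
a, b = 6.0, 0.9                                      # assumed G-R fit
rate = annual_rate(a, b, m=7.0, catalog_years=115)   # 1900-2015 span
print(f"return period: {1.0 / rate:.0f} yr")
print(f"P(M >= 7 within 50 yr): {poisson_probability(rate, 50):.2f}")
```

The return period is the reciprocal of the annual rate; the Poisson assumption treats events as memoryless, which is the standard simplification in this kind of regional hazard estimate.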
The Long-term Impacts of Earthquakes on Economic Growth
NASA Astrophysics Data System (ADS)
Lackner, S.
2016-12-01
The social science literature has so far not reached a consensus on whether and how earthquakes impact economic growth in the long run. Several hypotheses have been suggested, and some even argue for a positive impact. A general weakness in the literature, however, is the predominant use of inadequate measures for the exogenous natural hazard of an earthquake. The most common problems are the lack of individual event size (e.g. an earthquake dummy or a count of events), the use of magnitude instead of a measure of surface shaking, and endogeneity issues when traditional qualitative intensity scales or actual impact data are used. Here we use peak ground acceleration (PGA) as the ground motion intensity measure and investigate the impacts of earthquake shaking on long-run economic growth. We construct a data set from USGS ShakeMaps that can be considered the universe of globally relevant earthquake ground shaking from 1973 to 2014. This data set is then combined with World Bank GDP data to conduct a regression analysis. Furthermore, the impacts of PGA on different industries and on other economic variables such as employment and education are also investigated. This will, on the one hand, help identify the mechanisms by which earthquakes impact long-run growth, and on the other, show potential impacts on welfare indicators that are not captured by GDP. This is the first application of global earthquake shaking data to investigate long-term earthquake impacts.
An Earthquake Information Service with Free and Open Source Tools
NASA Astrophysics Data System (ADS)
Schroeder, M.; Stender, V.; Jüngling, S.
2015-12-01
At the GFZ German Research Centre for Geosciences in Potsdam, the working group Earthquakes and Volcano Physics examines the spatiotemporal behavior of earthquakes. In this context the hazards of volcanic eruptions and tsunamis are also explored. The aim is to collect related information after the occurrence of such extreme events and make it available to science, and partly to the public, as quickly as possible. The overall objective of this research is to reduce the geological risks that emanate from such natural hazards. To meet these objectives, to provide a quick overview of the seismicity of a particular region, and to allow comparison with historical events, a comprehensive visualization was desired. Based on the web-accessible data from the well-known GFZ GEOFON network, a user-friendly web mapping application was realized. This web service further integrates historical and current earthquake information from the USGS earthquake database, and more historical events from various other catalogues such as Pacheco and the International Seismological Centre (ISC). This compilation of sources is unique in the Earth sciences. Additionally, information about historical and current occurrences of volcanic eruptions and tsunamis is also retrievable. Another special feature of the application is temporal filtering via a time-shifting tool: users can interactively vary the visualization by moving the time slider. Furthermore, the application was built with recent JavaScript libraries, which enables it to run on displays and devices of all sizes. Our contribution will present the making of, the architecture behind, and a few examples of the look and feel of this application.
Toward Exascale Earthquake Ground Motion Simulations for Near-Fault Engineering Analysis
Johansen, Hans; Rodgers, Arthur; Petersson, N. Anders; ...
2017-09-01
Modernizing SW4 for massively parallel time-domain simulations of earthquake ground motions in 3D earth models increases resolution and provides ground motion estimates for critical infrastructure risk evaluations. Simulations of ground motions from large (M ≥ 7.0) earthquakes require domains on the order of 100 to 500 km and spatial granularity on the order of 1 to 5 m, resulting in hundreds of billions of grid points. Surface-focused structured mesh refinement (SMR) allows for more constant grid-points-per-wavelength scaling in typical Earth models, where wavespeeds increase with depth. In fact, SMR allows simulations to double the frequency content relative to a fixed-grid calculation on a given resource. The authors report improvements to the SW4 algorithm developed while porting the code to the Cori Phase 2 (Intel Xeon Phi) systems at the National Energy Research Scientific Computing Center (NERSC) at Lawrence Berkeley National Laboratory. Investigations of the performance of the innermost loop of the calculations found that reorganizing the order of operations can improve performance for massive problems.
Impact of earthquakes on sex ratio at birth: Eastern Marmara earthquakes
Doğer, Emek; Çakıroğlu, Yiğit; Köpük, Şule Yıldırım; Ceylan, Yasin; Şimşek, Hayal Uzelli; Çalışkan, Eray
2013-01-01
Objective: Previous reports suggest that maternal exposure to acute stress related to earthquakes affects the sex ratio at birth. Our aim was to examine the change in sex ratio at birth after the Eastern Marmara earthquake disasters. Material and Methods: This study was performed using the official birth statistics from January 1997 to December 2002 – before and after 17 August 1999, the date of the Golcuk Earthquake – supplied by the Turkey Statistics Institute. The secondary sex ratio was expressed as the male proportion at birth, and the ratios in affected and unaffected areas were calculated and compared on a monthly basis using the Chi-square test. Results: We observed significant decreases in the secondary sex ratio in the 4th and 8th months following the earthquake in the affected region compared to the unaffected region (p=0.001 and p=0.024). In the earthquake region, the decrease observed in the secondary sex ratio during the 8th month was specific to the period after the earthquake. Conclusion: Our study indicated a significant reduction in the secondary sex ratio after an earthquake. These findings suggest that events causing sudden intense stress, such as earthquakes, can affect the sex ratio at birth. PMID:24592082
Hazus® estimated annualized earthquake losses for the United States
Jaiswal, Kishor; Bausch, Doug; Rozelle, Jesse; Holub, John; McGowan, Sean
2017-01-01
Large earthquakes can cause social and economic disruption that can be unprecedented for any given community, and full recovery from these impacts may not always be achievable. In the United States (U.S.), the 1994 M6.7 Northridge earthquake in California remains the third costliest disaster in U.S. history, and it was one of the most expensive disasters for the federal government. Internationally, earthquakes in the last decade alone have claimed tens of thousands of lives and caused hundreds of billions of dollars of economic impact throughout the globe (~90 billion U.S. dollars (USD) from the 2008 M7.9 Wenchuan, China earthquake; ~20 billion USD from the 2010 M8.8 Maule earthquake in Chile; ~220 billion USD from the 2011 M9.0 Tohoku, Japan earthquake; ~25 billion USD from the 2011 M6.3 Christchurch, New Zealand earthquake; and ~22 billion USD from the 2016 M7.0 Kumamoto, Japan earthquake). Recent earthquakes show a pattern of steadily increasing damages and losses that are primarily due to three key factors: (1) significant growth in earthquake-prone urban areas, (2) vulnerability of the older building stock, including poorly engineered non-ductile concrete buildings, and (3) increased interdependency in terms of supply and demand for the businesses that operate among different parts of the world. In the United States, earthquake risk continues to grow with increased exposure of population and development, even though the earthquake hazard has remained relatively stable except for regions of induced seismic activity. Understanding the seismic hazard requires studying earthquake characteristics and the locales in which they occur, while understanding the risk requires an assessment of the potential damage from earthquake shaking to the built environment and to the welfare of people—especially in high-risk areas. Estimating the varying degree of earthquake risk throughout the United States is critical for informed decision-making on mitigation policies, priorities, strategies, and funding levels in the
Large-Scale Earthquake Countermeasures Act and the Earthquake Prediction Council in Japan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rikitake, T.
1979-08-07
The Large-Scale Earthquake Countermeasures Act was enacted in Japan in December 1978. This act aims at mitigating earthquake hazards by designating an area to be an area under intensified measures against earthquake disaster, such designation being based on long-term earthquake prediction information, and by issuing an earthquake warning statement based on imminent prediction information, when possible. In an emergency case as defined by the law, the prime minister will be empowered to take various actions which cannot be taken at ordinary times. For instance, he may ask the Self-Defense Force to come into the earthquake-threatened area before the earthquake occurrence. A Prediction Council has been formed in order to evaluate premonitory effects that might be observed over the Tokai area, which was designated an area under intensified measures against earthquake disaster in June 1979. An extremely dense observation network has been constructed over the area.
Application of a time-magnitude prediction model for earthquakes
NASA Astrophysics Data System (ADS)
An, Weiping; Jin, Xueshen; Yang, Jialiang; Dong, Peng; Zhao, Jun; Zhang, He
2007-06-01
In this paper we discuss the physical meaning of the magnitude-time model parameters for earthquake prediction. The gestation process for strong earthquakes in all eleven seismic zones in China can be described by the magnitude-time prediction model by computing the parameters of the model. The average model parameter values for China are: b = 0.383, c = 0.154, d = 0.035, B = 0.844, C = -0.209, and D = 0.188. The robustness of the model parameters is estimated from the variation in the minimum magnitude of the transformed data, the spatial extent, and the temporal period. Analysis of the spatial and temporal suitability of the model indicates that the computation unit size should be at least 4° × 4° for seismic zones in North China, at least 3° × 3° in Southwest and Northwest China, and the time period should be as long as possible.
PCL-Based Composite Scaffold Matrices for Tissue Engineering Applications.
Siddiqui, Nadeem; Asawa, Simran; Birru, Bhaskar; Baadhe, Ramaraju; Rao, Sreenivasa
2018-05-14
Biomaterial-based scaffolds are important cues in tissue engineering (TE) applications. Recent advances in TE have led to the development of suitable scaffold architectures for various tissue defects. In this narrative review on polycaprolactone (PCL), we discuss in detail the synthesis of PCL, its various properties, and the most recent advances in the use of PCL, both alone and blended with natural or synthetic polymers and ceramic materials, for TE applications. Further, various forms of PCL scaffolds, such as porous, film and fibrous scaffolds, are discussed, along with the stem cells and their sources employed in various tissue repair strategies. Overall, the present review affords an insight into the properties and applications of PCL in various tissue engineering applications.
NASA Astrophysics Data System (ADS)
Weiser, Deborah Anne
Induced seismicity is occurring at increasing rates around the country. Brodsky and Lajoie (2013) and others have recognized anthropogenic quakes at a few geothermal fields in California. I use three techniques to assess if there are induced earthquakes in California geothermal fields; there are three sites with clear induced seismicity: Brawley, The Geysers, and Salton Sea. Moderate to strong evidence is found at Casa Diablo, Coso, East Mesa, and Susanville. Little to no evidence is found for Heber and Wendel. I develop a set of tools to reduce or cope with the risk imposed by these earthquakes, and also to address uncertainties through simulations. I test if an earthquake catalog may be bounded by an upper magnitude limit. I address whether the earthquake record during pumping time is consistent with the past earthquake record, or if injection can explain all or some of the earthquakes. I also present ways to assess the probability of future earthquake occurrence based on past records. I summarize current legislation for eight states where induced earthquakes are of concern. Unlike tectonic earthquakes, the hazard from induced earthquakes has the potential to be modified. I discuss direct and indirect mitigation practices. I present a framework with scientific and communication techniques for assessing uncertainty, ultimately allowing more informed decisions to be made.
Foreshocks, aftershocks, and earthquake probabilities: Accounting for the landers earthquake
Jones, Lucile M.
1994-01-01
The equation to determine the probability that an earthquake occurring near a major fault will be a foreshock to a mainshock on that fault is modified to include the case of aftershocks to a previous earthquake occurring near the fault. The addition of aftershocks to the background seismicity makes it less probable that an earthquake will be a foreshock, because nonforeshocks have become more common. As the aftershocks decay with time, the probability that an earthquake will be a foreshock increases. However, fault interactions between the first mainshock and the major fault can increase the long-term probability of a characteristic earthquake on that fault, which will, in turn, increase the probability that an event is a foreshock, compensating for the decrease caused by the aftershocks.
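The dilution effect described in this abstract can be illustrated with a toy model (a sketch of the reasoning, not the paper's actual equation): a candidate event is a foreshock in proportion to the background rate, while aftershocks of a prior mainshock, decaying according to the modified Omori law, inflate the total rate and so dilute that probability. All parameter values here are assumptions for illustration:

```python
def omori_rate(t_days, k=50.0, c=0.1, p=1.1):
    """Modified Omori law: aftershock rate k / (t + c)**p (assumed parameters)."""
    return k / (t_days + c) ** p

def foreshock_probability(p0, background_rate, t_days):
    """Toy dilution model: p0 is the probability that an event is a foreshock
    when only background seismicity is present; aftershocks of an earlier
    mainshock enlarge the denominator, lowering the probability until the
    sequence decays away."""
    total_rate = background_rate + omori_rate(t_days)
    return p0 * background_rate / total_rate

# The probability recovers toward p0 as the aftershock sequence decays:
for t in (1, 30, 365):
    print(t, round(foreshock_probability(0.05, background_rate=1.0, t_days=t), 4))
```

The long-term compensation from fault interactions mentioned at the end of the abstract would enter as a time-dependent increase in p0, which this sketch deliberately omits.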
Application of a long-range forecasting model to earthquakes in the Japan mainland testing region
NASA Astrophysics Data System (ADS)
Rhoades, David A.
2011-03-01
The Every Earthquake a Precursor According to Scale (EEPAS) model is a long-range forecasting method which has been previously applied to a number of regions, including Japan. The Collaboratory for the Study of Earthquake Predictability (CSEP) forecasting experiment in Japan provides an opportunity to test the model at lower magnitudes than previously and to compare it with other competing models. The model sums contributions to the rate density from past earthquakes based on predictive scaling relations derived from the precursory scale increase phenomenon. Two features of the earthquake catalogue in the Japan mainland region create difficulties in applying the model, namely magnitude-dependence in the proportion of aftershocks and in the Gutenberg-Richter b-value. To accommodate these features, the model was fitted separately to earthquakes in three different target magnitude classes over the period 2000-2009. There are some substantial unexplained differences in parameters between classes, but the time and magnitude distributions of the individual earthquake contributions are such that the model is suitable for three-month testing at M ≥ 4 and for one-year testing at M ≥ 5. In retrospective analyses, the mean probability gain of the EEPAS model over a spatially smoothed seismicity model increases with magnitude. The same trend is expected in prospective testing. The Proximity to Past Earthquakes (PPE) model has been submitted to the same testing classes as the EEPAS model. Its role is that of a spatially-smoothed reference model, against which the performance of time-varying models can be compared.
Optimal Solution for an Engineering Applications Using Modified Artificial Immune System
NASA Astrophysics Data System (ADS)
Padmanabhan, S.; Chandrasekaran, M.; Ganesan, S.; patan, Mahamed Naveed Khan; Navakanth, Polina
2017-03-01
Engineering optimization plays an essential role in many engineering application areas such as process design, product design, re-engineering and new product development. In engineering, a near-optimal solution is achieved by comparing a number of different candidate solutions using prior problem knowledge. Optimization algorithms provide systematic and efficient ways of constructing and comparing new design solutions in order to arrive at an optimal design, maximizing solution efficiency and design impact. In this paper, a new evolutionary Modified Artificial Immune System (MAIS) algorithm is used to optimize an engineering application: the design of a gear drive. The results are compared with the existing design.
Earthquake prediction evaluation standards applied to the VAN Method
NASA Astrophysics Data System (ADS)
Jackson, David D.
Earthquake prediction research must meet certain standards before it can be suitably evaluated for potential application in decision making. For methods that result in a binary (on or off) alarm condition, requirements include (1) a quantitative description of observables that trigger an alarm, (2) a quantitative description, including ranges of time, location, and magnitude, of the predicted earthquakes, (3) documented evidence of all previous alarms, (4) a complete list of predicted earthquakes, (5) a complete list of unpredicted earthquakes. The VAN technique [Varotsos and Lazaridou, 1991; Varotsos et al., 1996] has not yet been stated as a testable hypothesis. It fails criteria (1) and (2) so it is not ready to be evaluated properly. Although telegrams were transmitted in advance of claimed successes, these telegrams did not fully specify the predicted events, and all of the published statistical evaluations involve many subjective ex post facto decisions. Lacking a statistically demonstrated relationship to earthquakes, a candidate prediction technique should satisfy several plausibility criteria, including: (1) a reasonable relationship between the location of the candidate precursor and that of the predicted earthquake, (2) some demonstration that the candidate precursory observations are related to stress, strain, or other quantities related to earthquakes, and (3) the existence of co-seismic as well as pre-seismic variations of the candidate precursor. The VAN technique meets none of these criteria.
Earthquake precursors: activation or quiescence?
NASA Astrophysics Data System (ADS)
Rundle, John B.; Holliday, James R.; Yoder, Mark; Sachs, Michael K.; Donnellan, Andrea; Turcotte, Donald L.; Tiampo, Kristy F.; Klein, William; Kellogg, Louise H.
2011-10-01
We discuss the long-standing question of whether the probability for large earthquake occurrence (magnitudes m > 6.0) is highest during time periods of smaller event activation, or highest during time periods of smaller event quiescence. The physics of the activation model are based on an idea from the theory of nucleation, that a small magnitude earthquake has a finite probability of growing into a large earthquake. The physics of the quiescence model is based on the idea that the occurrence of smaller earthquakes (here considered as magnitudes m > 3.5) may be due to a mechanism such as critical slowing down, in which fluctuations in systems with long-range interactions tend to be suppressed prior to large nucleation events. To illuminate this question, we construct two end-member forecast models illustrating, respectively, activation and quiescence. The activation model assumes only that activation can occur, either via aftershock nucleation or triggering, but expresses no choice as to which mechanism is preferred. Both of these models are in fact a means of filtering the seismicity time-series to compute probabilities. Using 25 yr of data from the California-Nevada catalogue of earthquakes, we show that of the two models, activation and quiescence, the latter appears to be the better model, as judged by backtesting (by a slight but not significant margin). We then examine simulation data from a topologically realistic earthquake model for California seismicity, Virtual California. This model includes not only earthquakes produced from increases in stress on the fault system, but also background and off-fault seismicity produced by a BASS-ETAS driving mechanism. Applying the activation and quiescence forecast models to the simulated data, we come to the opposite conclusion. Here, the activation forecast model is preferred to the quiescence model, presumably due to the fact that the BASS component of the model is essentially a model for activated seismicity. These
Wheeler, Russell L.
2014-01-01
Computation of probabilistic earthquake hazard requires an estimate of Mmax: the moment magnitude of the largest earthquake that is thought to be possible within a specified geographic region. The region specified in this report is the Central and Eastern United States and adjacent Canada. Parts A and B of this report describe the construction of a global catalog of moderate to large earthquakes that occurred worldwide in tectonic analogs of the Central and Eastern United States. Examination of histograms of the magnitudes of these earthquakes allows estimation of Central and Eastern United States Mmax. The catalog and Mmax estimates derived from it are used in the 2014 edition of the U.S. Geological Survey national seismic-hazard maps. Part A deals with prehistoric earthquakes, and this part deals with historical events.
Bose, Maren; Graves, Robert; Gill, David; Callaghan, Scott; Maechling, Phillip J.
2014-01-01
Real-time applications such as earthquake early warning (EEW) typically use empirical ground-motion prediction equations (GMPEs) along with event magnitude and source-to-site distances to estimate expected shaking levels. In this simplified approach, effects due to finite-fault geometry, directivity and site and basin response are often generalized, which may lead to a significant under- or overestimation of shaking from large earthquakes (M > 6.5) in some locations. For enhanced site-specific ground-motion predictions considering 3-D wave-propagation effects, we develop support vector regression (SVR) models from the SCEC CyberShake low-frequency (<0.5 Hz) and broad-band (0–10 Hz) data sets. CyberShake encompasses 3-D wave-propagation simulations of >415 000 finite-fault rupture scenarios (6.5 ≤ M ≤ 8.5) for southern California defined in UCERF 2.0. We use CyberShake to demonstrate the application of synthetic waveform data to EEW as a ‘proof of concept’, being aware that these simulations are not yet fully validated and might not appropriately sample the range of rupture uncertainty. Our regression models predict the maximum and the temporal evolution of instrumental intensity (MMI) at 71 selected test sites using only the hypocentre, magnitude and rupture ratio, which characterizes uni- and bilateral rupture propagation. Our regression approach is completely data-driven (where here the CyberShake simulations are considered data) and does not enforce pre-defined functional forms or dependencies among input parameters. The models were established from a subset (∼20 per cent) of CyberShake simulations, but can explain MMI values of all >400 k rupture scenarios with a standard deviation of about 0.4 intensity units. We apply our models to determine threshold magnitudes (and warning times) for various active faults in southern California that earthquakes need to exceed to cause at least ‘moderate’, ‘strong’ or ‘very strong’ shaking
Earthquake and tsunami forecasts: Relation of slow slip events to subsequent earthquake rupture
Dixon, Timothy H.; Jiang, Yan; Malservisi, Rocco; McCaffrey, Robert; Voss, Nicholas; Protti, Marino; Gonzalez, Victor
2014-01-01
The 5 September 2012 Mw 7.6 earthquake on the Costa Rica subduction plate boundary followed a 62-y interseismic period. High-precision GPS recorded numerous slow slip events (SSEs) in the decade leading up to the earthquake, both up-dip and down-dip of seismic rupture. Deeper SSEs were larger than shallower ones and, if characteristic of the interseismic period, release most locking down-dip of the earthquake, limiting down-dip rupture and earthquake magnitude. Shallower SSEs were smaller, accounting for some but not all interseismic locking. One SSE occurred several months before the earthquake, but changes in Mohr–Coulomb failure stress were probably too small to trigger the earthquake. Because many SSEs have occurred without subsequent rupture, their individual predictive value is limited, but taken together they released a significant amount of accumulated interseismic strain before the earthquake, effectively defining the area of subsequent seismic rupture (rupture did not occur where slow slip was common). Because earthquake magnitude depends on rupture area, this has important implications for earthquake hazard assessment. Specifically, if this behavior is representative of future earthquake cycles and other subduction zones, it implies that monitoring SSEs, including shallow up-dip events that lie offshore, could lead to accurate forecasts of earthquake magnitude and tsunami potential. PMID:25404327
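The Mohr–Coulomb failure stress change mentioned in this abstract is conventionally written ΔCFS = Δτ + μ′Δσn, with Δτ the shear stress change in the slip direction and μ′ an effective friction coefficient. A minimal sketch with purely illustrative numbers (not values from the paper):

```python
def coulomb_stress_change(d_shear, d_normal, mu_eff=0.4):
    """Change in Coulomb failure stress (Pa).

    d_shear  -- shear stress change in the slip direction (positive promotes slip)
    d_normal -- normal stress change (positive = unclamping in this convention)
    mu_eff   -- effective friction coefficient (pore-pressure effects folded in)
    """
    return d_shear + mu_eff * d_normal

# Illustrative numbers only: a 0.05 MPa shear increase with 0.02 MPa of
# clamping still yields a small positive (slip-promoting) change.
dcfs = coulomb_stress_change(5e4, -2e4)
print(round(dcfs, 6))  # ~0.042 MPa
```

A positive ΔCFS brings a fault closer to failure; the paper's point is that the pre-mainshock SSE produced changes too small to plausibly trigger the event.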
Understanding earthquake hazards in urban areas - Evansville Area Earthquake Hazards Mapping Project
Boyd, Oliver S.
2012-01-01
The region surrounding Evansville, Indiana, has experienced minor damage from earthquakes several times in the past 200 years. Because of this history and the proximity of Evansville to the Wabash Valley and New Madrid seismic zones, there is concern among nearby communities about hazards from earthquakes. Earthquakes currently cannot be predicted, but scientists can estimate how strongly the ground is likely to shake as a result of an earthquake and are able to design structures to withstand this estimated ground shaking. Earthquake-hazard maps provide one way of conveying such information and can help the region of Evansville prepare for future earthquakes and reduce earthquake-caused loss of life and financial and structural loss. The Evansville Area Earthquake Hazards Mapping Project (EAEHMP) has produced three types of hazard maps for the Evansville area: (1) probabilistic seismic-hazard maps show the ground motion that is expected to be exceeded with a given probability within a given period of time; (2) scenario ground-shaking maps show the expected shaking from two specific scenario earthquakes; (3) liquefaction-potential maps show how likely the strong ground shaking from the scenario earthquakes is to produce liquefaction. These maps complement the U.S. Geological Survey's National Seismic Hazard Maps but are more detailed regionally and take into account surficial geology, soil thickness, and soil stiffness; these elements greatly affect ground shaking.
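The probabilistic maps described above report the ground motion with a given probability of exceedance in a given time window; under the usual Poisson occurrence assumption this probability is P = 1 − exp(−t/T) for return period T. A small sketch of the conversion, using the 2%-in-50-years level common in U.S. hazard maps:

```python
import math

def poisson_exceedance(rate_per_year, years):
    """Probability of at least one exceedance in `years`, Poisson occurrence."""
    return 1.0 - math.exp(-rate_per_year * years)

def return_period(prob, years):
    """Return period (years) corresponding to probability `prob` in `years`."""
    return -years / math.log(1.0 - prob)

# "2% probability of exceedance in 50 years" corresponds to a return period
# of about 2,475 years, the basis of many design ground-motion maps.
print(round(return_period(0.02, 50)))
print(round(poisson_exceedance(1 / 2475, 50), 4))
```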
Vacuum plasma spray applications on liquid fuel rocket engines
NASA Technical Reports Server (NTRS)
Mckechnie, T. N.; Zimmerman, F. R.; Bryant, M. A.
1992-01-01
The vacuum plasma spray process (VPS) has been developed by NASA and Rocketdyne for a variety of applications on liquid fuel rocket engines, including the Space Shuttle Main Engine. These applications encompass thermal-shock-resistant thermal barrier coatings for turbopump blades and nozzles; bond coatings for cryogenic titanium components; wear-resistant coatings and materials; high-conductivity copper (NARloy-Z) combustion chamber liners; and structural nickel-base material (Inconel 718) for nozzle and combustion chamber support jackets.
"Did you feel it?" Intensity data: A surprisingly good measure of earthquake ground motion
Atkinson, G.M.; Wald, D.J.
2007-01-01
The U.S. Geological Survey is tapping a vast new source of engineering seismology data through its "Did You Feel It?" (DYFI) program, which collects online citizen responses to earthquakes. To date, more than 750,000 responses have been compiled in the United States alone. The DYFI data make up in quantity what they may lack in scientific quality and offer the potential to resolve longstanding issues in earthquake ground-motion science. Such issues have been difficult to address due to the paucity of instrumental ground-motion data in regions of low seismicity. In particular, DYFI data provide strong evidence that earthquake stress drops, which control the strength of high-frequency ground shaking, are higher in the central and eastern United States (CEUS) than in California. Higher earthquake stress drops, coupled with lower attenuation of shaking with distance, result in stronger overall shaking over a wider area and thus more potential damage for CEUS earthquakes in comparison to those of equal magnitude in California - a fact also definitively captured with these new DYFI data and maps.
NASA Astrophysics Data System (ADS)
So, E.
2010-12-01
Earthquake casualty loss estimation, which depends primarily on building-specific casualty rates, has long suffered from a lack of cross-disciplinary collaboration in post-earthquake data gathering. Increasing our understanding of what contributes to casualties in earthquakes involves coordinated data-gathering efforts amongst disciplines; these are essential for improved global casualty estimation models. It is evident from examining past casualty loss models and reviewing field data collected from recent events that generalized casualty rates cannot be applied globally for different building types, even within individual countries. For a particular structure type, regional and topographic building design effects, combined with variable material and workmanship quality, all contribute to this multi-variant outcome. In addition, social factors affect building-specific casualty rates, including social status and education levels, and human behaviors in general, in that they modify egress and survivability rates. Without considering complex physical pathways, loss models purely based on historic casualty data, or even worse, rates derived from other countries, will be of very limited value. What's more, as the world's population, housing stock, and living and cultural environments change, methods of loss modeling must accommodate these variables, especially when considering casualties. To truly take advantage of observed earthquake losses, not only do damage surveys need better coordination of international and national reconnaissance teams, but these teams must integrate different areas of expertise, including engineering, public health, and medicine. Research is needed to find methods to achieve consistent and practical ways of collecting and modeling casualties in earthquakes. International collaboration will also be necessary to transfer such expertise and resources to the communities in the cities which most need it. Coupling the theories and findings from
Engineering mechanical microenvironment of macrophage and its biomedical applications.
Li, Jing; Li, Yuhui; Gao, Bin; Qin, Chuanguang; He, Yining; Xu, Feng; Yang, Hui; Lin, Min
2018-03-01
Macrophages are the most plastic cells in the hematopoietic system and can be widely found in almost all tissues. Recently studies have shown that mechanical cues (e.g., matrix stiffness and stress/strain) can significantly affect macrophage behaviors. Although existing reviews on the physical and mechanical cues that regulate the macrophage's phenotype are available, engineering mechanical microenvironment of macrophages in vitro as well as a comprehensive overview and prospects for their biomedical applications (e.g., tissue engineering and immunotherapy) has yet to be summarized. Thus, this review provides an overview on the existing methods for engineering mechanical microenvironment of macrophages in vitro and then a section on their biomedical applications and further perspectives are presented.
Composite Material Application to Liquid Rocket Engines
NASA Technical Reports Server (NTRS)
Judd, D. C.
1982-01-01
The substitution of reinforced plastic composite (RPC) materials for metal was studied. The major objectives were to: (1) determine the extent to which composite materials can be beneficially used in liquid rocket engines; (2) identify additional technology requirements; and (3) determine those areas which have the greatest potential for return. Weight savings, fabrication costs, performance, life, and maintainability factors were considered. Two baseline designs, representative of Earth to orbit and orbit to orbit engine systems, were selected. Weight savings are found to be possible for selected components with the substitution of materials for metal. Various technology needs are identified before RPC material can be used in rocket engine applications.
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-13
... Engineering Corporation; Notice of Successive Preliminary Permit Application Accepted for Filing and Soliciting Comments, Motions To Intervene, and Competing Applications On August 20, 2013, Albany Engineering Corporation (Albany Engineering) filed an application for a successive preliminary permit, pursuant to section...
Continuing megathrust earthquake potential in Chile after the 2014 Iquique earthquake
Hayes, Gavin P.; Herman, Matthew W.; Barnhart, William D.; Furlong, Kevin P.; Riquelme, Sebástian; Benz, Harley M.; Bergman, Eric; Barrientos, Sergio; Earle, Paul S.; Samsonov, Sergey
2014-01-01
The seismic gap theory identifies regions of elevated hazard based on a lack of recent seismicity in comparison with other portions of a fault. It has successfully explained past earthquakes (see, for example, ref. 2) and is useful for qualitatively describing where large earthquakes might occur. A large earthquake had been expected in the subduction zone adjacent to northern Chile which had not ruptured in a megathrust earthquake since a M ~8.8 event in 1877. On 1 April 2014 a M 8.2 earthquake occurred within this seismic gap. Here we present an assessment of the seismotectonics of the March–April 2014 Iquique sequence, including analyses of earthquake relocations, moment tensors, finite fault models, moment deficit calculations and cumulative Coulomb stress transfer. This ensemble of information allows us to place the sequence within the context of regional seismicity and to identify areas of remaining and/or elevated hazard. Our results constrain the size and spatial extent of rupture, and indicate that this was not the earthquake that had been anticipated. Significant sections of the northern Chile subduction zone have not ruptured in almost 150 years, so it is likely that future megathrust earthquakes will occur to the south and potentially to the north of the 2014 Iquique sequence.
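The moment deficit calculations mentioned here rest on the standard conversion between seismic moment M0 (in N·m) and moment magnitude, Mw = (2/3)(log10 M0 − 9.1). A sketch with illustrative numbers loosely motivated by the abstract (an 1877-class M ~8.8 accumulation only partly released by the M 8.2 event); the specific values are assumptions, not results from the paper:

```python
import math

def moment_to_magnitude(m0_newton_meters):
    """Moment magnitude Mw from seismic moment M0 (N*m), Hanks-Kanamori scale."""
    return (2.0 / 3.0) * (math.log10(m0_newton_meters) - 9.1)

def magnitude_to_moment(mw):
    """Inverse: seismic moment (N*m) from moment magnitude Mw."""
    return 10 ** (1.5 * mw + 9.1)

# Illustrative only: moment equivalent to M 8.8 accumulated since 1877,
# minus the moment released by a M 8.2 event, leaves a large residual.
accumulated = magnitude_to_moment(8.8)
released = magnitude_to_moment(8.2)
residual = accumulated - released
print(round(moment_to_magnitude(residual), 2))
```

Because the magnitude scale is logarithmic, releasing a M 8.2 removes only a small fraction of a M 8.8 moment budget, which is the arithmetic behind the abstract's warning of remaining megathrust potential.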
76 FR 46769 - Applications for New Awards; Minority Science and Engineering Improvement Program
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-03
... DEPARTMENT OF EDUCATION Applications for New Awards; Minority Science and Engineering Improvement... Information: Minority Science and Engineering Improvement Program (MSEIP) Notice inviting applications for new... effect long-range improvement in science and engineering education at predominantly minority institutions...
Earthquakes: Predicting the unpredictable?
Hough, Susan E.
2005-01-01
The earthquake prediction pendulum has swung from optimism in the 1970s to rather extreme pessimism in the 1990s. Earlier work revealed evidence of possible earthquake precursors: physical changes in the planet that signal that a large earthquake is on the way. Some respected earthquake scientists argued that earthquakes are fundamentally unpredictable. The fate of the Parkfield prediction experiment appeared to support their arguments: a moderate earthquake had been predicted along a specified segment of the central San Andreas fault within five years of 1988, but had failed to materialize on schedule. At some point, however, the pendulum began to swing back. Reputable scientists began using the "P-word" not only in polite company, but also at meetings and even in print. If the optimism regarding earthquake prediction can be attributed to any single cause, it might be scientists' burgeoning understanding of the earthquake cycle.
Thermal barrier coatings application in diesel engines
NASA Technical Reports Server (NTRS)
Fairbanks, J. W.
1995-01-01
Commercial use of thermal barrier coatings in diesel engines began in the mid-1970s with Dr. Ingard Kvernes at the Central Institute for Industrial Research in Oslo, Norway. Dr. Kvernes attributed the attack on diesel engine valves and piston crowns encountered in marine diesel engines in Norwegian ships to hot corrosion caused by the reduced quality of residual fuel. His solution was to coat these components to reduce metal temperature below the threshold of aggressive hot corrosion and also to provide protection. The Department of Energy has supported thermal barrier coating development for diesel engine applications. In the Clean Diesel - 50 Percent Efficient (CD-50) engine for the year 2000, thermal barrier coatings will be used on piston crowns and possibly other components. The primary purpose of the thermal barrier coatings will be to reduce thermal fatigue, as the engine peak cylinder pressure will be nearly doubled. As the coatings result in higher available energy in the exhaust gas, efficiency gains are achieved through use of this energy by turbochargers, turbocompounding, or thermoelectric generators.
Building information modelling review with potential applications in tunnel engineering of China.
Zhou, Weihong; Qin, Haiyang; Qiu, Junling; Fan, Haobo; Lai, Jinxing; Wang, Ke; Wang, Lixin
2017-08-01
Building information modelling (BIM) can be applied to tunnel engineering to address a number of problems, including complex structure, extensive design, long construction cycle and increased security risks. To promote the development of tunnel engineering in China, this paper combines actual cases, including the Xingu mountain tunnel and the Shigu Mountain tunnel, to systematically analyse BIM applications in tunnel engineering in China. The results indicate that BIM technology in tunnel engineering is currently mainly applied during the design stage rather than during construction and operation stages. The application of BIM technology in tunnel engineering covers many problems, such as a lack of standards, incompatibility of different software, disorganized management, complex combination with GIS (Geographic Information System), low utilization rate and poor awareness. In this study, through summary of related research results and engineering cases, suggestions are introduced and an outlook for the BIM application in tunnel engineering in China is presented, which provides guidance for design optimization, construction standards and later operation maintenance.
Comparison of two large earthquakes: the 2008 Sichuan Earthquake and the 2011 East Japan Earthquake.
Otani, Yuki; Ando, Takayuki; Atobe, Kaori; Haiden, Akina; Kao, Sheng-Yuan; Saito, Kohei; Shimanuki, Marie; Yoshimoto, Norifumi; Fukunaga, Koichi
2012-01-01
Between August 15th and 19th, 2011, eight 5th-year medical students from the Keio University School of Medicine had the opportunity to visit the Peking University School of Medicine and hold a discussion session titled "What is the most effective way to educate people for survival in an acute disaster situation (before the mental health care stage)?" During the session, we discussed the following six points: basic information regarding the Sichuan Earthquake and the East Japan Earthquake, differences in preparedness for earthquakes, government actions, acceptance of medical rescue teams, earthquake-induced secondary effects, and media restrictions. Although comparison of the two earthquakes was not simple, we concluded that three major points should be emphasized to facilitate the most effective course of disaster planning and action. First, all relevant agencies should formulate emergency plans and should supply information regarding the emergency to the general public and health professionals on a normal basis. Second, each citizen should be educated and trained in how to minimize the risks from earthquake-induced secondary effects. Finally, the central government should establish a single headquarters responsible for command, control, and coordination during a natural disaster emergency and should centralize all powers in this single authority. We hope this discussion may be of some use in future natural disasters in China, Japan, and worldwide.
Chapter A. The Loma Prieta, California, Earthquake of October 17, 1989 - Lifelines
Schiff, Anshel J.
1998-01-01
To the general public who had their televisions tuned to watch the World Series, the 1989 Loma Prieta earthquake was a lifelines earthquake. It was the images seen around the world of the collapsed Cypress Street viaduct, with the frantic and heroic efforts to pull survivors from the structure that was billowing smoke; the collapsed section of the San Francisco-Oakland Bay Bridge and subsequent home video of a car plunging off the open span; and the spectacular fire in the Marina District of San Francisco fed by a broken gasline. To many of the residents of the San Francisco Bay region, the relation of lifelines to the earthquake was characterized by sitting in the dark because of power outages, the inability to make telephone calls because of network congestion, and the slow and snarled traffic. Had the public been aware of the actions of the engineers and tradespeople working for the utilities and other lifeline organizations on the emergency response and restoration of lifelines, the lifeline characteristics of this earthquake would have been even more significant. Unobserved by the public were the warlike devastation in several electrical-power substations, the 13 miles of gas-distribution lines that had to be replaced in several communities, and the more than 1,200 leaks and breaks in water mains and service connections that had to be excavated and repaired. Like the 1971 San Fernando, Calif., earthquake, which was a seminal event for activity to improve the earthquake performance of lifelines, the 1989 Loma Prieta earthquake demonstrated that the tasks of preparing lifelines in 'earthquake country' were incomplete; indeed, new lessons had to be learned.
Earthquake Intensity and Strong Motion Analysis Within SEISCOMP3
NASA Astrophysics Data System (ADS)
Becker, J.; Weber, B.; Ghasemi, H.; Cummins, P. R.; Murjaya, J.; Rudyanto, A.; Rößler, D.
2017-12-01
Measuring and predicting ground motion parameters, including seismic intensities, for earthquakes is crucial and the subject of recent research in engineering seismology. gempa has developed the new SIGMA module for Seismic Intensity and Ground Motion Analysis. The module is based on the SeisComP3 framework, extending it in the field of seismic hazard assessment and engineering seismology. SIGMA may work with or independently of SeisComP3 by supporting FDSN Web services for importing earthquake or station information and waveforms. It provides a user-friendly and modern graphical interface for semi-automatic and interactive strong motion data processing. SIGMA provides intensity and (P)SA maps based on GMPEs or recorded data. It calculates the most common strong motion parameters, e.g. PGA/PGV/PGD, Arias intensity and duration, Tp, Tm, CAV, SED, and Fourier, power, and response spectra. GMPEs are configurable. Supporting C++ and Python plug-ins, standard and customized GMPEs, including the OpenQuake Hazard Library, can be easily integrated and compared. Originally tailored to specifications by Geoscience Australia and BMKG (Indonesia), SIGMA has become a popular tool among SeisComP3 users concerned with seismic hazard and strong motion seismology.
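Of the strong motion parameters SIGMA computes, Arias intensity has a compact definition, Ia = π/(2g) ∫ a(t)² dt. The sketch below is a minimal stand-alone implementation (not SIGMA's actual code) using trapezoidal integration:

```python
import math

def arias_intensity(accel, dt, g=9.81):
    """Arias intensity (m/s) from an evenly sampled acceleration time
    series (m/s^2), integrating a(t)^2 with the trapezoidal rule."""
    sq = [a * a for a in accel]
    integral = dt * (sum(sq) - 0.5 * (sq[0] + sq[-1]))
    return math.pi / (2.0 * g) * integral

# Sanity check: a constant 1 m/s^2 over 1 s gives Ia = pi/(2g) ~ 0.16 m/s.
record = [1.0] * 1001
print(round(arias_intensity(record, 0.001), 5))
```

In practice the input would be a processed accelerogram; the constant record here only verifies the integration against the closed-form value.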
PAGER-CAT: A composite earthquake catalog for calibrating global fatality models
Allen, T.I.; Marano, K.D.; Earle, P.S.; Wald, D.J.
2009-01-01
highly uncertain, particularly the casualty numbers, which must be regarded as estimates rather than firm numbers for many earthquakes. Consequently, we encourage contributions from the seismology and earthquake engineering communities to further improve this resource via the Wikipedia page and personal communications, for the benefit of the whole community.
Virtual Collaborative Environments for System of Systems Engineering and Applications for ISAT
NASA Technical Reports Server (NTRS)
Dryer, David A.
2002-01-01
This paper describes a system-of-systems, or metasystems, approach and models developed to help prepare engineering organizations for distributed engineering environments. These changes in engineering enterprises include competition in increasingly global environments; new partnering opportunities caused by advances in information and communication technologies; and virtual collaboration issues associated with dispersed teams. To help address challenges and needs in this environment, a framework is proposed that can be customized and adapted for NASA to assist in improved engineering activities conducted in distributed, enhanced engineering environments. The approach is designed to prepare engineers for such distributed collaborative environments by learning and applying e-engineering methods and tools to a real-world engineering development scenario. The approach consists of two phases: an e-engineering basics phase and an e-engineering application phase. The e-engineering basics phase addresses skills required for e-engineering. The e-engineering application phase applies these skills in a distributed collaborative environment to system development projects.
Application of near-infrared image processing in agricultural engineering
NASA Astrophysics Data System (ADS)
Chen, Ming-hong; Zhang, Guo-ping; Xia, Hongxing
2009-07-01
Recently, with the development of computer technology, the range of applications of near-infrared (NIR) image processing has become much wider. This paper introduces the technical characteristics and development of modern NIR imaging and NIR spectroscopy analysis, and summarizes applications and studies of NIR image processing techniques in agricultural engineering in recent years, based on the application principles and development characteristics of NIR imaging. NIR imaging is very useful for nondestructive inspection of the external and internal quality of agricultural products, and NIR spectroscopy is important for detecting stored-grain insects. Computer vision detection based on NIR imaging can help manage food logistics, and the application of NIR imaging has promoted the quality management of agricultural products. Finally, suggestions and prospects for further research applications of NIR imaging in agricultural engineering are put forward.
Optimal-adaptive filters for modelling spectral shape, site amplification, and source scaling
Safak, Erdal
1989-01-01
This paper introduces some applications of optimal filtering techniques to earthquake engineering by using the so-called ARMAX models. Three applications are presented: (a) spectral modelling of ground accelerations, (b) site amplification (i.e., the relationship between two records obtained at different sites during an earthquake), and (c) source scaling (i.e., the relationship between two records obtained at a site during two different earthquakes). A numerical example for each application is presented by using recorded ground motions. The results show that the optimal filtering techniques provide elegant solutions to above problems, and can be a useful tool in earthquake engineering.
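As a minimal flavor of the time-series modelling in this paper, the sketch below fits a pure AR(2) model with the Yule-Walker equations. This is an assumption-laden simplification of the ARMAX models actually used (no exogenous input or moving-average terms), intended only to show the fitting mechanics:

```python
import numpy as np

def fit_ar2_yule_walker(x):
    """Estimate AR(2) coefficients (phi1, phi2) of a series by solving
    the Yule-Walker equations built from sample autocovariances."""
    x = np.asarray(x, dtype=float) - np.mean(x)
    n = len(x)
    def acov(k):
        return float(np.dot(x[:n - k], x[k:]) / n)
    r0, r1, r2 = acov(0), acov(1), acov(2)
    R = np.array([[r0, r1], [r1, r0]])
    rhs = np.array([r1, r2])
    return np.linalg.solve(R, rhs)

# Synthetic ground-motion-like AR(2) series with known coefficients.
rng = np.random.default_rng(0)
phi1, phi2 = 0.5, -0.3
x = np.zeros(20000)
for t in range(2, len(x)):
    x[t] = phi1 * x[t - 1] + phi2 * x[t - 2] + rng.standard_normal()

est = fit_ar2_yule_walker(x)
print(est)  # should land near (0.5, -0.3)
```

A full ARMAX treatment would add a moving-average part and an exogenous input (e.g., the record at a reference site), estimated with the optimal filtering machinery the paper describes.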
The Application of Tissue Engineering Procedures to Repair the Larynx
ERIC Educational Resources Information Center
Ringel, Robert L.; Kahane, Joel C.; Hillsamer, Peter J.; Lee, Annie S.; Badylak, Stephen F.
2006-01-01
The field of tissue engineering/regenerative medicine combines the quantitative principles of engineering with the principles of the life sciences toward the goal of reconstituting structurally and functionally normal tissues and organs. There has been relatively little application of tissue engineering efforts toward the organs of speech, voice,…
NASA Astrophysics Data System (ADS)
Motosaka, M.
2009-12-01
This paper presents, first, the development of an integrated regional earthquake early warning (EEW) system with an on-line structural health monitoring (SHM) function in Miyagi prefecture, Japan. The system makes it possible to provide more accurate, reliable, and immediate earthquake information for society by combining the national (JMA/NIED) EEW system, based on advanced real-time communication technology. The author has planned to install the EEW/SHM system in the public buildings around Sendai, a city of one million in north-eastern Japan. The system has so far been implemented in two buildings; one is in Sendai, and the other in Oshika, a front site on the Pacific Ocean coast for the approaching Miyagi-ken Oki earthquake. The data from the front site and the on-site are processed by the analysis system which was installed at the analysis center of the Disaster Control Research Center, Tohoku University. The real-time earthquake information from JMA is also received at the analysis center. The utilization of the integrated EEW/SHM system is addressed together with future perspectives. Examples of the obtained data are also described, including the amplitude-dependent dynamic characteristics of the building in Sendai before, during, and after the 2008/6/14 Iwate-Miyagi Nairiku Earthquake, together with the historical change of dynamic characteristics over 40 years. Second, this paper presents an advanced methodology based on Artificial Neural Networks (ANN) for forward forecasting of ground motion parameters, not only PGA and PGV but also spectral information, before S-wave arrival, using the initial part of the P-waveform at a front site. The estimated ground motion information can be used as a warning alarm for earthquake damage reduction. The Fourier Amplitude Spectra (FAS) estimated before strong shaking with high accuracy can be used for advanced engineering applications, e.g. feed-forward structural control of a building of interest. The validity and applicability of the method
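As a hedged illustration of the ANN methodology sketched in this abstract, the toy network below maps a single invented "P-wave feature" to a "ground-motion parameter" with one tanh hidden layer trained by full-batch gradient descent. The architecture, data, and hyperparameters are all assumptions, not the paper's:

```python
import numpy as np

# Toy stand-in for the paper's ANN: one tanh hidden layer, mean squared
# error loss, full-batch gradient descent. Input and target are invented.
rng = np.random.default_rng(1)
X = np.linspace(-1, 1, 200).reshape(-1, 1)   # fake normalized P-wave feature
y = np.sin(2.5 * X)                          # fake ground-motion parameter

n_hidden, lr = 16, 0.05
W1 = rng.normal(0, 0.5, (1, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.5, (n_hidden, 1)); b2 = np.zeros(1)

losses = []
for _ in range(2000):
    # Forward pass
    h = np.tanh(X @ W1 + b1)
    out = h @ W2 + b2
    err = out - y
    losses.append(float(np.mean(err ** 2)))
    # Backpropagation (gradients computed before any update)
    g_out = 2 * err / len(X)
    g_h = (g_out @ W2.T) * (1 - h ** 2)
    W2 -= lr * h.T @ g_out
    b2 -= lr * g_out.sum(0)
    W1 -= lr * X.T @ g_h
    b1 -= lr * g_h.sum(0)

print(losses[0], losses[-1])  # training loss should drop substantially
```

The real system would train on many P-wave-derived features against observed PGA, PGV, and spectral values; this sketch only shows the forward/backward mechanics of such a regression network.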
Micro-/nano-engineered cellular responses for soft tissue engineering and biomedical applications.
Tay, Chor Yong; Irvine, Scott Alexander; Boey, Freddy Y C; Tan, Lay Poh; Venkatraman, Subbu
2011-05-23
The development of biomedical devices and reconstruction of functional ex vivo tissues often requires the fabrication of biomimetic surfaces with features of sub-micrometer precision. This can be achieved with the advancements in micro-/nano-engineering techniques, allowing researchers to manipulate a plethora of cellular behaviors at the cell-biomaterial interface. Systematic studies conducted on these 2D engineered surfaces have unraveled numerous novel findings that can potentially be integrated as part of the design consideration for future 2D and 3D biomaterials and will no doubt greatly benefit tissue engineering. In this review, recent developments detailing the use of micro-/nano-engineering techniques to direct cellular orientation and function pertinent to soft tissue engineering will be highlighted. Particularly, this article aims to provide valuable insights into distinctive cell interactions and reactions to controlled surfaces, which can be exploited to understand the mechanisms of cell growth on micro-/nano-engineered interfaces, and to harness this knowledge to optimize the performance of 3D artificial soft tissue grafts and biomedical applications. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
1964 Great Alaska Earthquake: a photographic tour of Anchorage, Alaska
Thoms, Evan E.; Haeussler, Peter J.; Anderson, Rebecca D.; McGimsey, Robert G.
2014-01-01
, and small-scale maps, as well as links to slideshows of additional photographs and Google Street View™ scenes. Buildings in Anchorage that were severely damaged, sites of major landslides, and locations of post-earthquake engineering responses are highlighted. The web map can be used online as a virtual tour or in a physical self-guided tour using a web-enabled Global Positioning System (GPS) device. This publication serves the purpose of committing most of the content of the web map to a single distributable document. As such, some of the content differs from the online version.
Polymer, metal and ceramic matrix composites for advanced aircraft engine applications
NASA Technical Reports Server (NTRS)
Mcdanels, D. L.; Serafini, T. T.; Dicarlo, J. A.
1985-01-01
Advanced aircraft engine research within NASA Lewis is being focused on propulsion systems for subsonic, supersonic, and hypersonic aircraft. Each of these flight regimes requires different types of engines, but all require advanced materials to meet their goals of performance, thrust-to-weight ratio, and fuel efficiency. The high strength/weight and stiffness/weight properties of resin, metal, and ceramic matrix composites will play an increasingly key role in meeting these performance requirements. At NASA Lewis, research is ongoing to apply graphite/polyimide composites to engine components and to develop polymer matrices with higher operating temperature capabilities. Metal matrix composites, using magnesium, aluminum, titanium, and superalloy matrices, are being developed for application to static and rotating engine components, as well as for space applications, over a broad temperature range. Ceramic matrix composites are also being examined to increase the toughness and reliability of ceramics for application to high-temperature engine structures and components.
Wald, David J.
2010-01-01
This study presents a quantitative and geospatial description of global losses due to earthquake-induced secondary effects, including landslide, liquefaction, tsunami, and fire, for events during the past 40 years. These processes are of great importance to the US Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system, which is currently being developed to deliver rapid earthquake impact and loss assessments following large/significant global earthquakes. An important question is how dominant losses due to secondary effects are, under what conditions, and in which regions. Thus, which of these effects should receive higher-priority research efforts in order to enhance PAGER's overall assessment of earthquake losses and alerting for the likelihood of secondary impacts? We find that while 21.5% of fatal earthquakes have deaths due to secondary (non-shaking) causes, only rarely are secondary effects the main cause of fatalities. The recent 2004 Great Sumatra–Andaman Islands earthquake is a notable exception, with extraordinary losses due to tsunami. The potential for secondary hazards varies greatly, and systematically, due to regional geologic and geomorphic conditions. Based on our findings, we have built country-specific disclaimers for PAGER that address the potential for each hazard (Earle et al., Proceedings of the 14th World Conference on Earthquake Engineering, Beijing, China, 2008). We will now focus on ways to model casualties from secondary effects based on their relative importance as well as their general predictability.
System Engineering of Photonic Systems for Space Application
NASA Technical Reports Server (NTRS)
Watson, Michael D.; Pryor, Jonathan E.
2014-01-01
The application of photonics in space systems requires tight integration with the spacecraft systems to ensure accurate operation. This requires detailed and specific system engineering to properly incorporate the photonics into the spacecraft architecture and to guide the spacecraft architecture in supporting the photonics devices. Recent research in product-focused, elegant system engineering has led to a system approach that provides a robust means of achieving this integration. By focusing on the mission application and integrating the spacecraft system physics, incorporation of the photonics can be accomplished efficiently and effectively. This requires a clear understanding of the driving physics properties of the photonics device to ensure proper integration with no unintended consequences. The driving physics considerations in terms of optical performance will be identified for their use in system integration. Keywords: System Engineering, Optical Transfer Function, Optical Physics, Photonics, Image Jitter, Launch Vehicle, System Integration, Organizational Interaction
Earthquakes; January-February 1982
Person, W.J.
1982-01-01
In the United States, a number of earthquakes occurred, but only minor damage was reported. Arkansas experienced a swarm of earthquakes beginning on January 12. Canada experienced one of its strongest earthquakes in a number of years on January 9; this earthquake caused slight damage in Maine.
Lee, Ya-Ting; Turcotte, Donald L; Holliday, James R; Sachs, Michael K; Rundle, John B; Chen, Chien-Chih; Tiampo, Kristy F
2011-10-04
The Regional Earthquake Likelihood Models (RELM) test of earthquake forecasts in California was the first competitive evaluation of forecasts of future earthquake occurrence. Participants submitted expected probabilities of occurrence of M ≥ 4.95 earthquakes in 0.1° × 0.1° cells for the period January 1, 2006, to December 31, 2010. Probabilities were submitted for 7,682 cells in California and adjacent regions. During this period, 31 M ≥ 4.95 earthquakes occurred in the test region. These earthquakes occurred in 22 test cells. This seismic activity was dominated by earthquakes associated with the M = 7.2, April 4, 2010, El Mayor-Cucapah earthquake in northern Mexico. This earthquake occurred in the test region, and 16 of the other 30 earthquakes in the test region could be associated with it. Nine complete forecasts were submitted by six participants. In this paper, we present the forecasts in a way that allows the reader to evaluate which forecast is the most "successful" in terms of the locations of future earthquakes. We conclude that the RELM test was a success and suggest ways in which the results can be used to improve future forecasts.
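A common way to score such gridded forecasts is a joint Poisson log-likelihood over cells. The sketch below is a simplified illustration of the idea with hypothetical rates, not the actual RELM/CSEP test code:

```python
import math

def poisson_log_likelihood(forecast_rates, observed_counts):
    """Joint log-likelihood of per-cell earthquake counts under independent
    Poisson distributions with the given forecast rates; higher is better.
    Simplified sketch of RELM-style likelihood scoring."""
    ll = 0.0
    for rate, n in zip(forecast_rates, observed_counts):
        # log P(N = n) for a Poisson distribution with mean `rate`
        ll += n * math.log(rate) - rate - math.lgamma(n + 1)
    return ll

# Hypothetical two-cell example: a forecast matching the observed counts
# scores higher than a mismatched one.
good = poisson_log_likelihood([2.0, 1.0], [2, 1])
poor = poisson_log_likelihood([1.0, 2.0], [2, 1])
```

One practical caveat: a forecast with a zero rate in a cell that hosts an earthquake scores negative infinity, so submitted rates are usually floored at a small positive value.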
NASA Astrophysics Data System (ADS)
Xiong, N.; Niu, F.
2017-12-01
A Mw 7.8 earthquake struck Gorkha, Nepal, on April 25, 2015, resulting in more than 8,000 deaths and 3.5 million homeless. The earthquake initiated 70 km west of Kathmandu and propagated eastward, rupturing an area of approximately 150 km by 60 km. However, the earthquake failed to fully rupture the locked fault beneath the Himalaya, suggesting that the regions south of Kathmandu and west of the current rupture are still locked and a much more powerful earthquake might occur in the future. Therefore, the seismic hazard of the unruptured region is of great concern. In this study, we investigated the Coulomb failure stress (CFS) accumulation on the unruptured fault transferred by the Gorkha earthquake and some nearby historical great earthquakes. First, we calculated the co-seismic CFS changes of the Gorkha earthquake on the nodal planes of 16 large aftershocks to quantitatively examine whether they were brought closer to failure by the mainshock. At least 12 of the 16 aftershocks were encouraged by an increase of CFS of 0.1-3 MPa. The correspondence between the distribution of off-fault aftershocks and the pattern of increased CFS also validates the applicability of the earthquake-triggering hypothesis in the thrust regime of Nepal. With this validation, we calculated the co-seismic CFS change on the locked region imparted by the Gorkha earthquake and historical great earthquakes. A newly proposed ramp-flat-ramp-flat fault geometry model was employed, and the source parameters of historical earthquakes were computed with an empirical scaling relationship. A broad region south of Kathmandu and west of the current rupture was shown to be positively stressed, with CFS changes roughly ranging between 0.01 and 0.5 MPa. The maximum CFS increase (>1 MPa) was found in the updip segment south of the current rupture, implying a high seismic hazard. Since the locked region may be additionally stressed by the post-seismic relaxation of the lower
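The CFS change on a receiver fault combines the shear-stress change in the slip direction with the normal-stress change via an effective friction coefficient. A minimal sketch with hypothetical stress values; in a study like this, the resolved stress changes themselves come from an elastic dislocation model of the source:

```python
def coulomb_stress_change(delta_tau, delta_sigma_n, mu_eff=0.4):
    """Coulomb failure stress change (MPa):
        dCFS = d(tau) + mu' * d(sigma_n)
    where d(tau) is the shear-stress change in the slip direction,
    d(sigma_n) is the normal-stress change (tension positive, so positive
    values unclamp the fault), and mu' is an assumed effective friction
    coefficient (0.4 is a common choice, not taken from this paper).
    Positive dCFS brings the receiver fault closer to failure."""
    return delta_tau + mu_eff * delta_sigma_n

# Hypothetical numbers: 0.2 MPa of added shear, 0.1 MPa of clamping.
dcfs = coulomb_stress_change(0.2, -0.1)  # 0.2 - 0.4 * 0.1
```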
NASA Astrophysics Data System (ADS)
Hoshiba, M.; Ogiso, M.
2016-12-01
The sequence of the 2016 Kumamoto earthquakes (Mw6.2 on April 14, Mw7.0 on April 16, and many aftershocks) caused devastating damage in Kumamoto and Oita prefectures, Japan. During the Mw7.0 event, just after the direct S waves passed central Oita, another M6-class event occurred there, more than 80 km from the Mw7.0 event. The M6 event is interpreted as an induced earthquake, but it brought stronger shaking to central Oita than the Mw7.0 event did. We discuss the induced earthquake from the viewpoint of Earthquake Early Warning (EEW). In terms of ground shaking such as PGA and PGV, the Mw7.0 event produced much smaller values at central Oita than the M6 induced earthquake (for example, PGA about 1/8 as large at station OIT009), so it is easy to discriminate the two events. However, the PGD of the Mw7.0 event is larger than that of the induced earthquake, and it appears just before the occurrence of the induced earthquake. It is quite difficult to recognize the induced earthquake from displacement waveforms alone, because the displacement is strongly contaminated by that of the preceding Mw7.0 event. In many EEW methods (including the current JMA EEW system), magnitude is used to predict ground shaking through a Ground Motion Prediction Equation (GMPE), and the magnitude is often estimated from displacement. However, displacement magnitude is not necessarily the best quantity for predicting ground shaking such as PGA and PGV. In the case of the induced earthquake during the Kumamoto sequence, displacement magnitude could not be estimated because of the strong contamination; indeed, the JMA EEW system could not recognize the induced earthquake. One of the important lessons we have learned from eight years' operation of EEW is the issue of multiple simultaneous earthquakes, such as aftershocks of the 2011 Mw9.0 Tohoku earthquake. Based on this lesson, we have proposed enhancement of real-time monitoring of ground shaking itself instead of rapid estimation of
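GMPEs of the kind referred to here typically take a log-linear form in magnitude and distance; the toy model below, with illustrative placeholder coefficients (not a published GMPE), shows why a magnitude that cannot be estimated, or is underestimated, directly degrades the predicted shaking:

```python
import math

def gmpe_pga(magnitude, distance_km, a=-1.5, b=0.5, c=1.3):
    """Toy ground-motion prediction equation of the common form
    log10(PGA) = a + b*M - c*log10(R). The coefficients a, b, c are
    illustrative placeholders; real GMPEs add site terms, near-field
    saturation, and aleatory uncertainty."""
    return 10 ** (a + b * magnitude - c * math.log10(distance_km))
```

Predicted PGA grows with magnitude and decays with distance, so an EEW system that cannot recover the magnitude of a simultaneous event (as for the Oita induced earthquake) cannot predict its shaking through this route, which motivates monitoring ground shaking directly.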
NASA Astrophysics Data System (ADS)
Kossobokov, Vladimir G.; Nekrasova, Anastasia K.
2018-05-01
We continue applying the general concept of seismic risk analysis in a number of seismic regions worldwide by constructing regional seismic hazard maps based on morphostructural analysis, pattern recognition, and the Unified Scaling Law for Earthquakes (USLE), which generalizes the Gutenberg-Richter relationship by making use of the naturally fractal distribution of earthquake sources of different size in a seismic region. The USLE is the empirical relationship log10 N(M, L) = A + B·(5 - M) + C·log10 L, where N(M, L) is the expected annual number of earthquakes of a certain magnitude M within a seismically prone area of linear dimension L. We use the parameters A, B, and C of the USLE to estimate, first, the expected maximum magnitude in a time interval at seismically prone nodes of the morphostructural scheme of the region under study, and then map the corresponding expected ground shaking parameters (e.g., peak ground acceleration, PGA, or macro-seismic intensity). After rigorous verification against the available seismic evidence from the past (usually, the observed instrumental PGA or the historically reported macro-seismic intensity), such a seismic hazard map is used to generate maps of specific earthquake risks for population, cities, and infrastructure (e.g., those based on a census of population or a buildings inventory). The methodology of seismic hazard and risk assessment is illustrated by application to the territory of the Greater Caucasus and Crimea.
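The USLE relationship above can be evaluated directly once A, B, and C are fitted for a region. A minimal sketch with illustrative coefficients (not fitted to any real region):

```python
import math

def usle_expected_count(M, L, A, B, C):
    """Expected annual number N(M, L) of earthquakes of magnitude M or larger
    within a seismically prone area of linear dimension L, per the USLE
    given in the abstract: log10 N = A + B*(5 - M) + C*log10(L)."""
    return 10 ** (A + B * (5 - M) + C * math.log10(L))

# Illustrative coefficients: A fixes the rate at M = 5 and L = 1, B plays
# the role of the Gutenberg-Richter b-value, and C reflects the fractal
# dimension of the earthquake-source distribution.
n_m5 = usle_expected_count(M=5.0, L=100.0, A=-1.0, B=0.9, C=1.2)
n_m6 = usle_expected_count(M=6.0, L=100.0, A=-1.0, B=0.9, C=1.2)
```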
U.S. Seismic Design Maps Web Application
NASA Astrophysics Data System (ADS)
Martinez, E.; Fee, J.
2015-12-01
The application computes earthquake ground motion design parameters compatible with the International Building Code and other seismic design provisions, and is the primary method for design engineers across the country to obtain ground motion parameters for multiple building codes. Users specify the design code of interest, location, and other parameters to obtain the necessary ground motion information, consisting of a high-level executive summary as well as detailed information including maps, data, and graphs. Results are formatted so that they can be directly included in a final engineering report. In addition to single-site analysis, the application supports a batch mode for simultaneous consideration of multiple locations. Finally, an application programming interface (API) allows other developers to integrate this application's results into larger applications for additional processing. Development on the application has proceeded iteratively, working with engineers through email, meetings, and workshops. Each iteration provided new features, improved performance, and usability enhancements. This development approach positioned the application to be integral to the structural design process; it is now used to produce over 1800 reports daily. Recent efforts have made the application a data-driven, mobile-first, responsive web application. Development is ongoing, and the source code has recently been published to the open-source community on GitHub. Open-sourcing the code facilitates improved incorporation of user feedback to add new features, ensuring the application's continued success.
Geomagnetically induced currents: Science, engineering, and applications readiness
NASA Astrophysics Data System (ADS)
Pulkkinen, A.; Bernabeu, E.; Thomson, A.; Viljanen, A.; Pirjola, R.; Boteler, D.; Eichner, J.; Cilliers, P. J.; Welling, D.; Savani, N. P.; Weigel, R. S.; Love, J. J.; Balch, C.; Ngwira, C. M.; Crowley, G.; Schultz, A.; Kataoka, R.; Anderson, B.; Fugate, D.; Simpson, J. J.; MacAlester, M.
2017-07-01
This paper is the primary deliverable of the very first NASA Living With a Star Institute Working Group, Geomagnetically Induced Currents (GIC) Working Group. The paper provides a broad overview of the current status and future challenges pertaining to the science, engineering, and applications of the GIC problem. Science is understood here as the basic space and Earth sciences research that allows improved understanding and physics-based modeling of the physical processes behind GIC. Engineering, in turn, is understood here as the "impact" aspect of GIC. Applications are understood as the models, tools, and activities that can provide actionable information to entities such as power systems operators for mitigating the effects of GIC and government agencies for managing any potential consequences from GIC impact to critical infrastructure. Applications can be considered the ultimate goal of our GIC work. In assessing the status of the field, we quantify the readiness of various applications in the mitigation context. We use the Applications Readiness Level (ARL) concept to carry out the quantification.
Earthquakes, September-October 1986
Person, W.J.
1987-01-01
There was one great earthquake (8.0 and above) during this reporting period, in the South Pacific in the Kermadec Islands. There were no major earthquakes (7.0-7.9), but earthquake-related deaths were reported in Greece and in El Salvador. There were no destructive earthquakes in the United States.
NASA Astrophysics Data System (ADS)
Lin, Yueguan; Wang, Wei; Wen, Qi; Huang, He; Lin, Jingli; Zhang, Wei
2015-12-01
The Ms8.0 Wenchuan earthquake of May 12, 2008, caused enormous casualties and property losses in China, and Beichuan County was destroyed. To preserve a site that commemorates the victims and serves the popularization and research of earthquake science, the Beichuan National Earthquake Ruins Museum was built on the ruins of the county seat. Based on the needs of digital preservation of the earthquake ruins park and of earthquake damage-assessment research, we compiled a data set of the Beichuan National Earthquake Ruins Museum, including satellite remote sensing imagery, airborne remote sensing imagery, ground photogrammetry data, and ground-acquired data. To better serve earthquake-science research, we also designed sharing ideas and schemes for this scientific data set.
Applications of CRISPR Genome Engineering in Cell Biology.
Wang, Fangyuan; Qi, Lei S
2016-11-01
Recent advances in genome engineering are starting a revolution in biological research and translational applications. The clustered regularly interspaced short palindromic repeats (CRISPR)-associated RNA-guided endonuclease CRISPR associated protein 9 (Cas9) and its variants enable diverse manipulations of genome function. In this review, we describe the development of Cas9 tools for a variety of applications in cell biology research, including the study of functional genomics, the creation of transgenic animal models, and genomic imaging. Novel genome engineering methods offer a new avenue to understand the causality between the genome and phenotype, thus promising a fuller understanding of cell biology. Copyright © 2016 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Hutchinson, Lauren; Stead, Doug; Rosser, Nick
2017-04-01
Understanding the behaviour of rock slopes in response to earthquake shaking is instrumental to response and relief efforts following large earthquakes, as well as to ongoing risk management in earthquake-affected areas. Assessment of the effects of seismic shaking on rock slope kinematics requires detailed surveys of the pre- and post-earthquake condition of the slope; however, at present there is a lack of high-resolution pre- and post-earthquake monitoring data with which to characterize seismically induced slope damage and validate models used to back-analyze rock slope behaviour during and following earthquake shaking. There is therefore a need for additional research where pre- and post-earthquake monitoring data are available. This paper presents the results of a direct comparison between terrestrial laser scans (TLS) collected in 2014, the year prior to the 2015 earthquake sequence, and scans collected 18 months, and two monsoon cycles, after the earthquakes. The two datasets were collected using Riegl VZ-1000 and VZ-4000 full-waveform laser scanners at high resolution (c. 0.1 m point spacing as a minimum). The scans cover the full landslide-affected slope from toe to crest. The slope is located in Sindhupalchok District, Central Nepal, which experienced some of the highest co-seismic and post-seismic landslide intensities across Nepal due to its proximity (<20 km) to the epicenters of both of the main aftershocks, on April 26, 2015 (M6.7) and May 12, 2015 (M7.3). During the 2015 earthquakes and the subsequent 2015 and 2016 monsoons, the slope experienced rockfall and debris flows, which are evident in satellite imagery and field photographs. Fracturing of the rock mass associated with the seismic shaking is also evident at scales not accessible through satellite and field observations. The results of change detection between the TLS datasets, with an emphasis on quantification of seismically induced slope damage, are presented. Patterns in the
An application of synthetic seismicity in earthquake statistics - The Middle America Trench
NASA Technical Reports Server (NTRS)
Ward, Steven N.
1992-01-01
It is shown how seismicity calculations based on the concept of fault segmentation, which incorporate the physics of faulting through static dislocation theory, can improve earthquake recurrence statistics and sharpen hazard probabilities. For the Middle America Trench, the spread parameters of the best-fitting lognormal or Weibull distributions (about 0.75) are much larger than the 0.21 intrinsic spread proposed in the Nishenko-Buland (1987) hypothesis. Stress interaction between fault segments disrupts time or slip predictability and causes earthquake recurrence to be far more aperiodic than has been suggested.
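The "spread parameter" of a lognormal recurrence model is the standard deviation of the logarithms of the inter-event times. A minimal sketch of estimating it from a hypothetical series of recurrence intervals (not the paper's own code or data):

```python
import math

def lognormal_spread(intervals):
    """Spread parameter of a lognormal recurrence model: the standard
    deviation of the natural log of the inter-event times. This is the
    quantity compared against the 0.21 intrinsic spread of the
    Nishenko-Buland hypothesis. Sketch only; intervals are hypothetical."""
    logs = [math.log(t) for t in intervals]
    mean = sum(logs) / len(logs)
    var = sum((u - mean) ** 2 for u in logs) / len(logs)
    return math.sqrt(var)

# Hypothetical recurrence intervals (years) for one fault segment:
spread = lognormal_spread([35.0, 52.0, 41.0, 90.0, 47.0])
```

A spread near 0.2 would indicate quasi-periodic recurrence; values near 0.75, as found for the Middle America Trench, indicate strongly aperiodic behaviour.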
Post earthquake recovery in natural gas systems--1971 San Fernando Earthquake
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, W.T. Jr.
1983-01-01
This paper presents a concise summary of the post-earthquake investigations for the 1971 San Fernando Earthquake. The effects of the earthquake on buildings and other above-ground structures are briefly discussed, and the damage and subsequent repairs in the natural gas systems are then reported.
Pharmaceutical and biomedical applications of surface engineered carbon nanotubes.
Mehra, Neelesh Kumar; Jain, Keerti; Jain, Narendra Kumar
2015-06-01
Surface-engineered carbon nanotubes (CNTs) are attracting the attention of scientists owing to their wide-ranging biomedical and pharmaceutical applications. The focus of this review is to highlight the important role of surface-engineered CNTs in the highly challenging but rewarding area of nanotechnology. The major strength of this review lies in collecting the exciting applications of CNTs, which are otherwise scattered across the literature, into a coherent account that can boost research efforts. Copyright © 2015 Elsevier Ltd. All rights reserved.
Short- and Long-Term Earthquake Forecasts Based on Statistical Models
NASA Astrophysics Data System (ADS)
Console, Rodolfo; Taroni, Matteo; Murru, Maura; Falcone, Giuseppe; Marzocchi, Warner
2017-04-01
The epidemic-type aftershock sequence (ETAS) models have been used experimentally to forecast the space-time earthquake occurrence rate during the sequence that followed the 2009 L'Aquila earthquake and for the 2012 Emilia earthquake sequence. These forecasts represented the first two pioneering attempts to check the feasibility of providing operational earthquake forecasting (OEF) in Italy. After the 2009 L'Aquila earthquake the Italian Department of Civil Protection nominated an International Commission on Earthquake Forecasting (ICEF) for the development of the first official OEF in Italy, which was implemented for testing purposes by the newly established "Centro di Pericolosità Sismica" (CPS, the Seismic Hazard Center) at the Istituto Nazionale di Geofisica e Vulcanologia (INGV). According to the ICEF guidelines, the system is open, transparent, reproducible and testable. The scientific information delivered by OEF-Italy is shaped in different formats according to the interested stakeholders, such as scientists, national and regional authorities, and the general public. Communication to the public is certainly the most challenging issue, and careful pilot tests are necessary to check the effectiveness of the communication strategy before opening the information to the public. With regard to long-term time-dependent earthquake forecasting, the application of a newly developed simulation algorithm to the Calabria region provided typical features in the time, space and magnitude behaviour of the seismicity, which can be compared with those of the real observations. These features include long-term pseudo-periodicity and clustering of strong earthquakes, and a realistic earthquake magnitude distribution departing from the Gutenberg-Richter distribution in the moderate and higher magnitude range.
The Active Fault Parameters for Time-Dependent Earthquake Hazard Assessment in Taiwan
NASA Astrophysics Data System (ADS)
Lee, Y.; Cheng, C.; Lin, P.; Shao, K.; Wu, Y.; Shih, C.
2011-12-01
faults in Taiwan. By completing the active fault parameter table for Taiwan, we will apply it to time-dependent earthquake hazard assessment. The results can also give engineers a reference for design. Furthermore, they can be applied to seismic hazard maps to mitigate disasters.
Earthquakes; July-August, 1978
Person, W.J.
1979-01-01
Earthquake activity during this period was about normal. Deaths from earthquakes were reported from Greece and Guatemala. Three major earthquakes (magnitude 7.0-7.9) occurred in Taiwan, Chile, and Costa Rica. In the United States, the most significant earthquake was a magnitude 5.6 on August 13 in southern California.
Improvements of the offshore earthquake locations in the Earthquake Early Warning System
NASA Astrophysics Data System (ADS)
Chen, Ta-Yi; Hsu, Hsin-Chih
2017-04-01
Since 2014, the Earthworm-Based Earthquake Alarm Reporting (eBEAR) system has been in operation and used to issue warnings to schools. In 2015 the system began providing warnings to the public in Taiwan via television and cell phones. Online performance of the eBEAR system indicates that the average reporting times afforded by the system are approximately 15 and 28 s for inland and offshore earthquakes, respectively. On average, the eBEAR system can provide more warning time than the current EEW system (3.2 s and 5.5 s for inland and offshore earthquakes, respectively). However, offshore earthquakes were usually located poorly because only P-wave arrivals are used in the eBEAR system. Additionally, in the early stage of an alert only a few stations are available; this poor station coverage may explain why offshore earthquakes are difficult to locate accurately. Geiger's inversion procedure for earthquake location requires an initial hypocenter and origin time. For the initial hypocenter, we defined test locations in the offshore area instead of using the average location of the triggered stations. We ran 20 instances of Geiger's method concurrently, each with a different pre-defined initial position. We assume that if an instance's pre-defined initial position is close to the true earthquake location, its processing time during the iteration procedure should be less than that of the others. The results show that using pre-defined trial hypocenters in the inversion procedure improves the location accuracy for offshore earthquakes. This matters especially for EEW: in the initial stage of an alert, locating earthquakes with only three or five stations may give poor results because of poor station coverage. In this study, the pre-defined trial locations provide a feasible way to improve the estimations of
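Geiger's method is iterative linearized least squares: from a trial hypocenter and origin time, residuals between observed and predicted arrival times are regressed on the travel-time partial derivatives, and the model is updated until convergence. A self-contained 2-D sketch with a homogeneous velocity (an assumption for illustration; an operational system would use a layered velocity model and real P-wave picks):

```python
import math

def solve3(a, b):
    """Solve a 3x3 linear system by Gaussian elimination with partial pivoting."""
    aug = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(aug[r][col]))
        aug[col], aug[piv] = aug[piv], aug[col]
        for r in range(col + 1, 3):
            f = aug[r][col] / aug[col][col]
            for c in range(col, 4):
                aug[r][c] -= f * aug[col][c]
    x = [0.0, 0.0, 0.0]
    for i in range(2, -1, -1):
        x[i] = (aug[i][3] - sum(aug[i][j] * x[j] for j in range(i + 1, 3))) / aug[i][i]
    return x

def geiger_locate(stations, arrivals, v, trial, iters=50):
    """Geiger's method in 2-D: Gauss-Newton inversion for epicenter (x, y)
    and origin time t0, assuming a homogeneous velocity v (km/s), starting
    from a pre-defined trial (x, y, t0) as in the abstract's strategy of
    running many trials and keeping the best-behaved one."""
    x, y, t0 = trial
    for _ in range(iters):
        ata = [[0.0] * 3 for _ in range(3)]
        atb = [0.0] * 3
        for (sx, sy), t_obs in zip(stations, arrivals):
            dx, dy = sx - x, sy - y
            r = math.hypot(dx, dy)
            res = t_obs - (t0 + r / v)                 # arrival-time residual
            row = (-dx / (v * r), -dy / (v * r), 1.0)  # d(pred)/d(x, y, t0)
            for i in range(3):                         # accumulate normal equations
                atb[i] += row[i] * res
                for j in range(3):
                    ata[i][j] += row[i] * row[j]
        m = solve3(ata, atb)
        x, y, t0 = x + m[0], y + m[1], t0 + m[2]
    return x, y, t0

# Synthetic example: 5 stations, true source at (10, 12) km, t0 = 2 s, v = 6 km/s,
# started from a deliberately poor trial hypocenter.
stations = [(0.0, 0.0), (30.0, 0.0), (0.0, 30.0), (30.0, 30.0), (15.0, 40.0)]
arrivals = [2.0 + math.hypot(sx - 10.0, sy - 12.0) / 6.0 for sx, sy in stations]
x, y, t0 = geiger_locate(stations, arrivals, 6.0, trial=(25.0, 25.0, 0.0))
```

With offshore events the stations all lie on one side of the source, so the normal-equation matrix is poorly conditioned; that is the geometric reason trial hypocenters placed in the offshore area help.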
Status of Public Earthquake Early Warning in the U.S
NASA Astrophysics Data System (ADS)
Given, D. D.
2013-12-01
Earthquake Early Warning (EEW) is a proven use of seismological science that can give people and businesses outside the epicentral area of a large earthquake up to a minute to take protective actions before the most destructive shaking hits them. Since 2006 several organizations have been collaborating to create such a system in the United States. These groups include the US Geological Survey, Caltech, UC Berkeley, the University of Washington, the Southern California Earthquake Center, the Swiss Federal Institute of Technology, Zürich, the California Office of Emergency Services, and the California Geological Survey. A demonstration version of the system, called ShakeAlert, began sending test notifications to selected users in California in January 2012. In August 2012 San Francisco's Bay Area Rapid Transit district began slowing and stopping trains in response to strong ground shaking. The next step in the project is to progress to a production prototype for the west coast. The system is built on top of the considerable technical and organizational earthquake monitoring infrastructure of the Advanced National Seismic System (ANSS). While a fully functional, robust, public EEW system will require significant new investment and development in several major areas, modest progress is being made with current resources. First, high-quality sensors must be installed with sufficient density, particularly near source faults. Where possible, we are upgrading and augmenting the existing ANSS networks on the west coast. Second, data telemetry from those sensors must be engineered for speed and reliability. Next, robust central processing infrastructure is being designed and built. Also, computer algorithms to detect and characterize the evolving earthquake must be further developed and tested. Last year the Gordon and Betty Moore Foundation funded USGS, Caltech, UCB and UW to accelerate R&D efforts. Every available means of distributing alerts must be used to ensure the
Megathrust earthquakes in Central Chile: What is next after the Maule 2010 earthquake?
NASA Astrophysics Data System (ADS)
Madariaga, R.
2013-05-01
The 27 February 2010 Maule earthquake occurred in a well-identified gap in the Chilean subduction zone. The event has now been studied in detail using far-field and near-field seismic and geodetic data; we review the information gathered so far. The event broke a region that was much longer along strike than the gap left by the 1835 Concepción earthquake, sometimes called the Darwin earthquake because Darwin was in the area when it occurred and made many observations. Recent studies of contemporary documents by Udias et al. indicate that the area broken by the Maule earthquake in 2010 had previously been broken by a similar earthquake in 1751; several events in the magnitude-8 range also occurred in the area, principally in 1835 as already mentioned and, more recently, on 1 December 1928 to the north and on 21 May 1960 (1 1/2 days before the great Chilean earthquake of 1960). Currently the area of the 2010 earthquake and the region immediately to the north are undergoing a very large increase in seismicity, with numerous clusters that migrate along the plate interface. Examination of the seismicity of Chile in the 18th and 19th centuries shows that the region immediately to the north of the 2010 earthquake broke in a very large megathrust event in July 1730, the largest known earthquake in central Chile. The region where this event occurred has broken on many occasions in M 8-range earthquakes, in 1822, 1880, 1906, 1971, and 1985. Is it preparing for a new very large megathrust event? The 1906 earthquake of Mw 8.3 filled the central part of the gap, but that part has broken again on several occasions, in 1971, 1973, and 1985. The main question is whether the 1906 earthquake relieved enough stress from the 1730 rupture zone. Geodetic data show that most of the region that broke in 1730 is currently almost fully locked, from the northern end of the Maule earthquake at 34.5°S to 30°S, near the southern end of the Mw 8.5 Atacama earthquake of 11
Performance of San Fernando dams during 1994 Northridge earthquake
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bardet, J.P.; Davis, C.A.
1996-07-01
The 1994 Northridge and 1971 San Fernando Earthquakes subjected the Lower and Upper San Fernando Dams of the Van Norman Complex in the San Fernando Valley, Calif., to strong near-source ground motions. In 1994, these earth dams, which were out of service and retained only a few meters of water, extensively cracked and settled due to the liquefaction of their hydraulic fill. The Lower San Fernando Dam moved over 15 cm upstream as the hydraulic fill liquefied beneath its upstream slope. The Upper San Fernando Dam moved even more and deformed in a complicated three-dimensional pattern. The responses of the Lower and Upper San Fernando Dams during the 1994 Northridge Earthquake, although less significant than in 1971, provide the geotechnical engineering community with two useful case histories.
Application of Remote Sensing in Building Damages Assessment after Moderate and Strong Earthquake
NASA Astrophysics Data System (ADS)
Tian, Y.; Zhang, J.; Dou, A.
2003-04-01
Earthquakes are a major natural disaster in modern society, yet we still cannot accurately predict the time and place of their occurrence. It is therefore important to survey damage when an earthquake occurs, which helps mitigate losses and supports fast damage evaluation. In this paper we use remote sensing techniques for this purpose. Remotely sensed satellite images view a large area of land at a time, and several kinds of satellite images are available at different spatial and spectral resolutions: the Landsat-4/5 TM sensor views the ground at 30 m resolution, the Landsat-7 ETM+ panchromatic band has a resolution of 15 m, and the SPOT satellites provide images with higher resolutions. Images obtained pre- and post-earthquake can help greatly in identifying damage to moderate and large buildings. In this paper, we present a method for quick damage assessment by analyzing both pre- and post-earthquake satellite images. First, the images are geographically co-registered with low RMS (root mean square) error. Then, residential areas are clipped out by overlaying the images with existing vector layers in Geographic Information System (GIS) software. We present a new change detection algorithm to quantitatively identify the degree of damage. An empirical or semi-empirical model is then established by relating the true damage degree to changes in pixel values of the same ground objects. Experimental results show a good linear relationship between changes in pixel values and ground damage, which demonstrates the potential of remote sensing in post-quake fast damage assessment. Keywords: Damage Assessment, Earthquake Hazard, Remote Sensing
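The workflow the abstract outlines (co-register, difference, then map pixel change to damage through an empirical linear model) can be sketched as follows; the synthetic arrays and the coefficient `a` are hypothetical stand-ins for the paper's calibrated model.

```python
import numpy as np

# Hedged sketch of the paper's workflow: difference co-registered pre- and
# post-event images over residential areas, then map mean pixel change to a
# damage degree with an assumed (hypothetical) linear empirical model.

def change_index(pre, post, mask=None):
    """Mean absolute brightness change, optionally within a residential mask."""
    diff = np.abs(post.astype(float) - pre.astype(float))
    if mask is not None:
        diff = diff[mask]
    return diff.mean()

def damage_degree(index, a=0.02, b=0.0):
    """Assumed linear empirical model: damage = a*index + b, clipped to [0, 1]."""
    return float(np.clip(a * index + b, 0.0, 1.0))

# Synthetic 4x4 "images": one quadrant brightens by 40 DN after collapse.
pre = np.full((4, 4), 100.0)
post = pre.copy()
post[:2, :2] += 40.0            # damaged quadrant
idx = change_index(pre, post)   # (4 cells * 40) / 16 pixels = 10.0
print(idx, damage_degree(idx))
```

In practice the coefficients would be fitted to ground-truth damage surveys, as the abstract's empirical model describes.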
Permeability, storage and hydraulic diffusivity controlled by earthquakes
NASA Astrophysics Data System (ADS)
Brodsky, E. E.; Fulton, P. M.; Xue, L.
2016-12-01
Earthquakes can increase permeability in fractured rocks. In the far field, such permeability increases are attributed to seismic waves and can last for months after the initial earthquake. Laboratory studies suggest that unclogging of fractures by the transient flow driven by seismic waves is a viable mechanism. These dynamic permeability increases may contribute to permeability enhancement in the clouds of seismicity accompanying hydraulic fracturing. Permeability enhancement by seismic waves could potentially be engineered, and the experiments suggest the process will be most effective at a preferred frequency. We have recently observed similar processes inside active fault zones after major earthquakes. A borehole observatory in the fault that generated the M9.0 2011 Tohoku earthquake reveals a sequence of temperature pulses during the secondary aftershock sequence of an M7.3 aftershock. The pulses are attributed to fluid advection by flow through a zone of transiently increased permeability. Directly after the M7.3 earthquake, the newly damaged fault zone is highly susceptible to further permeability enhancement, but it ultimately heals within a month and is no longer as sensitive. The observation suggests that the newly damaged fault zone is more prone to fluid pulsing than would be expected from the long-term permeability structure. Even longer-term healing is seen inside the fault zone of the 2008 M7.9 Wenchuan earthquake. The competition between damage and healing (or clogging and unclogging) results in dynamically controlled permeability, storage, and hydraulic diffusivity. Recent measurements of in situ fault zone architecture at the 1-10 m scale suggest that active fault zones often have hydraulic diffusivities near 10⁻² m²/s. This uniformity holds even within the damage zone of the San Andreas fault, where permeability and storage increases balance each other to achieve this value of diffusivity over a 400 m wide region. We speculate that fault zones
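The balance the abstract describes, where permeability and storage change together so that hydraulic diffusivity stays near 10⁻² m²/s, follows directly from the definition D = K/Ss. A minimal sketch, with illustrative (not measured) values:

```python
# Hedged sketch: hydraulic diffusivity as the ratio of hydraulic conductivity
# to specific storage, D = K / Ss. If an earthquake raises conductivity and
# storage by the same factor, D is unchanged -- the balance the abstract
# describes for the San Andreas damage zone. Values below are illustrative.

def hydraulic_diffusivity(K, Ss):
    """D = K / Ss, with K in m/s and Ss in 1/m -> D in m^2/s."""
    return K / Ss

K, Ss = 1e-6, 1e-4                           # assumed pre-earthquake values
D0 = hydraulic_diffusivity(K, Ss)            # ~1e-2 m^2/s
D1 = hydraulic_diffusivity(K * 10, Ss * 10)  # both enhanced tenfold
print(D0, D1)                                # diffusivity is preserved
```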
GLobal Integrated Design Environment (GLIDE): A Concurrent Engineering Application
NASA Technical Reports Server (NTRS)
McGuire, Melissa L.; Kunkel, Matthew R.; Smith, David A.
2010-01-01
The GLobal Integrated Design Environment (GLIDE) is a client-server software application purpose-built to mitigate issues associated with real-time data sharing in concurrent engineering environments and to facilitate discipline-to-discipline interaction between multiple engineers and researchers. GLIDE is implemented in multiple programming languages and utilizes standardized web protocols to enable secure parameter data sharing between engineers and researchers across the Internet, in closed and/or widely distributed working environments. A well-defined, HyperText Transfer Protocol (HTTP)-based Application Programming Interface (API) to the GLIDE client/server environment enables users to interact with GLIDE, and with each other, from within common and familiar tools. One such common tool, Microsoft Excel (Microsoft Corporation), paired with its add-in API for GLIDE, is discussed in this paper. The top-level examples given demonstrate how this interface improves the efficiency of the design process of a concurrent engineering study while reducing the potential errors associated with manually sharing information between study participants.
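As a rough illustration of the pattern the paper describes (disciplines publishing and reading named study parameters over an HTTP API), here is a toy in-process sketch. It is not the actual GLIDE API; the endpoint layout and JSON payloads are entirely hypothetical.

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

# Hedged sketch (not the real GLIDE API): a toy HTTP parameter-sharing
# service. Engineers PUT and GET named study parameters over HTTP; the
# endpoint names and JSON shape are hypothetical.

PARAMS = {}

class ParameterHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        name = self.path.strip("/")
        body = json.dumps({"name": name, "value": PARAMS.get(name)}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(body)

    def do_PUT(self):
        name = self.path.strip("/")
        length = int(self.headers["Content-Length"])
        PARAMS[name] = json.loads(self.rfile.read(length))["value"]
        self.send_response(204)
        self.end_headers()

    def log_message(self, *args):  # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), ParameterHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
base = f"http://127.0.0.1:{server.server_port}"

# One discipline publishes a parameter; another reads it.
req = urllib.request.Request(f"{base}/dry_mass_kg", method="PUT",
                             data=json.dumps({"value": 1250.0}).encode())
urllib.request.urlopen(req)
with urllib.request.urlopen(f"{base}/dry_mass_kg") as resp:
    print(json.loads(resp.read())["value"])
server.shutdown()
```

The point is the design choice, not the code: a shared, well-defined HTTP endpoint lets any tool that can speak HTTP (Excel add-ins included) exchange parameters without manual copying.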
40 CFR 86.1851-01 - Application of good engineering judgment to manufacturers' decisions.
Code of Federal Regulations, 2010 CFR
2010-07-01
... 40 Protection of Environment 19 2010-07-01 2010-07-01 false Application of good engineering... Application of good engineering judgment to manufacturers' decisions. (a) The manufacturer shall exercise good engineering judgment in making all decisions called for under this subpart, including but not limited to...
40 CFR 86.1851-01 - Application of good engineering judgment to manufacturers' decisions.
Code of Federal Regulations, 2011 CFR
2011-07-01
... 40 Protection of Environment 19 2011-07-01 2011-07-01 false Application of good engineering... Application of good engineering judgment to manufacturers' decisions. (a) The manufacturer shall exercise good engineering judgment in making all decisions called for under this subpart, including but not limited to...
40 CFR 86.1851-01 - Application of good engineering judgment to manufacturers' decisions.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 20 2012-07-01 2012-07-01 false Application of good engineering... Application of good engineering judgment to manufacturers' decisions. (a) The manufacturer shall exercise good engineering judgment in making all decisions called for under this subpart, including but not limited to...
40 CFR 86.1851-01 - Application of good engineering judgment to manufacturers' decisions.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 20 2013-07-01 2013-07-01 false Application of good engineering... Application of good engineering judgment to manufacturers' decisions. (a) The manufacturer shall exercise good engineering judgment in making all decisions called for under this subpart, including but not limited to...
ESPACE - a geodetic Master's program for the education of Satellite Application Engineers
NASA Astrophysics Data System (ADS)
Hedman, K.; Kirschner, S.; Seitz, F.
2012-04-01
In the last decades there has been a rapid development of new geodetic and other Earth observation satellites. Applications of these satellites, such as car navigation systems, weather prediction, and digital maps (such as Google Earth or Google Maps), play an increasingly important role in our daily life. For the geosciences, satellite applications such as remote sensing and precise positioning/navigation have turned out to be extremely useful and are now indispensable. Today, researchers in geodesy, climatology, oceanography, and meteorology, as well as in Earth system science, all depend on up-to-date satellite data. Design, development, and handling of these missions require experts with knowledge not only in space engineering but also in the specific applications. This gives rise to a new kind of engineer: the satellite application engineer. The study program for these engineers combines parts of different classical disciplines such as geodesy, aerospace engineering, and electrical engineering. The satellite application engineering program Earth Oriented Space Science and Technology (ESPACE) was founded in 2005 at the Technische Universität München, mainly by institutions involved in geodesy and aerospace engineering. It is an international, interdisciplinary Master's program and is open to students with a BSc in either Science (e.g., Geodesy, Mathematics, Informatics, Geophysics) or Engineering (e.g., Aerospace, Electrical, and Mechanical Engineering). The program is conducted entirely in English. ESPACE benefits from and utilizes its location in Munich, with its unique concentration of expertise related to space science and technology. Teaching staff from 3 universities (Technische Universität München, Ludwig-Maximilian University, University of the Federal Armed Forces), research institutions (such as the German Aerospace Center, DLR, and the German Geodetic Research Institute, DGFI) and space industry (such as EADS or Kayser-Threde) are
Earthquake location in island arcs
Engdahl, E.R.; Dewey, J.W.; Fujita, K.
1982-01-01
-velocity lithospheric slab. In application, JHD has the practical advantage that it does not require the specification of a theoretical velocity model for the slab. Considering earthquakes within a 260 km long by 60 km wide section of the Aleutian main thrust zone, our results suggest that the theoretical velocity structure of the slab is presently not sufficiently well known that accurate locations can be obtained independently of locally recorded data. Using a locally recorded earthquake as a calibration event, JHD gave excellent results over the entire section of the main thrust zone here studied, without showing a strong effect that might be attributed to spatially varying source-station anomalies. We also calibrated the ray-tracing method using locally recorded data and obtained results generally similar to those obtained by JHD. © 1982.
Earthquake dating: an application of carbon-14 atom counting.
Tucker, A B; Woefli, W; Bonani, G; Suter, M
1983-03-18
Milligram-sized specimens of detrital charcoal from soil layers associated with prehistoric earthquakes on the Wasatch fault in Utah have been dated by direct atom counting of carbon-14 with a tandem Van de Graaff accelerator. The measured ratios of carbon-14 to carbon-12 correspond to ages of 7800, 8800, and 9000 years, with uncertainties of ±600 years.
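The conversion from a measured isotope ratio to an age can be sketched with the standard radiocarbon relation t = -8033 ln(R), where 8033 yr is the Libby mean life and R is the sample's ¹⁴C/¹²C ratio normalized to the modern standard; the ratio used below is illustrative, not a value from the paper.

```python
import math

# Hedged sketch: converting a measured 14C/12C ratio (relative to the modern
# standard) into a conventional radiocarbon age via t = -8033 * ln(R).

LIBBY_MEAN_LIFE_YR = 8033.0

def radiocarbon_age(ratio_to_modern):
    """Conventional 14C age in years from the normalized 14C/12C ratio."""
    return -LIBBY_MEAN_LIFE_YR * math.log(ratio_to_modern)

# A sample retaining ~33.4% of the modern ratio dates to roughly 8,800 yr,
# comparable to the Wasatch charcoal ages reported above.
print(round(radiocarbon_age(0.334)))
```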
Person, W.J.
1975-01-01
There were no major earthquakes (magnitude 7.0-7.9) in March or April; however, there were earthquake fatalities in Chile, Iran, and Venezuela, and approximately 35 earthquake-related injuries were reported around the world. In the United States a magnitude 6.0 earthquake struck the Idaho-Utah border region. Damage was estimated at about a million dollars. The shock was felt over a wide area and was the largest to hit the continental United States since the San Fernando earthquake of February 1971.
Assessment of Structural Resistance of Building 4862 to Earthquake and Tornado Forces [SEC 1 and 2]
DOE Office of Scientific and Technical Information (OSTI.GOV)
METCALF, I.L.
1999-12-06
This report presents the results of work done for the Hanford Engineering Laboratory under contract Y213-544-12662. LATA performed an assessment of Building 4862's resistance to earthquake and tornado forces.
Association of earthquakes and faults in the San Francisco Bay area using Bayesian inference
Wesson, R.L.; Bakun, W.H.; Perkins, D.M.
2003-01-01
Bayesian inference provides a method to use seismic intensity data or instrumental locations, together with geologic and seismologic data, to make quantitative estimates of the probabilities that specific past earthquakes are associated with specific faults. Probability density functions are constructed for the location of each earthquake, and these are combined with prior probabilities through Bayes' theorem to estimate the probability that an earthquake is associated with a specific fault. Results using this method are presented here for large, preinstrumental, historical earthquakes and for recent earthquakes with instrumental locations in the San Francisco Bay region. The probabilities for individual earthquakes can be summed to construct a probabilistic frequency-magnitude relationship for a fault segment. Other applications of the technique include the estimation of the probability of background earthquakes, that is, earthquakes not associated with known or considered faults, and the estimation of the fraction of the total seismic moment associated with earthquakes less than the characteristic magnitude. Results for the San Francisco Bay region suggest that potentially damaging earthquakes with magnitudes less than the characteristic magnitudes should be expected. Comparisons of earthquake locations and the surface traces of active faults as determined from geologic data show significant disparities, indicating that a complete understanding of the relationship between earthquakes and faults remains elusive.
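The core Bayesian update the abstract describes can be sketched as follows. The fault names, event-to-fault distances, priors, and the Gaussian form of the location likelihood are all hypothetical stand-ins for the paper's probability density functions.

```python
import math

# Hedged sketch of the Bayesian association idea: for one earthquake, combine
# a Gaussian location likelihood (distance from each candidate fault, with a
# location uncertainty sigma) with prior fault probabilities via Bayes'
# theorem. "Background" (no known fault) gets a flat likelihood.

def associate(distances_km, priors, sigma_km=5.0, background_like=0.3):
    """Posterior probability that an event belongs to each candidate fault."""
    likes = {f: math.exp(-0.5 * (d / sigma_km) ** 2)
             for f, d in distances_km.items()}
    likes["background"] = background_like
    unnorm = {f: like * priors[f] for f, like in likes.items()}
    total = sum(unnorm.values())
    return {f: p / total for f, p in unnorm.items()}

post = associate(
    distances_km={"Hayward": 2.0, "Calaveras": 9.0},  # hypothetical distances
    priors={"Hayward": 0.45, "Calaveras": 0.45, "background": 0.10},
)
print({f: round(p, 3) for f, p in post.items()})
```

Summing such posteriors over many events is what lets the method build a probabilistic frequency-magnitude relationship per fault segment, as the abstract notes.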
Protecting your family from earthquakes: The seven steps to earthquake safety
Developed by American Red Cross, Asian Pacific Fund
2007-01-01
This book is provided here because of the importance of preparing for earthquakes before they happen. Experts say it is very likely there will be a damaging San Francisco Bay Area earthquake in the next 30 years and that it will strike without warning. It may be hard to find the supplies and services we need after this earthquake. For example, hospitals may have more patients than they can treat, and grocery stores may be closed for weeks. You will need to provide for your family until help arrives. To keep our loved ones and our community safe, we must prepare now. Some of us come from places where earthquakes are also common. However, the dangers of earthquakes in our homelands may be very different from those in the Bay Area. For example, many people in Asian countries die in major earthquakes when buildings collapse or from big sea waves called tsunamis. In the Bay Area, the main danger is from objects inside buildings falling on people. Take action now to make sure your family will be safe in an earthquake. The first step is to read this book carefully and follow its advice. By making your home safer, you help make our community safer. Preparing for earthquakes is important, and together we can make sure our families and community are ready. English version p. 3-13; Chinese version p. 14-24; Vietnamese version p. 25-36; Korean version p. 37-48
Ion engine auxiliary propulsion applications and integration study
NASA Technical Reports Server (NTRS)
Zafran, S. (Editor)
1977-01-01
The benefits derived from application of the 8-cm mercury electron bombardment ion thruster were assessed. Two specific spacecraft missions were studied. A thruster was tested to provide additional needed information on its efflux characteristics and interactive effects. A User's Manual was then prepared describing how to integrate the thruster for auxiliary propulsion on geosynchronous satellites. By incorporating ion engines on an advanced communications mission, the weight available for added payload increases by about 82 kg (181 lb) for a 1000 kg (2200 lb) satellite which otherwise uses electrothermal hydrazine. Ion engines can be integrated into a high performance propulsion module that is compatible with the multimission modular spacecraft and can be used for both geosynchronous and low earth orbit applications. The low disturbance torques introduced by the ion engines permit accurate spacecraft pointing with the payload in operation during thrusting periods. The feasibility of using the thruster's neutralizer assembly for neutralization of differentially charged spacecraft surfaces at geosynchronous altitude was demonstrated during the testing program.
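The weight saving quoted above follows from the rocket equation: higher specific impulse means exponentially less propellant for the same station-keeping delta-v. A hedged sketch, with illustrative Isp and delta-v values rather than the study's figures:

```python
import math

# Hedged sketch: propellant mass from the rocket equation,
# m_p = m0 * (1 - exp(-dv / (g0 * Isp))). The Isp values and the 7-year
# north-south station-keeping delta-v are illustrative assumptions.

G0 = 9.81  # m/s^2

def propellant_kg(m0_kg, dv_ms, isp_s):
    return m0_kg * (1.0 - math.exp(-dv_ms / (G0 * isp_s)))

m0 = 1000.0          # spacecraft mass, kg
dv = 7 * 50.0        # ~50 m/s per year N-S station-keeping, over 7 years
hydrazine = propellant_kg(m0, dv, 300.0)   # electrothermal hydrazine class
ion = propellant_kg(m0, dv, 2600.0)        # ion thruster class
print(round(hydrazine, 1), round(ion, 1), round(hydrazine - ion, 1))
```

With these assumed numbers the saving lands in the same ballpark as the ~82 kg the study reports, which is the point of the comparison.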
Directivity in NGA earthquake ground motions: Analysis using isochrone theory
Spudich, P.; Chiou, B.S.J.
2008-01-01
We present correction factors that may be applied to the ground motion prediction relations of Abrahamson and Silva, Boore and Atkinson, Campbell and Bozorgnia, and Chiou and Youngs (all in this volume) to model the azimuthally varying distribution of the GMRotI50 component of ground motion (commonly called 'directivity') around earthquakes. Our correction factors may be used for planar or nonplanar faults having any dip or slip rake (faulting mechanism). Our correction factors predict directivity-induced variations of spectral acceleration that are roughly half of the strike-slip variations predicted by Somerville et al. (1997), and use of our factors reduces record-to-record sigma by about 2-20% at 5 sec or greater period. © 2008, Earthquake Engineering Research Institute.
Earthquake Triggering in the September 2017 Mexican Earthquake Sequence
NASA Astrophysics Data System (ADS)
Fielding, E. J.; Gombert, B.; Duputel, Z.; Huang, M. H.; Liang, C.; Bekaert, D. P.; Moore, A. W.; Liu, Z.; Ampuero, J. P.
2017-12-01
Southern Mexico was struck by four earthquakes with Mw > 6 and numerous smaller earthquakes in September 2017, starting with the 8 September Mw 8.2 Tehuantepec earthquake beneath the Gulf of Tehuantepec offshore Chiapas and Oaxaca. We study whether this M8.2 earthquake triggered the three subsequent large M>6 quakes in southern Mexico to improve understanding of earthquake interactions and time-dependent risk. All four large earthquakes were extensional despite the ongoing subduction of the Cocos plate. By the traditional definition, an event is likely an aftershock if it occurs within two rupture lengths of the main shock soon afterwards. Two Mw 6.1 earthquakes, one half an hour after the M8.2 beneath the Tehuantepec gulf and one on 23 September near Ixtepec in Oaxaca, both fit as traditional aftershocks, within 200 km of the main rupture. The 19 September Mw 7.1 Puebla earthquake was 600 km away from the M8.2 shock, outside the standard aftershock zone. Geodetic measurements from interferometric analysis of synthetic aperture radar (InSAR) and time-series analysis of GPS station data constrain finite fault total slip models for the M8.2, M7.1, and M6.1 Ixtepec earthquakes. The early M6.1 aftershock was too close in time and space to the M8.2 to measure with InSAR or GPS. We analyzed InSAR data from the Copernicus Sentinel-1A and -1B satellites and the JAXA ALOS-2 satellite. Our preliminary geodetic slip model for the M8.2 quake shows significant slip extending > 150 km NW from the hypocenter, longer than slip in the v1 finite-fault model (FFM) from teleseismic waveforms posted by G. Hayes at USGS NEIC. Our slip model for the M7.1 earthquake is similar to the v2 NEIC FFM. Interferograms for the M6.1 Ixtepec quake confirm the shallow depth in the upper-plate crust and show that the centroid is about 30 km SW of the NEIC epicenter, a significant NEIC location bias, but consistent with cluster relocations (E. Bergman, pers. comm.) and with the Mexican SSN location. Coulomb static stress
Wheeler, Russell L.
2014-01-01
Computation of probabilistic earthquake hazard requires an estimate of Mmax, the maximum earthquake magnitude thought to be possible within a specified geographic region. This report is Part A of an Open-File Report that describes the construction of a global catalog of moderate to large earthquakes, from which one can estimate Mmax for most of the Central and Eastern United States and adjacent Canada. The catalog and Mmax estimates derived from it were used in the 2014 edition of the U.S. Geological Survey national seismic-hazard maps. This Part A discusses prehistoric earthquakes that occurred in eastern North America, northwestern Europe, and Australia, whereas a separate Part B deals with historical events.
Visible Earthquakes: a web-based tool for visualizing and modeling InSAR earthquake data
NASA Astrophysics Data System (ADS)
Funning, G. J.; Cockett, R.
2012-12-01
InSAR (Interferometric Synthetic Aperture Radar) is a technique for measuring the deformation of the ground using satellite radar data. One of the principal applications of this method is in the study of earthquakes; in the past 20 years over 70 earthquakes have been studied in this way, and forthcoming satellite missions promise to enable the routine and timely study of events in the future. Despite the utility of the technique and its widespread adoption by the research community, InSAR does not feature in the teaching curricula of most university geoscience departments. This is, we believe, due to a lack of accessibility to software and data. Existing tools for the visualization and modeling of interferograms are often research-oriented, command line-based and/or prohibitively expensive. Here we present a new web-based interactive tool for comparing real InSAR data with simple elastic models. The overall design of this tool was focused on ease of access and use. This tool should allow interested nonspecialists to gain a feel for the use of such data and greatly facilitate integration of InSAR into upper division geoscience courses, giving students practice in comparing actual data to modeled results. The tool, provisionally named 'Visible Earthquakes', uses web-based technologies to instantly render the displacement field that would be observable using InSAR for a given fault location, geometry, orientation, and slip. The user can adjust these 'source parameters' using a simple, clickable interface, and see how these affect the resulting model interferogram. By visually matching the model interferogram to a real earthquake interferogram (processed separately and included in the web tool) a user can produce their own estimates of the earthquake's source parameters. Once satisfied with the fit of their models, users can submit their results and see how they compare with the distribution of all other contributed earthquake models, as well as the mean and median
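At its core, such a tool must render a modeled displacement field as interferogram fringes, where one fringe corresponds to half a radar wavelength of line-of-sight motion. A minimal sketch, using a Gaussian "bump" in place of a real elastic dislocation model:

```python
import numpy as np

# Hedged sketch of what a tool like 'Visible Earthquakes' must do at its core:
# wrap a modeled line-of-sight (LOS) displacement field into interferogram
# phase. The Gaussian "deformation bump" stands in for a real elastic
# dislocation model; the C-band wavelength is a typical value.

WAVELENGTH_M = 0.056  # e.g. C-band radar (~5.6 cm)

def wrapped_phase(u_los_m):
    """Two-way path: phase = (4*pi/lambda) * displacement, wrapped to (-pi, pi]."""
    phase = 4.0 * np.pi * u_los_m / WAVELENGTH_M
    return np.angle(np.exp(1j * phase))

# Toy displacement field: 20 cm peak LOS motion; each fringe = lambda/2.
x, y = np.meshgrid(np.linspace(-1, 1, 200), np.linspace(-1, 1, 200))
u_los = 0.20 * np.exp(-(x**2 + y**2) / 0.2)
igram = wrapped_phase(u_los)
n_fringes = 0.20 / (WAVELENGTH_M / 2)  # fringes between peak and far field
print(round(n_fringes, 1), igram.shape)
```

Visually matching the fringe count and pattern of such a model against a real interferogram is exactly the exercise the tool asks students to perform.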
The initial subevent of the 1994 Northridge, California, earthquake: Is earthquake size predictable?
Kilb, Debi; Gomberg, J.
1999-01-01
We examine the initial subevent (ISE) of the Mw 6.7, 1994 Northridge, California, earthquake in order to discriminate between two end-member rupture initiation models: the 'preslip' and 'cascade' models. Final earthquake size may be predictable from an ISE's seismic signature in the preslip model but not in the cascade model. In the cascade model ISEs are simply small earthquakes that can be described as purely dynamic ruptures. In this model a large earthquake is triggered by smaller earthquakes; there is no size scaling between triggering and triggered events and a variety of stress transfer mechanisms are possible. Alternatively, in the preslip model, a large earthquake nucleates as an aseismically slipping patch in which the patch dimension grows and scales with the earthquake's ultimate size; the byproduct of this loading process is the ISE. In this model, the duration of the ISE signal scales with the ultimate size of the earthquake, suggesting that nucleation and earthquake size are determined by a more predictable, measurable, and organized process. To distinguish between these two end-member models we use short period seismograms recorded by the Southern California Seismic Network. We address questions regarding the similarity in hypocenter locations and focal mechanisms of the ISE and the mainshock. We also compare the ISE's waveform characteristics to those of small earthquakes and to the beginnings of earthquakes with a range of magnitudes. We find that the focal mechanisms of the ISE and mainshock are indistinguishable, and both events may have nucleated on and ruptured the same fault plane. These results satisfy the requirements for both models and thus do not discriminate between them. However, further tests show the ISE's waveform characteristics are similar to those of typical small earthquakes in the vicinity and more importantly, do not scale with the mainshock magnitude. These results are more consistent with the cascade model.
Engineering growth factors for regenerative medicine applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mitchell, Aaron C.; Briquez, Priscilla S.; Hubbell, Jeffrey A.
Growth factors are important morphogenetic proteins that instruct cell behavior and guide tissue repair and renewal. Although their therapeutic potential holds great promise in regenerative medicine applications, translation of growth factors into clinical treatments has been hindered by limitations including poor protein stability, low recombinant expression yield, and suboptimal efficacy. This review highlights current tools, technologies, and approaches to design integrated and effective growth factor-based therapies for regenerative medicine applications. The first section describes rational and combinatorial protein engineering approaches that have been utilized to improve growth factor stability, expression yield, biodistribution, and serum half-life, or alter their cell trafficking behavior or receptor binding affinity. The second section highlights elegant biomaterial-based systems, inspired by the natural extracellular matrix milieu, that have been developed for effective spatial and temporal delivery of growth factors to cell surface receptors. Although appearing distinct, these two approaches are highly complementary and involve principles of molecular design and engineering to be considered in parallel when developing optimal materials for clinical applications.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shinozuka, M.; Rose, A.; Eguchi, R.T.
1998-12-31
This monograph examines the potential effects of a repeat of the New Madrid earthquake on the metropolitan Memphis area. The authors developed a case study of the impact of such an event on the electric power system and analyzed how this disruption would affect society. In nine chapters and 189 pages, the book traces the impacts of catastrophic earthquakes through a curtailment of utility lifeline services to the host regional economy and beyond. The monograph's chapters include: modeling the Memphis economy; seismic performance of electric power systems; spatial analysis techniques for linking physical damage to economic functions; earthquake vulnerability and emergency preparedness among businesses; direct economic impacts; regional economic impacts; socioeconomic and interregional impacts; lifeline risk reduction; and public policy formulation and implementation.
Source Model of Huge Subduction Earthquakes for Strong Ground Motion Prediction
NASA Astrophysics Data System (ADS)
Iwata, T.; Asano, K.
2012-12-01
Constructing source models of huge subduction earthquakes is a critically important issue for strong ground motion prediction. Irikura and Miyake (2001, 2011) proposed the characterized source model for strong ground motion prediction, which consists of several strong motion generation area (SMGA; Miyake et al., 2003) patches on the source fault. We obtained SMGA source models for many events using the empirical Green's function method and found that SMGA size follows an empirical scaling relationship with seismic moment. The SMGA size for an anticipated earthquake can therefore be estimated from that empirical relation, given the seismic moment. Concerning the positions of the SMGAs, information on fault segmentation is useful for inland crustal earthquakes. For the 1995 Kobe earthquake, three SMGA patches were obtained, one each on the Nojima, Suma, and Suwayama segments (e.g., Kamae and Irikura, 1998). For the 2011 Tohoku earthquake, Asano and Iwata (2012) estimated the SMGA source model and obtained four SMGA patches on the source fault. The total SMGA area follows the extension of the empirical scaling relationship between seismic moment and SMGA area for subduction plate-boundary earthquakes, which demonstrates the applicability of that relationship. Two of the SMGAs lie in the Miyagi-Oki segment and the other two in the Fukushima-Oki and Ibaraki-Oki segments, respectively. Asano and Iwata (2012) also pointed out that all the SMGAs correspond to the historical source areas of the 1930s. These SMGAs do not overlap the huge-slip area in the shallower part of the source fault estimated from teleseismic data, long-period strong motion data, and/or geodetic data during the 2011 mainshock. This fact shows that the huge-slip area does not contribute to strong ground motion generation (0.1-10 s). The information of the fault segment in the subduction zone, or
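The scaling step described above, from an assumed seismic moment to a total SMGA area via a self-similar empirical relation, can be sketched as follows; the coefficient is illustrative only, not the calibrated relation from the cited studies.

```python
# Hedged sketch of the scaling step: total SMGA area assumed to follow a
# self-similar relation A = c * M0^(2/3). The coefficient c below is
# illustrative (tuned only to give plausible orders of magnitude), not the
# calibrated relation of Irikura and Miyake (2001) or Miyake et al. (2003).

def moment_nm(mw):
    """Seismic moment M0 (N*m) from moment magnitude: log10(M0) = 1.5*Mw + 9.1."""
    return 10.0 ** (1.5 * mw + 9.1)

def smga_area_km2(m0_nm, coeff=2.3e-11):
    """Total SMGA area (km^2) from seismic moment, A = c * M0^(2/3)."""
    return coeff * m0_nm ** (2.0 / 3.0)

for mw in (7.0, 8.0, 9.0):
    print(f"Mw {mw}: ~{smga_area_km2(moment_nm(mw)):.0f} km^2")
```

Because A scales as M0^(2/3), the SMGA area grows by a factor of ten for each unit increase in Mw, which is the self-similarity the abstract's empirical relation expresses.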
Selected engineering properties and applications of EPS geofoam
NASA Astrophysics Data System (ADS)
Elragi, Ahmed Fouad
Expanded polystyrene (EPS) geofoam is a lightweight material that has been used in engineering applications since at least the 1950s. Its density is about one hundredth that of soil, and it offers good thermal insulation with stiffness and compressive strength comparable to medium clay. It is used to reduce settlement below embankments, to damp sound and vibration, to reduce lateral pressure on substructures, to reduce stresses on rigid buried conduits, and in related applications. This study begins with an overview of EPS geofoam. EPS manufacturing processes are described, followed by a review of engineering properties established in previous research. Standards and design manuals applicable to EPS are presented, and selected EPS geofoam engineering applications are discussed with examples. State-of-the-art experimental work is performed on EPS specimens of different sizes under different loading rates to better understand the behavior of the material. The effects of creep, sample size, strain rate, and cyclic loading on the stress-strain response are studied. Equations for the initial modulus and the compressive strength of the material at different strain rates are presented. The initial modulus and Poisson's ratio are discussed in detail, and the effect of sample size on creep behavior is examined. Three EPS projects are presented. The creep behavior of the largest EPS geofoam embankment fill is shown, and results from laboratory tests, mathematical modeling, and field records are compared. Field records of a geofoam-stabilized slope are compared with finite difference analysis results, and lateral stress reduction on an EPS backfill retaining structure is analyzed. The study ends with a discussion of two promising properties of EPS geofoam: its damping ability and its compressibility. Finite element analysis, finite difference analysis, and laboratory results are included in this discussion. The discussion with the
Identification of Deep Earthquakes
2010-09-01
discriminants that will reliably separate small, crustal earthquakes (magnitudes less than about 4 and depths less than about 40 to 50 km) from small ... characteristics on discrimination plots designed to separate nuclear explosions from crustal earthquakes. Thus, reliably flagging these small, deep events is ... Further, reliably identifying subcrustal earthquakes will allow us to eliminate deep events (previously misidentified as crustal earthquakes) from
Bao, Yuping; Wen, Tianlong; Samia, Anna Cristina S.; Khandhar, Amit; Krishnan, Kannan M.
2015-01-01
We present an interdisciplinary overview of material engineering and emerging applications of iron oxide nanoparticles. We discuss material engineering of nanoparticles in the broadest sense, emphasizing size and shape control, large-area self-assembly, composite/hybrid structures, and surface engineering. This is followed by a discussion of several non-traditional, emerging applications of iron oxide nanoparticles, including nanoparticle lithography, magnetic particle imaging, magnetic guided drug delivery, and positive contrast agents for magnetic resonance imaging. We conclude with a succinct discussion of the pharmacokinetics pathways of iron oxide nanoparticles in the human body –– an important and required practical consideration for any in vivo biomedical application, followed by a brief outlook of the field. PMID:26586919
The use of earthquake rate changes as a stress meter at Kilauea volcano.
Dieterich, J; Cayol, V; Okubo, P
2000-11-23
Stress changes in the Earth's crust are generally estimated from model calculations that use near-surface deformation as an observational constraint. But the widespread correlation of changes of earthquake activity with stress has led to suggestions that stress changes might be calculated from earthquake occurrence rates obtained from seismicity catalogues. Although this possibility has considerable appeal, because seismicity data are routinely collected and have good spatial and temporal resolution, the method has not yet proven successful, owing to the non-linearity of earthquake rate changes with respect to both stress and time. Here, however, we present two methods for inverting earthquake rate data to infer stress changes, using a formulation for the stress- and time-dependence of earthquake rates. Application of these methods at Kilauea volcano, in Hawaii, yields good agreement with independent estimates, indicating that earthquake rates can provide a practical remote-sensing stress meter.
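The stress- and time-dependent rate formulation the authors invert is in the spirit of Dieterich's rate-state relation, in which a sudden stress step multiplies the background seismicity rate by exp(d_tau / (A*sigma)). A minimal sketch under that assumption (parameter values and function names are illustrative, not the paper's implementation):

```python
import math

def rate_after_stress_step(r_background: float, d_tau: float, a_sigma: float) -> float:
    """Instantaneous earthquake rate just after a Coulomb stress step d_tau.

    r_background : background rate (events/yr)
    d_tau        : stress step (MPa)
    a_sigma      : constitutive parameter A*sigma (MPa)
    """
    return r_background * math.exp(d_tau / a_sigma)

def stress_from_rate_change(rate_ratio: float, a_sigma: float) -> float:
    """Invert the same relation: the 'stress meter' idea, from observed rate ratio."""
    return a_sigma * math.log(rate_ratio)
```

The second function is the inversion direction the abstract describes: observed rate changes become an estimate of the stress change.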
Effect of GNSS receiver carrier phase tracking loops on earthquake monitoring performance
NASA Astrophysics Data System (ADS)
Clare, Adam; Lin, Tao; Lachapelle, Gérard
2017-06-01
This research focuses on the performance of GNSS receiver carrier phase tracking loops for early earthquake monitoring systems. An earthquake was simulated using a hardware simulator and position, velocity and acceleration displacements were obtained to recreate the dynamics of the 2011 Tohoku earthquake. Using a software defined receiver, GSNRx, tracking bandwidths of 5, 10, 15, 20, 30, 40 and 50 Hz along with integration times of 1, 5 and 10 ms were tested. Using the phase lock indicator, an adaptive tracking loop was designed and tested to maximize performance for this application.
The 1985 central chile earthquake: a repeat of previous great earthquakes in the region?
Comte, D; Eisenberg, A; Lorca, E; Pardo, M; Ponce, L; Saragoni, R; Singh, S K; Suárez, G
1986-07-25
A great earthquake (surface-wave magnitude, 7.8) occurred along the coast of central Chile on 3 March 1985, causing heavy damage to coastal towns. Intense foreshock activity near the epicenter of the main shock occurred for 11 days before the earthquake. The aftershocks of the 1985 earthquake define a rupture area of 170 by 110 square kilometers. The earthquake was forecast on the basis of the nearly constant repeat time (83 +/- 9 years) of great earthquakes in this region. An analysis of previous earthquakes suggests that the rupture lengths of great shocks in the region vary by a factor of about 3. The nearly constant repeat time and variable rupture lengths cannot be reconciled with time- or slip-predictable models of earthquake recurrence. The great earthquakes in the region seem to involve a variable rupture mode and yet, for unknown reasons, remain periodic. Historical data suggest that the region south of the 1985 rupture zone should now be considered a gap of high seismic potential that may rupture in a great earthquake in the next few tens of years.
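As a rough illustration of how a nearly constant repeat time supports a forecast, one can treat the recurrence interval as normally distributed with the quoted mean and scatter. The normal model and the function below are assumptions of this illustration, not the authors' method:

```python
import math

def prob_event_by(t_since_last: float, mean: float = 83.0, sd: float = 9.0) -> float:
    """P(recurrence interval <= t_since_last) under a normal recurrence model."""
    z = (t_since_last - mean) / (sd * math.sqrt(2.0))
    return 0.5 * (1.0 + math.erf(z))
```

At the mean repeat time (83 yr since the previous event) this model gives a 50% chance that the interval has already been exceeded, rising steeply thereafter.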
Tsunami hazard assessments with consideration of uncertain earthquakes characteristics
NASA Astrophysics Data System (ADS)
Sepulveda, I.; Liu, P. L. F.; Grigoriu, M. D.; Pritchard, M. E.
2017-12-01
The uncertainty quantification of tsunami assessments with uncertain earthquake characteristics faces important challenges. First, the generated earthquake samples must be consistent with the properties observed in past events. Second, an uncertainty propagation method must be adopted that determines tsunami uncertainties at a feasible computational cost. In this study we propose a new methodology that improves on existing tsunami uncertainty assessment methods. The methodology considers two uncertain earthquake characteristics: the slip distribution and the location. First, it generates consistent earthquake slip samples by means of a Karhunen-Loeve (K-L) expansion and a translation process (Grigoriu, 2012), applicable to any non-rectangular rupture area and marginal probability distribution. The K-L expansion was recently applied by LeVeque et al. (2016). We extend that methodology by analyzing accuracy criteria in terms of the tsunami initial conditions. Furthermore, and unlike that reference, we preserve the original probability properties of the slip distribution by avoiding post-sampling treatments such as earthquake slip scaling. Our approach is analyzed and justified in the framework of the present study. Second, the methodology uses a stochastic reduced-order model (SROM) (Grigoriu, 2009) instead of a classic Monte Carlo simulation, which reduces the computational cost of the uncertainty propagation. The methodology is applied to a real case: we study tsunamis generated at the site of the 2014 Chilean earthquake, generating earthquake samples with expected magnitude Mw 8. We first demonstrate that our stochastic approach generates earthquake samples consistent with the target probability laws. We also show that the results obtained from the SROM are more accurate than classic Monte Carlo simulations. We finally validate the methodology by comparing the simulated tsunamis and the tsunami records for
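The K-L sampling step can be sketched as follows: eigendecompose a spatial covariance over subfault centers, then draw correlated slip realizations from the leading modes. The exponential covariance, correlation length, and mean slip here are illustrative choices, not the study's values:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 100.0, 50)          # subfault centers along strike (km)
corr_len, sigma, mean_slip = 20.0, 1.0, 2.0

# Exponential covariance over subfault separations (assumed form).
cov = sigma**2 * np.exp(-np.abs(x[:, None] - x[None, :]) / corr_len)
eigval, eigvec = np.linalg.eigh(cov)
eigval = np.clip(eigval, 0.0, None)       # guard against tiny negative eigenvalues

k = 10                                    # number of retained K-L modes
idx = np.argsort(eigval)[::-1][:k]        # leading modes
z = rng.standard_normal(k)                # independent standard normal weights
slip = mean_slip + eigvec[:, idx] @ (np.sqrt(eigval[idx]) * z)
```

A translation process (as in the abstract) would then map this Gaussian field through a marginal transform; that step is omitted here.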
Some Applications of Gröbner Bases in Robotics and Engineering
NASA Astrophysics Data System (ADS)
Abłamowicz, Rafał
Gröbner bases in polynomial rings have numerous applications in geometry, applied mathematics, and engineering. We show a few applications of Gröbner bases in robotics, formulated in the language of Clifford algebras, and in engineering to the theory of curves, including Fermat and Bézier cubics, and interpolation functions used in finite element theory.
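A minimal example of the kind of Gröbner-basis computation surveyed here, using SymPy (chosen for illustration): under lexicographic order, eliminating x from a circle/parabola system leaves a univariate polynomial in y, from which the intersection points can be back-substituted.

```python
from sympy import groebner, symbols

x, y = symbols('x y')
# Circle x^2 + y^2 = 1 intersected with parabola y = x^2.
G = groebner([x**2 + y**2 - 1, y - x**2], x, y, order='lex')
# The basis contains y**2 + y - 1, a polynomial in y alone: the elimination ideal.
```

This elimination mechanism is exactly what makes Gröbner bases useful for the inverse-kinematics and curve-theory applications the abstract mentions.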
Piatanesi, A.; Cirella, A.; Spudich, P.; Cocco, M.
2007-01-01
We present a two-stage nonlinear technique to invert strong-motion records and geodetic data to retrieve the rupture history of an earthquake on a finite fault. To account for actual rupture complexity, the fault parameters (peak slip velocity, slip direction, rupture time, and rise time) are spatially variable. The unknown parameters are given at the nodes of the subfaults, and the parameters within a subfault vary through bilinear interpolation of the nodal values. The forward modeling is performed with a discrete wavenumber technique whose Green's functions include the complete response of the vertically varying Earth structure. During the first stage, an algorithm based on heat-bath simulated annealing generates an ensemble of models that efficiently samples the good data-fitting regions of parameter space. In the second (appraisal) stage, the algorithm performs a statistical analysis of the model ensemble and computes a weighted mean model and its standard deviation. Rather than simply selecting the best model, this technique extracts the most stable features of the earthquake rupture that are consistent with the data and estimates the variability of each model parameter. We present synthetic tests showing the effectiveness of the method and its robustness to uncertainty in the adopted crustal model. Finally, we apply this inverse technique to the well-recorded 2000 western Tottori, Japan, earthquake (Mw 6.6); we confirm that the rupture process is characterized by large slip (3-4 m) at very shallow depths but, unlike previous studies, we image a new slip patch (2-2.5 m) located deeper, between 14 and 18 km depth. Copyright 2007 by the American Geophysical Union.
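The bilinear interpolation of nodal values within a subfault can be written in a few lines; the function below is a generic sketch of that parameterization, not the authors' code:

```python
def bilinear(v00: float, v10: float, v01: float, v11: float,
             s: float, t: float) -> float:
    """Interpolate a rupture parameter at fractional position (s, t) in
    [0,1]x[0,1] within a subfault, where vij is the value at corner (i, j)."""
    return ((1 - s) * (1 - t) * v00 + s * (1 - t) * v10
            + (1 - s) * t * v01 + s * t * v11)
```

At a corner the interpolant returns that node's value exactly, and at the subfault center it returns the average of the four nodal values, so the parameter field varies continuously across subfault boundaries that share nodes.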
NASA Astrophysics Data System (ADS)
Trugman, Daniel T.; Shearer, Peter M.
2017-04-01
Earthquake source spectra contain fundamental information about the dynamics of earthquake rupture. However, the inherent tradeoffs in separating source and path effects, when combined with limitations in recorded signal bandwidth, make it challenging to obtain reliable source spectral estimates for large earthquake data sets. We present here a stable and statistically robust spectral decomposition method that iteratively partitions the observed waveform spectra into source, receiver, and path terms. Unlike previous methods of its kind, our new approach provides formal uncertainty estimates and does not assume self-similar scaling in earthquake source properties. Its computational efficiency allows us to examine large data sets (tens of thousands of earthquakes) that would be impractical to analyze using standard empirical Green's function-based approaches. We apply the spectral decomposition technique to P wave spectra from five areas of active contemporary seismicity in Southern California: the Yuha Desert, the San Jacinto Fault, and the Big Bear, Landers, and Hector Mine regions of the Mojave Desert. We show that the source spectra are generally consistent with an increase in median Brune-type stress drop with seismic moment but that this observed deviation from self-similar scaling is both model dependent and varies in strength from region to region. We also present evidence for significant variations in median stress drop and stress drop variability on regional and local length scales. These results both contribute to our current understanding of earthquake source physics and have practical implications for the next generation of ground motion prediction assessments.
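A toy version of the iterative partitioning idea (source and receiver terms separated by alternating averaging of log spectral residuals) might look like the following. The real method also carries a path term, weighting, and formal uncertainty estimates; this sketch shows only the trade-off-constrained alternation:

```python
import numpy as np

def decompose(d, n_iter=50):
    """Separate d[i, j] = log spectral amplitude of event i at station j
    into source terms (per event) and receiver terms (per station).
    NaN entries (missing observations) are allowed."""
    n_ev, n_st = d.shape
    src, rec = np.zeros(n_ev), np.zeros(n_st)
    for _ in range(n_iter):
        src = np.nanmean(d - rec[None, :], axis=1)
        rec = np.nanmean(d - src[:, None], axis=0)
        rec -= np.nanmean(rec)   # fix the additive trade-off: receiver terms sum to zero
    return src, rec
```

The zero-mean constraint on the receiver terms resolves the inherent source/receiver trade-off the abstract refers to; any constant could be shuffled between the two sets of terms without changing the fit.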
The ShakeOut scenario: A hypothetical Mw7.8 earthquake on the Southern San Andreas Fault
Porter, K.; Jones, L.; Cox, D.; Goltz, J.; Hudnut, K.; Mileti, D.; Perry, S.; Ponti, D.; Reichle, M.; Rose, A.Z.; Scawthorn, C.R.; Seligson, H.A.; Shoaf, K.I.; Treiman, J.; Wein, A.
2011-01-01
In 2008, an earthquake-planning scenario document was released by the U.S. Geological Survey (USGS) and California Geological Survey that hypothesizes the occurrence and effects of a Mw7.8 earthquake on the southern San Andreas Fault. It was created by more than 300 scientists and engineers. Fault offsets reach 13 m and up to 8 m at lifeline crossings. Physics-based modeling was used to generate maps of shaking intensity, with peak ground velocities of 3 m/sec near the fault and exceeding 0.5 m/sec over 10,000 km2. A custom HAZUS-MH analysis and 18 special studies were performed to characterize the effects of the earthquake on the built environment. The scenario posits 1,800 deaths and 53,000 injuries requiring emergency room care. Approximately 1,600 fires are ignited, resulting in the destruction of 200 million square feet of the building stock, the equivalent of 133,000 single-family homes. Fire contributes $87 billion in property and business interruption loss, out of the total $191 billion in economic loss, with most of the rest coming from shake-related building and content damage ($46 billion) and business interruption loss from water outages ($24 billion). Emergency response activities are depicted in detail, in an innovative grid showing activities versus time, a new format introduced in this study. © 2011, Earthquake Engineering Research Institute.
Effect of water content on stability of landslides triggered by earthquakes
NASA Astrophysics Data System (ADS)
Beyabanaki, S.; Bagtzoglou, A. C.; Anagnostou, E. N.
2013-12-01
during rainfall is investigated. In this study, after different durations of rainfall, an earthquake is applied to the model, and the elapsed time at which the factor of safety (FS) drops below one is obtained by trial and error. The results for different initial water contents and earthquake acceleration coefficients (EACs) show that landslides can occur after shorter rainfall durations when the water content is greater; if the water content is high enough, the landslide occurs even without rainfall. References: [1] Ray RL, Jacobs JM, de Alba P. Impact of unsaturated zone soil moisture and groundwater table on slope instability. J. Geotech. Geoenviron. Eng., 2010, 136(10):1448-1458. [2] Das B. Principles of Foundation Engineering. Stanford, Cengage Learning, 2011. Fig. 1. Effect of initial water content on FS for different EACs
Boatwright, J.; Bundock, H.; Seekins, L.C.
2006-01-01
We derive and test relations between the Modified Mercalli Intensity (MMI) and the pseudo-acceleration response spectra at 1.0 and 0.3 s - SA(1.0 s) and SA(0.3 s) - in order to map response spectral ordinates for the 1906 San Francisco earthquake. Recent analyses of intensity have shown that MMI ≥ 6 correlates both with peak ground velocity and with response spectra for periods from 0.5 to 3.0 s. We use these recent results to derive a linear relation between MMI and log SA(1.0 s), and we refine this relation by comparing the SA(1.0 s) estimated from Boatwright and Bundock's (2005) MMI map for the 1906 earthquake to the SA(1.0 s) calculated from recordings of the 1989 Loma Prieta earthquake. South of San Jose, the intensity distributions for the 1906 and 1989 earthquakes are remarkably similar, despite the difference in magnitude and rupture extent between the two events. We use recent strong motion regressions to derive a relation between SA(1.0 s) and SA(0.3 s) for a M7.8 strike-slip earthquake that depends on soil type, acceleration level, and source distance. We test this relation by comparing SA(0.3 s) estimated for the 1906 earthquake to SA(0.3 s) calculated from recordings of both the 1989 Loma Prieta and 1994 Northridge earthquakes, as functions of distance from the fault. © 2006, Earthquake Engineering Research Institute.
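The paper's mapping rests on a linear relation between MMI and log SA(1.0 s). A sketch of such a relation and its inverse, with placeholder coefficients rather than the regression actually derived in the paper:

```python
import math

A, B = 3.0, 2.0  # illustrative intercept and slope, NOT the paper's values

def mmi_from_sa(sa_g: float) -> float:
    """MMI from the 1.0 s spectral ordinate (in g), MMI = A + B*log10(SA)."""
    return A + B * math.log10(sa_g)

def sa_from_mmi(mmi: float) -> float:
    """Invert the linear relation to map an intensity back to SA(1.0 s)."""
    return 10 ** ((mmi - A) / B)
```

The inverse direction is the one used for the 1906 event: observed intensities are converted to estimated spectral ordinates, which can then be compared with instrumental recordings of later earthquakes.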
Marano, K.D.; Wald, D.J.; Allen, T.I.
2010-01-01
This study presents a quantitative and geospatial description of global losses due to earthquake-induced secondary effects, including landslide, liquefaction, tsunami, and fire, for events during the past 40 years. These processes are of great importance to the US Geological Survey's (USGS) Prompt Assessment of Global Earthquakes for Response (PAGER) system, which is currently being developed to deliver rapid earthquake impact and loss assessments following large/significant global earthquakes. An important question is how dominant losses due to secondary effects are (under what conditions, and in which regions), and thus which of these effects should receive higher-priority research efforts in order to enhance PAGER's overall assessment of earthquake losses and alerting for the likelihood of secondary impacts. We find that while 21.5% of fatal earthquakes have deaths due to secondary (non-shaking) causes, only rarely are secondary effects the main cause of fatalities. The recent 2004 Great Sumatra-Andaman Islands earthquake is a notable exception, with extraordinary losses due to tsunami. The potential for secondary hazards varies greatly, and systematically, with regional geologic and geomorphic conditions. Based on our findings, we have built country-specific disclaimers for PAGER that address the potential for each hazard (Earle et al., Proceedings of the 14th World Conference on Earthquake Engineering, Beijing, China, 2008). We will now focus on ways to model casualties from secondary effects based on their relative importance as well as their general predictability. © Springer Science+Business Media B.V. 2009.
ENGINEERING APPLICATIONS OF ANALOG COMPUTERS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bryant, L.T.; Janicke, M.J.; Just, L.C.
1961-02-01
Six examples are given of the application of analog computers in the fields of reactor engineering, heat transfer, and dynamics: deceleration of a reactor control rod by dashpot, pressure variations through a packed bed, reactor kinetics over many decades with thermal feedback (simulation of a TREAT transient), a vibrating system with two degrees of freedom, temperature distribution in a radiating fin, and temperature distribution in an infinite slab with variable thermal properties. (D.L.C.)
Loss Estimations due to Earthquakes and Secondary Technological Hazards
NASA Astrophysics Data System (ADS)
Frolova, N.; Larionov, V.; Bonnin, J.
2009-04-01
Expected loss and damage assessment due to natural and technological disasters is of primary importance for emergency management just after a disaster, as well as for developing and implementing plans of preventive measures. The paper addresses procedures and simulation models for loss estimation due to strong earthquakes and secondary technological accidents. Mathematical models for shaking intensity distribution, damage to buildings and structures, debris volume, and the number of fatalities and injuries due to earthquakes and to technological accidents at fire- and chemical-hazardous facilities are considered, as used in geographical information systems designed for these purposes. Criteria for the occurrence of technological accidents are developed on the basis of engineering analysis of the consequences of past events. The paper provides the results of estimating the consequences of scenario earthquakes and of individual seismic risk assessment taking into account secondary technological hazards at regional and urban levels. Individual risk is understood as the probability of death (or injury) due to a possible hazardous event within one year in a given territory. It is determined through the mathematical expectation of social losses, taking into account the number of inhabitants of the considered settlement and the probability of natural and/or technological disaster.
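The individual-risk definition given above (annual death probability obtained from the mathematical expectation of social losses) reduces to a short computation. The event list and population below are illustrative numbers, not from the paper:

```python
def individual_risk(events, population: float) -> float:
    """Annual individual risk of death for an inhabitant of a territory.

    events     : iterable of (annual_probability, expected_fatalities) pairs,
                 one per hazardous scenario (earthquake or technological accident)
    population : number of inhabitants of the settlement
    """
    expected_deaths = sum(p * n for p, n in events)  # expectation of social losses
    return expected_deaths / population

# Toy example: a rare strong earthquake and a more frequent industrial accident.
risk = individual_risk([(0.001, 1000), (0.01, 100)], 1e5)
```

For the toy numbers, each scenario contributes one expected death per year, giving an individual risk of 2e-5 per year for a settlement of 100,000.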
Quantitative Earthquake Prediction on Global and Regional Scales
NASA Astrophysics Data System (ADS)
Kossobokov, Vladimir G.
2006-03-01
for mega-earthquakes of M9.0+. The monitoring at regional scales may require application of a recently proposed scheme for spatial stabilization of the intermediate-term middle-range predictions. The scheme guarantees a more objective and reliable diagnosis of times of increased probability and is less restrictive on input seismic data. It makes feasible the re-establishment of seismic monitoring aimed at prediction of large-magnitude earthquakes in the Caucasus and Central Asia, which, to our regret, was discontinued in 1991. The first results of that monitoring (1986-1990) were encouraging, at least for M6.5+.
NASA Astrophysics Data System (ADS)
Wang, Yan; Lu, Yi
2018-01-01
The objective of this study is to explore the relationship between earthquake exposure and the incidence of PTSD. A stratified random-sample survey was conducted to collect data in the Longmenshan thrust fault area three years after the Lushan earthquake, using the Children's Revised Impact of Event Scale (CRIES-13) and the Earthquake Experience Scale. Subjects included 3,944 student survivors from eleven local schools. The prevalence of probable PTSD was relatively higher among those who were trapped in the earthquake, were injured in it, or had relatives who died in it. The study concludes that researchers need to pay more attention to these children and adolescents, and that the government should give them more attention and economic support.
Testing for scale-invariance in extreme events, with application to earthquake occurrence
NASA Astrophysics Data System (ADS)
Main, I.; Naylor, M.; Greenhough, J.; Touati, S.; Bell, A.; McCloskey, J.
2009-04-01
We address the generic problem of testing for scale-invariance in extreme events: are the biggest events in a population simply a scaled version of those of smaller size, or are they in some way different? Are large earthquakes, for example, 'characteristic'? Do they 'know' how big they will be before the event nucleates, or is the size of the event determined only in the avalanche-like process of rupture? In either case, what are the implications for estimates of time-dependent seismic hazard? One way of testing for departures from scale invariance is to examine the frequency-size statistics, commonly used as a benchmark in a number of applications in the Earth and environmental sciences. Using frequency data, however, introduces a number of problems in data analysis. The inevitably small number of data points for extreme events, and more generally the non-Gaussian statistical properties, strongly affect the validity of prior assumptions about the nature of uncertainties in the data, and the simple use of traditional least squares (still common in the literature) introduces an inherent bias into the best-fit result. We show first that the sampled frequency in finite real and synthetic data sets (the latter based on the Epidemic-Type Aftershock Sequence model) converges to a central limit only very slowly, owing to temporal correlations in the data. A specific correction for temporal correlations enables an estimate of convergence properties to be mapped non-linearly onto a Gaussian one. Uncertainties closely follow a Poisson distribution of errors across the whole range of seismic moment for typical catalogue sizes; in this sense the confidence limits are scale-invariant. A systematic sample-bias effect due to counting whole numbers in a finite catalogue makes a 'characteristic'-looking extreme event distribution a likely outcome of an underlying scale-invariant probability distribution. This highlights the tendency of 'eyeball' fits unconsciously (but wrongly in
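One concrete alternative to the biased least-squares fitting criticized above is the maximum-likelihood (Aki) estimator of the Gutenberg-Richter b-value, shown here as a sketch assuming the catalogue is complete above a threshold magnitude m_c and finely binned:

```python
import math

def b_value_mle(mags, m_c: float) -> float:
    """Aki (1965) maximum-likelihood b-value for magnitudes >= m_c:
    b = log10(e) / (mean(m) - m_c)."""
    m = [x for x in mags if x >= m_c]
    mean_excess = sum(m) / len(m) - m_c
    return math.log10(math.e) / mean_excess
```

Unlike a least-squares fit to the cumulative frequency-magnitude curve, this estimator does not weight the sparse, correlated counts at the extreme-event end of the distribution.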
Crowdsourced earthquake early warning.
Minson, Sarah E; Brooks, Benjamin A; Glennie, Craig L; Murray, Jessica R; Langbein, John O; Owen, Susan E; Heaton, Thomas H; Iannucci, Robert A; Hauser, Darren L
2015-04-01
Earthquake early warning (EEW) can reduce harm to people and infrastructure from earthquakes and tsunamis, but it has not been implemented in most high earthquake-risk regions because of prohibitive cost. Common consumer devices such as smartphones contain low-cost versions of the sensors used in EEW. Although less accurate than scientific-grade instruments, these sensors are globally ubiquitous. Through controlled tests of consumer devices, simulation of an Mw (moment magnitude) 7 earthquake on California's Hayward fault, and real data from the Mw 9 Tohoku-oki earthquake, we demonstrate that EEW could be achieved via crowdsourcing.
Earthquakes, November-December 1992
Person, W.J.
1993-01-01
There were two major earthquakes (7.0 ≤ M < 8.0) during the last two months of the year: a magnitude 7.5 earthquake on December 12 in the Flores region, Indonesia, and a magnitude 7.0 earthquake on December 20 in the Banda Sea. Earthquakes caused fatalities in China and Indonesia; the greatest number of deaths for the year (2,500) occurred in Indonesia. In Switzerland, six people were killed by an accidental explosion recorded by seismographs. In the United States, a magnitude 5.3 earthquake caused slight damage at Big Bear in southern California.
NASA Astrophysics Data System (ADS)
Győri, Erzsébet; Gráczer, Zoltán; Tóth, László; Bán, Zoltán; Horváth, Tibor
2017-04-01
Liquefaction potential evaluations are generally made to assess the hazard from specific scenario earthquakes. These evaluations may estimate the potential in a binary fashion (yes/no), define a factor of safety, or predict the probability of liquefaction given a scenario event. Usually the level of ground shaking is obtained from the results of PSHA; although it is determined probabilistically, a single level of ground shaking is selected and used in the liquefaction potential evaluation. In contrast, fully probabilistic liquefaction potential assessment methods provide a complete picture of the liquefaction hazard by taking into account the joint probability distribution of PGA and magnitude over earthquake scenarios, both of which are key inputs to the stress-based simplified methods. Kramer and Mayfield (2007) developed a fully probabilistic liquefaction potential evaluation method within a performance-based earthquake engineering (PBEE) framework. Its results are a direct estimate of the return period of liquefaction and liquefaction hazard curves as a function of depth. The method combines the disaggregation matrices computed for different exceedance frequencies during probabilistic seismic hazard analysis with one of the recent models for the conditional probability of liquefaction. We have developed software for the assessment of performance-based liquefaction triggering on the basis of the Kramer and Mayfield method. Originally, the SPT-based probabilistic method of Cetin et al. (2004) was built into the Kramer and Mayfield procedure to compute the conditional probability, but there is no professional consensus about its applicability. We have therefore included not only Cetin's method but also the SPT-based procedure of Idriss and Boulanger (2012) and the CPT-based procedure of Boulanger and Idriss (2014) in our computer program. In 1956, a damaging earthquake of magnitude 5.6 occurred in Dunaharaszti, in Hungary. Its epicenter was located
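The Kramer-Mayfield combination step described above can be sketched as a sum over hazard disaggregation bins: the annual rate of liquefaction is the conditional probability of liquefaction in each (PGA, magnitude) bin times that bin's annual rate. `p_liq` below is a deliberately toy placeholder for a conditional-probability model such as Cetin et al. (2004), not that model itself:

```python
def liquefaction_rate(bins, p_liq) -> float:
    """Annual rate of liquefaction from PSHA disaggregation.

    bins  : iterable of (pga, magnitude, annual_rate) triples
    p_liq : callable (pga, magnitude) -> conditional probability in [0, 1]
    """
    return sum(rate * p_liq(pga, m) for pga, m, rate in bins)

# Toy disaggregation and toy conditional-probability model (illustrative only).
bins = [(0.1, 6.0, 0.01), (0.3, 7.0, 0.001)]
rate = liquefaction_rate(bins, lambda pga, m: min(1.0, pga * m / 3.0))
return_period = 1.0 / rate  # direct return-period estimate, as in the PBEE framework
```

Repeating this sum for the resistance at each depth yields the liquefaction hazard curves as a function of depth that the abstract describes.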
NASA Astrophysics Data System (ADS)
Cydzik, K.; Hamilton, D.; Stenner, H. D.; Cattarossi, A.; Shrestha, P. L.
2009-12-01
The May 12, 2008 M7.9 Wenchuan Earthquake in Sichuan Province, China killed almost 90,000 people and affected a population of over 45.5 million throughout western China. Shaking caused the destruction of five million buildings, many of them homes and schools, and damaged 21 million other structures, inflicting devastating impacts on communities. Landslides, a secondary effect of the shaking, caused much of the devastation. Debris flows buried schools and homes, rock falls crushed cars, and rockslides, landslides, and rock avalanches blocked streams and rivers, creating massive, unstable landslide dams that formed “quake lakes” upstream of the blockages. Impassable roads made emergency access slow and extremely difficult. Collapses of buildings and structures large and small took the lives of many. Damage to infrastructure impaired communication, cut off water supplies and electricity, and put authorities on high alert as the integrity of large engineered dams was reviewed. During our field reconnaissance three months after the disaster, evidence of the extent of the tragedy was undeniably apparent. Observing the damage throughout Sichuan reminded us that earthquakes in the United States and throughout the world routinely cause widespread damage and destruction to lives, property, and infrastructure. The focus of this poster is to present observations and findings based on our field reconnaissance regarding the scale of earthquake destruction with respect to slope failures, landslide dams, damage to infrastructure (e.g., schools, engineered dams, buildings, roads, rail lines, and water resources facilities), human habitation within the region, and the mitigation and response effort to this catastrophe. This is presented in the context of the policy measures that could be developed to reduce risks of similar catastrophes. The rapid response of the Chinese government and the mobilization of the Chinese People’s Liberation Army to help the communities affected
ERIC Educational Resources Information Center
Haddad, David Elias
2014-01-01
Earth's topographic surface forms an interface across which the geodynamic and geomorphic engines interact. This interaction is best observed along crustal margins where topography is created by active faulting and sculpted by geomorphic processes. Crustal deformation manifests as earthquakes at centennial to millennial timescales. Given that…
Supercritical and Transcritical Real-Fluid Mixing in Diesel Engine Applications
2015-09-01
ARL-RP-0551 ● SEP 2015 US Army Research Laboratory Supercritical and Transcritical Real-Fluid Mixing in Diesel Engine...ARL-RP-0551 ● SEP 2015 US Army Research Laboratory Supercritical and Transcritical Real-Fluid Mixing in Diesel Engine Applications by...COVERED (From - To) 1 January 2014–30 September 2014 4. TITLE AND SUBTITLE Supercritical and Transcritical Real-Fluid Mixing in Diesel Engine
NASA Astrophysics Data System (ADS)
Sun, Y.; Luo, G.
2017-12-01
Seismicity in a region is usually characterized by earthquake clusters and earthquake migration along its major fault zones. However, we do not fully understand why and how earthquake clusters and spatio-temporal migration of earthquakes occur. The northeastern Tibetan Plateau is a good example for investigating these problems. In this study, we construct and use a three-dimensional viscoelastoplastic finite-element model to simulate earthquake cycles and spatio-temporal migration of earthquakes along major fault zones in the northeastern Tibetan Plateau. We calculate stress evolution and fault interactions, and explore the effects of topographic loading and of the viscosity of the middle-lower crust and upper mantle on model results. Model results show that earthquakes and fault interactions increase Coulomb stress on neighboring faults or segments, accelerating future earthquakes in this region. Thus, earthquakes occur sequentially within a short time, leading to regional earthquake clusters. Through long-term evolution, stresses on some seismogenic faults that are far apart may almost simultaneously reach the critical state of fault failure, probably also leading to regional earthquake clusters and earthquake migration. Based on our model's synthetic seismic catalog and paleoseismic data, we analyze the probability of earthquake migration between major faults in the northeastern Tibetan Plateau. We find that following the 1920 M 8.5 Haiyuan earthquake and the 1927 M 8.0 Gulang earthquake, the next big event (M≥7) in the northeastern Tibetan Plateau would be most likely to occur on the Haiyuan fault.
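The fault-interaction mechanism described in the abstract above rests on the static Coulomb failure criterion, in which a stress change brings a receiver fault closer to or further from failure. A minimal sketch follows; the effective friction coefficient and the stress values are illustrative assumptions, not values from the study.

```python
def coulomb_stress_change(d_shear, d_normal, mu_eff=0.4):
    """Static Coulomb failure stress change on a receiver fault.

    d_shear  : shear stress change resolved in the slip direction (MPa)
    d_normal : normal stress change, positive = unclamping (MPa)
    mu_eff   : effective friction coefficient (illustrative assumption)
    """
    return d_shear + mu_eff * d_normal

# A positive change loads the neighboring fault, advancing its next
# earthquake, consistent with the clustering mechanism modeled above.
dcfs = coulomb_stress_change(d_shear=0.05, d_normal=0.02)  # MPa
```

A full model like the one in the abstract resolves these changes on 3-D fault geometries with viscoelastic relaxation; this sketch only shows the per-receiver arithmetic.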
Chapter F. The Loma Prieta, California, Earthquake of October 17, 1989 - Marina District
O'Rourke, Thomas D.
1992-01-01
During the earthquake, a total land area of about 4,300 km2 was shaken with seismic intensities that can cause significant damage to structures. The area of the Marina District of San Francisco is only 4.0 km2--less than 0.1 percent of the area most strongly affected by the earthquake--but its significance with respect to engineering, seismology, and planning far outstrips its proportion of shaken terrain and makes it a centerpiece for lessons learned from the earthquake. The Marina District provides perhaps the most comprehensive case history of seismic effects at a specific site developed for any earthquake. The reports assembled in this chapter, which provide an account of these seismic effects, constitute a unique collection of studies on site, as well as infrastructure and societal, response that cover virtually all aspects of the earthquake, ranging from incoming ground waves to the outgoing airwaves used for emergency communication. The Marina District encompasses the area bounded by San Francisco Bay on the north, the Presidio on the west, and Lombard Street and Van Ness Avenue on the south and east, respectively. Nearly all of the earthquake damage in the Marina District, however, occurred within a considerably smaller area of about 0.75 km2, bounded by San Francisco Bay and Baker, Chestnut, and Buchanan Streets. At least five major aspects of earthquake response in the Marina District are covered by the reports in this chapter: (1) dynamic site response, (2) soil liquefaction, (3) lifeline performance, (4) building performance, and (5) emergency services.
Prototype operational earthquake prediction system
Spall, Henry
1986-01-01
An objective of the U.S. Earthquake Hazards Reduction Act of 1977 is to introduce, into all regions of the country that are subject to large and moderate earthquakes, systems for predicting earthquakes and assessing earthquake risk. In 1985, the USGS developed for the Secretary of the Interior a program for implementation of a prototype operational earthquake prediction system in southern California.
Perception of earthquake risk in Taiwan: effects of gender and past earthquake experience.
Kung, Yi-Wen; Chen, Sue-Huei
2012-09-01
This study explored how individuals in Taiwan perceive the risk of earthquakes and how past earthquake experience and gender relate to risk perception. Participants (n = 1,405), including earthquake survivors and members of the general population without prior direct earthquake exposure, were selected and interviewed through a computer-assisted telephone interviewing procedure using a random sampling and stratification method covering all 24 regions of Taiwan. A factor analysis of the interview data yielded a two-factor structure of earthquake risk perception. The first factor, "personal impact," encompassed perception of threat and fear related to earthquakes. The second factor, "controllability," encompassed a sense of efficacy of self-protection in regard to earthquakes. The findings indicated that earthquake survivors and females reported higher scores on the personal impact factor than males and those with no prior direct earthquake experience, although there were no group differences on the controllability factor. The findings support the view that risk perception has multiple components, and suggest that past experience (survivor status) and gender (female) affect the perception of risk. Exploration of the potential contributions of other demographic factors such as age, education, and marital status to personal impact, especially for females and survivors, is discussed. Future research and intervention programs with regard to risk perception are suggested accordingly. © 2012 Society for Risk Analysis.
Kirby, Stephen; Scholl, David; von Huene, Roland E.; Wells, Ray
2013-01-01
Tsunami modeling has shown that tsunami sources located along the Alaska Peninsula segment of the Aleutian-Alaska subduction zone have the greatest impacts on southern California shorelines by raising the highest tsunami waves for a given source seismic moment. The most probable sector for a Mw ~ 9 source within this subduction segment is between Kodiak Island and the Shumagin Islands in what we call the Semidi subduction sector; these bounds represent the southwestern limit of the 1964 Mw 9.2 Alaska earthquake rupture and the northeastern edge of the Shumagin sector that recent Global Positioning System (GPS) observations indicate is currently creeping. Geological and geophysical features in the Semidi sector that are thought to be relevant to the potential for large magnitude, long-rupture-runout interplate thrust earthquakes are remarkably similar to those in northeastern Japan, where the destructive Mw 9.1 tsunamigenic earthquake of 11 March 2011 occurred. In this report we propose and justify the selection of a tsunami source seaward of the Alaska Peninsula for use in the Tsunami Scenario that is part of the U.S. Geological Survey (USGS) Science Application for Risk Reduction (SAFRR) Project. This tsunami source should have the potential to raise damaging tsunami waves on the California coast, especially at the ports of Los Angeles and Long Beach. Accordingly, we have summarized and abstracted slip distribution from the source literature on the 2011 event, the best characterized for any subduction earthquake, and applied this synoptic slip distribution to the similar megathrust geometry of the Semidi sector. The resulting slip model has an average slip of 18.6 m and a moment magnitude of Mw = 9.1. The 2011 Tohoku earthquake was not anticipated, despite Japan having the best seismic and geodetic networks in the world and the best historical record in the world over the past 1,500 years. What was lacking was adequate paleogeologic data on prehistoric earthquakes
Haeussler, Peter J.; Plafker, George
1995-01-01
Earthquake risk is high in much of the southern half of Alaska, but it is not the same everywhere. This map shows the overall geologic setting in Alaska that produces earthquakes. The Pacific plate (darker blue) is sliding northwestward past southeastern Alaska and then dives beneath the North American plate (light blue, green, and brown) in southern Alaska, the Alaska Peninsula, and the Aleutian Islands. Most earthquakes are produced where these two plates come into contact and slide past each other. Major earthquakes also occur throughout much of interior Alaska as a result of collision of a piece of crust with the southern margin.
Ergodicity in natural earthquake fault networks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tiampo, K. F.; Rundle, J. B.; Holliday, J.
2007-06-15
Numerical simulations have shown that certain driven nonlinear systems can be characterized by mean-field statistical properties often associated with ergodic dynamics [C. D. Ferguson, W. Klein, and J. B. Rundle, Phys. Rev. E 60, 1359 (1999); D. Egolf, Science 287, 101 (2000)]. These driven mean-field threshold systems feature long-range interactions and can be treated as equilibriumlike systems with statistically stationary dynamics over long time intervals. Recently the equilibrium property of ergodicity was identified in an earthquake fault system, a natural driven threshold system, by means of the Thirumalai-Mountain (TM) fluctuation metric developed in the study of diffusive systems [K. F. Tiampo, J. B. Rundle, W. Klein, J. S. Sa Martins, and C. D. Ferguson, Phys. Rev. Lett. 91, 238501 (2003)]. We analyze the seismicity of three naturally occurring earthquake fault networks from a variety of tectonic settings in an attempt to investigate the range of applicability of effective ergodicity, using the TM metric and other related statistics. Results suggest that, once variations in the catalog data resulting from technical and network issues are accounted for, all of these natural earthquake systems display stationary periods of metastable equilibrium and effective ergodicity that are disrupted by large events. We conclude that a constant rate of events is an important prerequisite for these periods of punctuated ergodicity and that, while the level of temporal variability in the spatial statistics is the controlling factor in the ergodic behavior of seismic networks, no single statistic is sufficient to ensure quantification of ergodicity. Ergodicity in this application not only requires that the system be stationary for these networks at the applicable spatial and temporal scales, but also implies that they are in a state of metastable equilibrium, one in which the ensemble averages can be substituted for temporal averages in studying their
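The Thirumalai-Mountain fluctuation metric cited in the abstract above has a compact form: the variance, across sites, of each site's running time-averaged activity; effective ergodicity shows up as an inverse metric growing roughly linearly in time. A minimal sketch on a synthetic stationary catalog follows; the network size and Poisson rates are assumptions for illustration, not catalog data.

```python
import numpy as np

def tm_metric(activity):
    """Thirumalai-Mountain fluctuation metric for a network of sites.

    activity : (n_sites, n_times) array of event counts per interval.
    Returns Omega(t), the variance across sites of the running time
    average of activity. For an effectively ergodic (stationary)
    system, Omega decays so that Omega(0)/Omega(t) grows ~linearly.
    """
    t = np.arange(1, activity.shape[1] + 1)
    running_mean = np.cumsum(activity, axis=1) / t   # per-site time average
    ensemble_mean = running_mean.mean(axis=0)        # average over sites
    return ((running_mean - ensemble_mean) ** 2).mean(axis=0)

# Synthetic stationary "catalog": 20 sites, 500 intervals, Poisson rates.
rng = np.random.default_rng(0)
counts = rng.poisson(lam=2.0, size=(20, 500))
omega = tm_metric(counts)
```

In the study itself, large events disrupt this decay, punctuating the ergodic periods; injecting a burst of activity into `counts` would show the metric jumping accordingly.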
NASA Astrophysics Data System (ADS)
Afifuddin, M.; Panjaitan, M. A. R.; Ayuna, D.
2017-02-01
Earthquakes are one of the most dangerous, destructive and unpredictable natural hazards, which can leave everything up to a few hundred kilometres away in complete destruction in seconds. Indonesia has a unique position as an earthquake-prone country. It is the place of interaction of three tectonic plates, namely the Indo-Australian, Eurasian and Pacific plates. Banda Aceh is one of the cities located in earthquake-prone areas. Due to the vulnerable conditions of Banda Aceh, some efforts have been exerted to reduce these unfavourable conditions. Many aspects have been addressed, from community awareness up to engineering solutions. One of them is that all buildings built in the city should be designed as earthquake-resistant buildings. The objectives of this research are to observe the response of a reinforced concrete structure to several types of earthquake load, and to assess the performance of the structure after earthquake loads are applied. After the tsunami in 2004, many buildings have been built; one of them is a hotel building located at Simpang Lima. The hotel is made of reinforced concrete, with a height of 34.95 meters and a total floor area of 8872.5 m2. So far this building is the tallest building in Banda Aceh.
LIDAR Investigation Of The 2004 Niigata Ken Chuetsu, Japan, Earthquake
NASA Astrophysics Data System (ADS)
Kayen, R.; Pack, R. T.; Sugimoto, S.; Tanaka, H.
2005-12-01
The 23 October 2004 Niigata Ken Chuetsu, Japan, Mw 6.6 earthquake was the most significant earthquake to affect Japan since the 1995 Kobe earthquake. Forty people were killed, almost 3,000 injured, and numerous landslides destroyed entire upland villages. Landslides and permanent ground deformation caused extensive damage to roads, rail lines and other lifelines, resulting in major economic disruption. The cities and towns most significantly affected by the earthquake were Nagaoka, Ojiya, and the mountainous rural areas of Yamakoshi village and Kawaguchi town. Our EERI team traveled with a tripod-mounted LIDAR (Light Detection and Ranging) unit, a scanning laser that creates ultra-high-resolution 3-D digital terrain models of earthquake-damaged surfaces: the ground, structures, and lifelines. This new technology allows for rapid and remote sensing of damaged terrain. Ground-based LIDAR has an accuracy range of 0.5-2.5 cm and can illuminate targets up to 400 m away from the sensor. During a single tripod-mounted LIDAR scan of 10 minutes, several million survey points are collected and processed into an ultra-high-resolution terrain model of the damaged ground or structure. There are several benefits in acquiring these LIDAR data in the initial reconnaissance effort after the earthquake. First, we record the detailed failure morphologies of damaged ground and structures in order to make measurements that are either impractical or impossible by conventional survey means. The digital terrain models allow us to enlarge, enhance and rotate data in order to visualize damage in orientations and scales not previously possible. This ability to visualize damage allows us to better understand failure modes. Finally, LIDAR allows us to archive 3-D terrain models so that the engineering community can evaluate analytical and numerical models of deformation potential against detailed field measurements. Here, we discuss the findings of this 2004 Niigata Chuetsu Earthquake (M6
NASA Astrophysics Data System (ADS)
Mourhatch, Ramses
the 30-year probability of occurrence of the San Andreas scenario earthquakes using the PEER performance based earthquake engineering framework to determine the probability of exceedance of these limit states over the next 30 years.
Earthquakes, November-December 1973
Person, W.J.
1974-01-01
Other parts of the world suffered fatalities and significant damage from earthquakes. In Iran, an earthquake killed one person, injured many, and destroyed a number of homes. Earthquake fatalities also occurred in the Azores and in Algeria.
Chu, Zhi-gang; Yang, Zhi-gang; Dong, Zhi-hui; Chen, Tian-wu; Zhu, Zhi-yu; Shao, Heng
2011-01-01
OBJECTIVE: The features of earthquake-related head injuries may be different from those of injuries obtained in daily life because of differences in circumstances. We aim to compare the features of head traumas caused by the Sichuan earthquake with those of other common head traumas using multidetector computed tomography. METHODS: In total, 221 patients with earthquake-related head traumas (the earthquake group) and 221 patients with other common head traumas (the non-earthquake group) were enrolled in our study, and their computed tomographic findings were compared. We focused on the differences between fractures and intracranial injuries and on the relationships between extracranial and intracranial injuries. RESULTS: More earthquake-related cases had only extracranial soft tissue injuries (50.7% vs. 26.2%, RR = 1.9), and fewer cases had intracranial injuries (17.2% vs. 50.7%, RR = 0.3) compared with the non-earthquake group. For patients with fractures and intracranial injuries, there were fewer cases with craniocerebral injuries in the earthquake group (60.6% vs. 77.9%, RR = 0.8), and the earthquake-injured patients had fewer fractures and intracranial injuries overall (1.5±0.9 vs. 2.5±1.8; 1.3±0.5 vs. 2.1±1.1). Compared with the non-earthquake group, the incidences of soft tissue injuries and cranial fractures combined with intracranial injuries in the earthquake group were significantly lower (9.8% vs. 43.7%, RR = 0.2; 35.1% vs. 82.2%, RR = 0.4). CONCLUSION: As depicted with computed tomography, the severity of earthquake-related head traumas in survivors was milder, and isolated extracranial injuries were more common in earthquake-related head traumas than in non-earthquake-related injuries, which may have been the result of different injury causes, mechanisms and settings. PMID:22012045
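The RR values quoted in the abstract above are relative risks: ratios of incidence proportions between the earthquake and non-earthquake groups. The one-liner below reproduces two of the reported figures from the stated percentages.

```python
def relative_risk(p_exposed, p_control):
    """Relative risk: ratio of incidence proportions between an
    exposed group (earthquake) and a control group (non-earthquake)."""
    return p_exposed / p_control

# Isolated extracranial soft tissue injuries: 50.7% vs. 26.2% -> RR = 1.9
rr_soft_tissue = round(relative_risk(0.507, 0.262), 1)
# Intracranial injuries: 17.2% vs. 50.7% -> RR = 0.3
rr_intracranial = round(relative_risk(0.172, 0.507), 1)
```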
[Flexible print circuit technology application in biomedical engineering].
Jiang, Lihua; Cao, Yi; Zheng, Xiaolin
2013-06-01
Flexible print circuit (FPC) technology has been widely applied in a variety of high-precision electric circuits due to its advantages, such as low cost, highly specific fabrication capability, and good flexibility. Recently, this technology has also been used in biomedical engineering, especially in the development of microfluidic chips and microelectrode arrays. The highly specific fabrication can help in making microelectrodes and other micro-structured equipment, and the good flexibility allows micro devices based on the FPC technique to be easily packaged with other parts. In addition, it also reduces the damage microelectrodes cause to tissue. In this paper, the application of FPC technology in biomedical engineering is introduced. Moreover, the important parameters of the FPC technique and the development trends of promising applications are also discussed.
Person, W.J.
1992-01-01
In the United States, a magnitude 5.8 earthquake in southern California on June 28 killed two people and caused considerable damage. Strong earthquakes hit Alaska on May 1 and May 30; the May 1 earthquake caused some minor damage.
Application of physics engines in virtual worlds
NASA Astrophysics Data System (ADS)
Norman, Mark; Taylor, Tim
2002-03-01
Dynamic virtual worlds potentially can provide a much richer and more enjoyable experience than static ones. To realize such worlds, three approaches are commonly used. The first of these, still widely applied, involves importing traditional animations from a modeling system such as 3D Studio Max; this approach is therefore limited to predefined animation scripts or combinations/blends thereof. The second approach involves the integration of some specific-purpose simulation code, such as car dynamics, and is thus generally limited to one (class of) application(s). The third approach involves the use of general-purpose physics engines, which promise to enable a range of compelling dynamic virtual worlds and to considerably speed up development. By far the largest market today for real-time simulation is computer games, with revenues exceeding those of the movie industry. Traditionally, the simulation is produced by game developers in-house for specific titles. However, off-the-shelf middleware physics engines are now available for use in games and related domains. In this paper, we report on our experiences of using middleware physics engines to create a virtual world as an interactive experience, and an advanced scenario in which artificial life techniques generate controllers for physically modeled characters.
Using hybrid expert system approaches for engineering applications
NASA Technical Reports Server (NTRS)
Allen, R. H.; Boarnet, M. G.; Culbert, C. J.; Savely, R. T.
1987-01-01
In this paper, the use of hybrid expert system shells and hybrid (i.e., algorithmic and heuristic) approaches for solving engineering problems is reported. Aspects of various engineering problem domains are reviewed for a number of examples with specific applications made to recently developed prototype expert systems. Based on this prototyping experience, critical evaluations of and comparisons between commercially available tools, and some research tools, in the United States and Australia, and their underlying problem-solving paradigms are made. Characteristics of the implementation tool and the engineering domain are compared and practical software engineering issues are discussed with respect to hybrid tools and approaches. Finally, guidelines are offered with the hope that expert system development will be less time consuming, more effective, and more cost-effective than it has been in the past.
Earthquake forecasting studies using radon time series data in Taiwan
NASA Astrophysics Data System (ADS)
Walia, Vivek; Kumar, Arvind; Fu, Ching-Chou; Lin, Shih-Jung; Chou, Kuang-Wu; Wen, Kuo-Liang; Chen, Cheng-Hong
2017-04-01
For a few decades, a growing number of studies have shown the usefulness of data in the field of seismogeochemistry, interpreted as geochemical precursory signals for impending earthquakes, and radon is identified as one of the most reliable geochemical precursors. Radon is recognized as a short-term precursor and is being monitored in many countries. This study is aimed at developing an effective earthquake forecasting system by inspecting long-term radon time series data. The data are obtained from a network of radon monitoring stations established along different faults of Taiwan. The continuous time series radon data for earthquake studies have been recorded, and some significant variations associated with strong earthquakes have been observed. The data are also examined to evaluate earthquake precursory signals against environmental factors. An automated real-time database operating system has been developed recently to improve the data processing for earthquake precursory studies. In addition, the study is aimed at the appraisal and filtration of these environmental parameters in order to create a real-time database that supports our earthquake precursory study. In recent years, an automatically operating real-time database has been developed using R, an open source programming language, to carry out statistical computation on the data. To integrate our data with our working procedure, we use the popular open source web application stack AMP (Apache, MySQL, and PHP), creating a website that can effectively display and help us manage the real-time database.
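The abstract above describes R-based statistical screening of radon time series for precursory variations. As a hedged stand-in (the authors' actual filtering against environmental factors is not specified here), a trailing moving-window threshold test can be sketched; the window length, threshold, and synthetic readings are all assumptions for illustration.

```python
import random
import statistics

def radon_anomalies(series, window=48, k=2.0):
    """Flag readings deviating more than k standard deviations from a
    trailing moving average -- a simple stand-in for precursory-signal
    screening (window and k are illustrative assumptions)."""
    flags = []
    for i in range(window, len(series)):
        base = series[i - window:i]
        mu = statistics.fmean(base)
        sigma = statistics.pstdev(base)
        if sigma > 0 and abs(series[i] - mu) > k * sigma:
            flags.append(i)
    return flags

# Synthetic hourly radon readings with one injected spike at index 150.
random.seed(1)
data = [random.gauss(20.0, 1.0) for _ in range(200)]
data[150] += 10.0
flags = radon_anomalies(data)
```

A production system, as the abstract notes, would run such screening continuously against a real-time database and discount variations explainable by rainfall, temperature, or other environmental drivers.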
Foil Bearing Starting Considerations and Requirements for Rotorcraft Engine Applications
NASA Technical Reports Server (NTRS)
Radil, Kevin C.; DellaCorte, Christopher
2009-01-01
Foil gas bearings under development for rotorcraft-sized, hot-core engine applications have been susceptible to damage from the slow acceleration rates typically encountered during the pre-ignition stage in conventional engines. Recent laboratory failures have been assumed to be directly linked to operating foil bearings below their lift-off speed while following conventional startup procedures for the engines. In each instance, the continuous sliding contact between the foils and shaft was believed to thermally overload the bearing and cause the engines to fail. These failures highlight the need to characterize required acceleration rates and minimum operating speeds for these applications. In this report, startup experiments were conducted with a large, rotorcraft-engine-sized foil bearing under moderate load and acceleration rates to identify the proper start procedures needed to avoid bearing failure. The results showed that a bearing under a 39.4 kPa static load can withstand a modest acceleration rate of 500 rpm/s and excessive loitering below the bearing lift-off speed, provided that an adequate solid lubricant is present.
Synthetic earthquake catalogs simulating seismic activity in the Corinth Gulf, Greece, fault system
NASA Astrophysics Data System (ADS)
Console, Rodolfo; Carluccio, Roberto; Papadimitriou, Eleftheria; Karakostas, Vassilis
2015-01-01
The characteristic earthquake hypothesis is the basis of time-dependent modeling of earthquake recurrence on major faults. However, the characteristic earthquake hypothesis is not strongly supported by observational data. Few fault segments have long historical or paleoseismic records of individually dated ruptures, and when data and parameter uncertainties are allowed for, the form of the recurrence distribution is difficult to establish. This is the case, for instance, of the Corinth Gulf Fault System (CGFS), for which documents about strong earthquakes exist for at least 2000 years, although they can be considered complete for M ≥ 6.0 only for the last 300 years, during which only a few characteristic earthquakes are reported for individual fault segments. The use of a physics-based earthquake simulator has allowed the production of catalogs lasting 100,000 years and containing more than 500,000 events of magnitudes ≥ 4.0. The main features of our simulation algorithm are (1) an average slip rate released by earthquakes for every single segment in the investigated fault system, (2) heuristic procedures for rupture growth and stop, leading to a self-organized earthquake magnitude distribution, (3) the interaction between earthquake sources, and (4) the effect of minor earthquakes in redistributing stress. The application of our simulation algorithm to the CGFS has shown realistic features in the time, space, and magnitude behavior of the seismicity. These features include long-term periodicity of strong earthquakes, short-term clustering of both strong and smaller events, and a realistic earthquake magnitude distribution departing from the Gutenberg-Richter distribution in the higher-magnitude range.
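The simulated catalogs above are judged partly by how their magnitude distribution departs from the Gutenberg-Richter law at high magnitudes. A pure GR reference catalog, against which such departures can be measured, can be drawn by inverse-transform sampling; the b-value and minimum magnitude below are illustrative assumptions, not CGFS parameters.

```python
import numpy as np

def sample_gr_magnitudes(n, b=1.0, m_min=4.0, rng=None):
    """Draw n magnitudes from an (unbounded) Gutenberg-Richter law,
    N(>=M) ~ 10**(-b*M), by inverse-transform sampling. A physics-based
    simulator departs from this at high magnitudes; this sketch only
    reproduces the pure GR reference distribution."""
    if rng is None:
        rng = np.random.default_rng()
    u = rng.uniform(size=n)
    return m_min - np.log10(1.0 - u) / b

# A synthetic catalog comparable in size to the simulated ones above.
mags = sample_gr_magnitudes(500_000, b=1.0, m_min=4.0,
                            rng=np.random.default_rng(2))
# Recover the b-value with Aki's maximum-likelihood estimator.
b_hat = np.log10(np.e) / (mags.mean() - 4.0)
```

Comparing the high-magnitude tail of a simulator's catalog against such a reference sample is one way to quantify the departure the abstract reports.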
Lessons of L'Aquila for Operational Earthquake Forecasting
NASA Astrophysics Data System (ADS)
Jordan, T. H.
2012-12-01
The L'Aquila earthquake of 6 Apr 2009 (magnitude 6.3) killed 309 people and left tens of thousands homeless. The mainshock was preceded by a vigorous seismic sequence that prompted informal earthquake predictions and evacuations. In an attempt to calm the population, the Italian Department of Civil Protection (DPC) convened its Commission on the Forecasting and Prevention of Major Risk (MRC) in L'Aquila on 31 March 2009 and issued statements about the hazard that were widely received as an "anti-alarm"; i.e., a deterministic prediction that there would not be a major earthquake. On October 23, 2012, a court in L'Aquila convicted the vice-director of DPC and six scientists and engineers who attended the MRC meeting on charges of criminal manslaughter, and it sentenced each to six years in prison. A few weeks after the L'Aquila disaster, the Italian government convened an International Commission on Earthquake Forecasting for Civil Protection (ICEF) with the mandate to assess the status of short-term forecasting methods and to recommend how they should be used in civil protection. The ICEF, which I chaired, issued its findings and recommendations on 2 Oct 2009 and published its final report, "Operational Earthquake Forecasting: Status of Knowledge and Guidelines for Implementation," in Aug 2011 (www.annalsofgeophysics.eu/index.php/annals/article/view/5350). As defined by the Commission, operational earthquake forecasting (OEF) involves two key activities: the continual updating of authoritative information about the future occurrence of potentially damaging earthquakes, and the officially sanctioned dissemination of this information to enhance earthquake preparedness in threatened communities. Among the main lessons of L'Aquila is the need to separate the role of science advisors, whose job is to provide objective information about natural hazards, from that of civil decision-makers who must weigh the benefits of protective actions against the costs of false alarms
Earthquake likelihood model testing
Schorlemmer, D.; Gerstenberger, M.C.; Wiemer, S.; Jackson, D.D.; Rhoades, D.A.
2007-01-01
The Regional Earthquake Likelihood Models (RELM) project aims to produce and evaluate alternate models of earthquake potential (probability per unit volume, magnitude, and time) for California. Based on differing assumptions, these models are produced to test the validity of their assumptions and to explore which models should be incorporated in seismic hazard and risk evaluation. Tests based on physical and geological criteria are useful, but we focus on statistical methods using future earthquake catalog data only. We envision two evaluations: a test of consistency with observed data and a comparison of all pairs of models for relative consistency. Both tests are based on the likelihood method, and both are fully prospective (i.e., the models are not adjusted to fit the test data). To be tested, each model must assign a probability to any possible event within a specified region of space, time, and magnitude. For our tests the models must use a common format: earthquake rates in specified “bins” with location, magnitude, time, and focal mechanism limits. Seismology cannot yet deterministically predict individual earthquakes; however, it should seek the best possible models for forecasting earthquake occurrence. This paper describes the statistical rules of an experiment to examine and test earthquake forecasts. The primary purposes of the tests described below are to evaluate physical models for earthquakes, assure that source models used in seismic hazard and risk studies are consistent with earthquake data, and provide quantitative measures by which models can be assigned weights in a consensus model or be judged as suitable for particular regions. In this paper we develop a statistical method for testing earthquake likelihood models. A companion paper (Schorlemmer and Gerstenberger 2007, this issue) discusses the actual implementation of these tests in the framework of the RELM initiative. Statistical testing of hypotheses is a common task and a
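The bin-based likelihood scoring described in the abstract above, under the common Poisson assumption for per-bin counts, reduces to a short computation: each bin's forecast rate is scored against the observed count and the log-probabilities are summed. The toy rates and counts below are illustrative, not RELM data.

```python
import math

def poisson_joint_log_likelihood(forecast, observed):
    """Joint log-likelihood of an earthquake-rate forecast over bins.

    Each space-magnitude bin i carries a forecast rate lambda_i > 0;
    the observed count n_i is scored with the Poisson log-probability
    -lambda_i + n_i*log(lambda_i) - log(n_i!).
    """
    total = 0.0
    for lam, n in zip(forecast, observed):
        total += -lam + n * math.log(lam) - math.lgamma(n + 1)
    return total

# A forecast that matches the observed counts scores higher than a
# mismatched one, which is the basis of the consistency tests above.
obs = [1, 0, 2, 0]
good = poisson_joint_log_likelihood([1.0, 0.1, 2.0, 0.1], obs)
bad = poisson_joint_log_likelihood([0.1, 1.0, 0.1, 2.0], obs)
```

The RELM tests then compare such joint likelihoods against their distribution under simulated catalogs (the N-, L-, and R-tests); this sketch shows only the per-catalog score.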
76 FR 37085 - Applications for New Awards; Rehabilitation Engineering Research Centers (RERCs)
Federal Register 2010, 2011, 2012, 2013, 2014
2011-06-24
... DEPARTMENT OF EDUCATION Applications for New Awards; Rehabilitation Engineering Research Centers...)--Disability and Rehabilitation Research Projects and Centers Program--Rehabilitation Engineering Research... (Rehabilitation Act). Rehabilitation Engineering Research Centers Program (RERCs) The purpose of the RERC program...
Connecting NASA science and engineering with earth science applications
USDA-ARS?s Scientific Manuscript database
The National Research Council (NRC) recently highlighted the dual role of NASA to support both science and applications in planning Earth observations. This Editorial reports the efforts of the NASA Soil Moisture Active Passive (SMAP) mission to integrate applications with science and engineering i...
Earthquake behavior of steel cushion-implemented reinforced concrete frames
NASA Astrophysics Data System (ADS)
Özkaynak, Hasan
2018-04-01
The earthquake performance of vulnerable structures can be increased by the implementation of supplementary energy-dissipative metallic elements. The main aim of this paper is to describe the earthquake behavior of steel cushion-implemented reinforced concrete frames (SCI-RCFR) in terms of displacement demands and energy components. Several quasi-static experiments were performed on steel cushions (SC) installed in reinforced concrete (RC) frames. The test results served as the basis of the analytical models of SCs and a bare reinforced concrete frame (B-RCFR). These models were integrated in order to obtain the resulting analytical model of the SCI-RCFR. Nonlinear time-history analyses (NTHA) were performed on the SCI-RCFR under the effects of the selected earthquake data set. According to the NTHA, SC application is an effective technique for increasing the seismic performance of RC structures. The main portion of the earthquake input energy was dissipated through the SCs, which succeeded in decreasing the plastic energy demand on structural elements by almost 50% at distinct drift levels.