Science.gov

Sample records for seismic vulnerability analysis

  1. Seismic vulnerability assessments in risk analysis

    NASA Astrophysics Data System (ADS)

    Frolova, Nina; Larionov, Valery; Bonnin, Jean; Ugarov, Alexander

    2013-04-01

    The assessment of seismic vulnerability is a critical issue within natural and technological risk analysis. In general, three types of methods are commonly used to develop vulnerability functions for different elements at risk: empirical, analytical and expert estimation. The paper addresses empirical methods for seismic vulnerability estimation for residential buildings and industrial facilities. The results of engineering analysis of past earthquake consequences, as well as statistical data on building behavior during strong earthquakes reported in different seismic intensity scales, are used to verify the regional parameters of mathematical models that simulate physical and economic vulnerability for building types classified according to the MMSK-86 seismic scale. The verified procedure has been used to estimate the physical and economic vulnerability of buildings and constructions against earthquakes for the Northern Caucasus Federal region of the Russian Federation and the Krasnodar area, which are characterized by a rather high level of seismic activity and high population density. To estimate the expected damage states of buildings and constructions under earthquakes consistent with the OSR-97B map (return period T = 1,000 years), big cities and towns were divided into unit sites whose coordinates were represented as dots located at the centers of the unit sites; the indexes obtained for each unit site were then summed. The maps of physical vulnerability zoning for the Northern Caucasus Federal region of the Russian Federation and the Krasnodar area include two elements: the percentage of different damage states for settlements with fewer than 1,000 inhabitants, and the vulnerability for cities and towns with more than 1,000 inhabitants. A hypsometric scale is used to represent both elements on the maps. Taking into account the size of oil pipeline systems located in the highly active seismic zones in
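    The intensity-based empirical approach described above can be sketched numerically. In the sketch below, the damage-grade exceedance probability is modelled as a normal CDF in macroseismic intensity; the median intensities and the dispersion are illustrative placeholders, not the calibrated MMSK-86 parameters from the paper.

```python
from math import erf, sqrt

def exceedance(intensity, median_i, beta=0.7):
    """P(damage grade >= d | I): normal CDF in macroseismic intensity."""
    return 0.5 * (1.0 + erf((intensity - median_i) / (beta * sqrt(2.0))))

# Hypothetical median intensities for damage grades d1..d5 of one building class
MEDIANS = [6.0, 7.0, 7.8, 8.5, 9.2]

def damage_state_probs(intensity):
    """Discrete probabilities [P(d0), P(d1), ..., P(d5)] at one intensity."""
    exc = [exceedance(intensity, m) for m in MEDIANS]
    probs, prev = [], 1.0
    for e in exc:
        probs.append(prev - e)   # probability of exactly the previous grade
        prev = e
    probs.append(prev)           # probability of the most severe grade
    return probs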

  2. Vector-intensity measure based seismic vulnerability analysis of bridge structures

    NASA Astrophysics Data System (ADS)

    Li, Zhongxian; Li, Yang; Li, Ning

    2014-12-01

    This paper presents a method for seismic vulnerability analysis of bridge structures based on a vector-valued intensity measure (vIM), which predicts limit-state capacities efficiently using multiple intensity measures of a seismic event. To account for the uncertainties of the bridge model, ten single-bent overpass bridge structures are sampled statistically using the Latin hypercube sampling approach, and 200 earthquake records are chosen randomly, according to the site conditions of the bridges, to represent the uncertainties of the ground motions. The uncertainties of structural capacity and seismic demand are evaluated with the ratios of demand to capacity in different damage states. By comparing the relative importance of different intensity measures, Sa(T1) and Sa(T2) are chosen as the vIM. Then, vector-valued fragility functions of the different bridge components are developed. Finally, the system-level vulnerability of the bridge based on the vIM is studied with a Dunnett-Sobel class correlation matrix, which can account for the correlation between different bridge components. The study indicates that moving from a scalar IM to a vIM results in a significant reduction in the dispersion of the fragility functions and in the uncertainties in evaluating earthquake risk. The feasibility and validity of the proposed vulnerability analysis method are demonstrated, and the bridge system is found to be more vulnerable than any of its individual components.
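    A vector-valued fragility function of the kind developed in the paper can be sketched as a probit model in the logarithms of the two spectral ordinates; the coefficients below are invented for illustration, not the fitted values from the study.

```python
from math import erf, log, sqrt

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def fragility_vim(sa_t1, sa_t2, b0=1.2, b1=1.5, b2=0.8):
    """P(damage state reached | Sa(T1), Sa(T2)) as a probit in log-IMs.
    b0, b1, b2 are hypothetical regression coefficients."""
    return phi(b0 + b1 * log(sa_t1) + b2 * log(sa_t2))
```

With b1 and b2 both positive, the failure probability grows monotonically with either spectral ordinate; conditioning on two IMs instead of one is what reduces the record-to-record dispersion the abstract reports.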

  3. Integrating Social impacts on Health and Health-Care Systems in Systemic Seismic Vulnerability Analysis

    NASA Astrophysics Data System (ADS)

    Kunz-Plapp, T.; Khazai, B.; Daniell, J. E.

    2012-04-01

    This paper presents a new method for modeling health impacts caused by earthquake damage, which allows key social impacts on individual health and health-care systems to be integrated and implemented in quantitative systemic seismic vulnerability analysis. In current earthquake casualty estimation models, demand on health-care systems is estimated by quantifying the number of fatalities and the severity of injuries based on empirical data correlating building damage with casualties. The expected number of injured people (sorted by priority of emergency treatment) is combined with the post-earthquake reduction in functionality of health-care facilities such as hospitals to estimate the impact on health-care systems. The aim here is to extend these models by developing a combined engineering and social science approach. Although social vulnerability is recognized as a key component of the consequences of disasters, social vulnerability as such is seldom linked to the formal, quantitative seismic loss estimates of injured people that directly drive demand for emergency health-care services. Yet there is a consensus that the factors affecting vulnerability and post-earthquake health of at-risk populations include demographic characteristics such as age, education, occupation and employment, and that these factors can further aggravate health impacts. Similarly, there are different social influences on the performance of health-care systems after an earthquake, on both an individual and an institutional level. To link social impacts on health and health-care services to a systemic seismic vulnerability analysis, a conceptual model of the social impacts of earthquakes on health and the health-care system has been developed. We identified and tested appropriate social indicators for individual health impacts and for health-care impacts based on a literature review, using available European statistical data. The results will be used to

  4. GPR surveys for the characterization of foundation plinths within a seismic vulnerability analysis

    NASA Astrophysics Data System (ADS)

    De Domenico, Domenica; Teramo, Antonio; Campo, Davide

    2013-06-01

    We present the results of GPR surveys performed to identify the foundation plinths of 12 buildings of a school, whose presence was uncertain since the structural drawings were not available. Their effective characterization is an essential element within a study aimed at assessing the seismic vulnerability of the buildings, which are non-seismically designed structures located in an area classified as a seismic zone after their construction. Through GPR profiles acquired with two 250 MHz antennas, both in reflection mode and in a WARR configuration, the actual geometry and depth of the building plinths were successfully identified, limiting the number of invasive tests necessary to validate the GPR data interpretation and thus enabling the choice of the most suitable sites that would not alter the serviceability of the structure. The collected data were also critically analysed with reference to local environmental noise that, by causing reflections superimposed on those from the subsoil, could undermine the success of the investigation. Owing to the homogeneity of the ground, the processing and results for each pair of profiles carried out for these buildings are very similar, so the results for only two of them are reported.

  5. Probabilistic seismic vulnerability and risk assessment of stone masonry structures

    NASA Astrophysics Data System (ADS)

    Abo El Ezz, Ahmad

    Earthquakes represent major natural hazards that regularly impact the built environment in seismic-prone areas worldwide and cause considerable social and economic losses. The high losses incurred in past destructive earthquakes prompted the need for assessment of the seismic vulnerability and risk of existing buildings. Many historic buildings in the old urban centers of Eastern Canada, such as Old Quebec City, are built of stone masonry and represent invaluable architectural and cultural heritage. These buildings were built to resist gravity loads only and generally offer poor resistance to lateral seismic loads. Seismic vulnerability assessment of stone masonry buildings is therefore the first necessary step in developing seismic retrofitting and pre-disaster mitigation plans. The objective of this study is to develop a set of probability-based analytical tools for efficient seismic vulnerability and uncertainty analysis of stone masonry buildings. A simplified probabilistic analytical methodology for vulnerability modelling of stone masonry buildings, with systematic treatment of uncertainties throughout the modelling process, is developed in the first part of this study. Building capacity curves are developed using a simplified mechanical model. A displacement-based procedure is used to develop damage-state fragility functions in terms of spectral displacement response, based on drift thresholds of stone masonry walls. A simplified probabilistic seismic demand analysis is proposed to capture the combined uncertainty of capacity and demand in the fragility functions. In the second part, a robust analytical procedure for the development of seismic-hazard-compatible fragility and vulnerability functions is proposed. The results are given as sets of seismic-hazard-compatible vulnerability functions in terms of a structure-independent intensity measure (e.g. spectral acceleration) that can be used for seismic risk analysis.
The procedure is very efficient for
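    The combination of capacity and demand uncertainty in a damage-state fragility function can be sketched with the standard lognormal formulation; the dispersion values below are illustrative, not those derived in the thesis.

```python
from math import erf, log, sqrt

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def p_damage(sd_demand, sc_capacity, beta_d=0.4, beta_c=0.3):
    """P(demand >= capacity) for lognormal demand and capacity medians,
    with combined dispersion sqrt(beta_d^2 + beta_c^2)."""
    beta = sqrt(beta_d ** 2 + beta_c ** 2)
    return phi(log(sd_demand / sc_capacity) / beta)
```

When the median demand equals the median capacity the exceedance probability is exactly 0.5, and a larger combined dispersion flattens the fragility curve.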

  6. Urban Vulnerability Assessment to Seismic Hazard through Spatial Multi-Criteria Analysis. Case Study: the Bucharest Municipality/Romania

    NASA Astrophysics Data System (ADS)

    Armas, Iuliana; Dumitrascu, Silvia; Bostenaru, Maria

    2010-05-01

    In the context of an explosive increase in the value of damage caused by natural disasters, an alarming challenge of the third millennium is the rapid growth of urban populations in vulnerable areas. Cities are, by definition, very fragile socio-ecological systems with a high level of vulnerability to environmental change, and they are responsible for important transformations of space, producing dysfunctions that show up in the state of natural variables (Parker and Mitchell, 1995; the OFDA/CRED International Disaster Database). A contributing factor is the demographic dynamics affecting urban areas. The aim of this study is to estimate the overall vulnerability of the urban area of Bucharest in the context of seismic hazard, using environmental, socio-economic and physical measurable variables in the framework of a spatial multi-criteria analysis. The capital city of Romania was chosen for this approach because of its high vulnerability, due to explosive urban development and the advanced state of degradation of its buildings (most of the building stock having been built between 1940 and 1977). Combining these attributes with the seismic hazard induced by the Vrancea source, Bucharest has been ranked the 10th capital city worldwide in terms of seismic risk. Over 40 years of experience in the natural risk field shows that the only directly accessible way to reduce natural risk is to reduce the vulnerability of the space (Adger et al., 2001; Turner et al., 2003; UN/ISDR, 2004; Dayton-Johnson, 2004; Kasperson et al., 2005; Birkmann, 2006; etc.). In effect, reducing the vulnerability of urban spaces would imply lower costs from natural disasters. Applying the SMCA method reveals a circular pattern, signaling as hot spots the Bucharest historic centre (located on a river terrace and with aged building stock) and peripheral areas (isolated from the emergency centers and defined by precarious social and economic
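    The core of a spatial multi-criteria analysis of this kind is a weighted linear combination of normalized criterion layers. The sketch below uses invented layer values and weights purely to show the mechanics.

```python
def normalize(values, invert=False):
    """Min-max normalize one criterion layer to [0, 1]."""
    lo, hi = min(values), max(values)
    scaled = [(v - lo) / (hi - lo) for v in values]
    return [1.0 - s for s in scaled] if invert else scaled

def smca(criteria, weights):
    """Weighted linear combination of normalized layers, one score per cell."""
    assert abs(sum(weights) - 1.0) < 1e-9
    layers = [normalize(c) for c in criteria]
    n_cells = len(criteria[0])
    return [sum(w * layer[i] for w, layer in zip(weights, layers))
            for i in range(n_cells)]

# Hypothetical layers over three map cells: building age and population density
vulnerability = smca([[10, 20, 30], [5, 1, 3]], [0.6, 0.4])
```

Cells scoring near 1 on the combined index would correspond to the "hot spots" the study maps.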

  7. Extreme seismicity and disaster risks: Hazard versus vulnerability (Invited)

    NASA Astrophysics Data System (ADS)

    Ismail-Zadeh, A.

    2013-12-01

    Although the extreme nature of earthquakes has been known for millennia from the devastation many of them have caused, the vulnerability of our civilization to extreme seismic events is still growing. This is partly because of the increase in the number of high-risk objects and the clustering of populations and infrastructure in areas prone to seismic hazards. Today an earthquake may affect several hundred thousand lives and cause damage of up to a hundred billion dollars; it can trigger an ecological catastrophe if it occurs in the close vicinity of a nuclear power plant. Two types of extreme natural events can be distinguished: (i) large-magnitude, low-probability events, and (ii) events leading to disasters. Although events of the first type may affect earthquake-prone countries directly or indirectly (through tsunamis, landslides, etc.), events of the second type occur mainly in economically less-developed countries, where vulnerability is high and resilience is low. Although earthquake hazards cannot be reduced, vulnerability to extreme events can be diminished by monitoring human systems and by relevant laws preventing an increase in vulnerability. Significant new knowledge should be gained on extreme seismicity through observations, monitoring, analysis, modeling, comprehensive hazard assessment, prediction and interpretation to assist in disaster risk analysis. Advanced disaster risk communication skills should be developed to link scientists, emergency management authorities and the public. Natural, social, economic and political factors leading to earthquake disasters will be discussed.

  8. Evaluation Of The Seismic Vulnerability of Fortified Structures

    SciTech Connect

    Baratta, Alessandro; Corbi, Ileana; Coppari, Sandro

    2008-07-08

    In the paper, a rapid method for evaluating the seismic vulnerability of ancient structures is applied to the fortified structures of Italy, based on the elaboration of rather coarse information about the state, consistency and history of the considered population of building fabrics. The procedure proves rather effective and able to produce reliable results despite the poor initial data.

  9. Evaluation Of The Seismic Vulnerability of Fortified Structures

    NASA Astrophysics Data System (ADS)

    Baratta, Alessandro; Corbi, Ileana; Coppari, Sandro

    2008-07-01

    In the paper, a rapid method for evaluating the seismic vulnerability of ancient structures is applied to the fortified structures of Italy, based on the elaboration of rather coarse information about the state, consistency and history of the considered population of building fabrics. The procedure proves rather effective and able to produce reliable results despite the poor initial data.

  10. Seismic Vulnerability and Performance Level of confined brick walls

    SciTech Connect

    Ghalehnovi, M.; Rahdar, H. A.

    2008-07-08

    Interest among engineers and designers in displacement- and performance-based design methods has been increasing, given the importance of designing structures to resist dynamic loads such as earthquakes and the inability of conventional design to account for the nonlinear behavior of elements arising from the nonlinear properties of construction materials. Economy, ease of construction and the accessibility of masonry materials have caused an enormous increase in masonry structures in villages, towns and cities. On the other hand, since Iran is located on the Alpide earthquake belt, it is necessary to study the behavior and seismic vulnerability of this kind of structure. In this study, confined walls were modeled in software and subjected to dynamic analysis using accelerograms appropriate to the local geological conditions, in order to investigate the seismic vulnerability and performance level of confined brick walls. The results of this analysis appear satisfactory when compared with the values in ATC-40, FEMA documents and Iran's Standard No. 2800.

  11. Approaches of Seismic Vulnerability Assessments in Near Real Time Systems

    NASA Astrophysics Data System (ADS)

    Frolova, Nina; Larionov, Valery; Bonnin, Jean; Ugarov, Alexander

    2014-05-01

    Data on the seismic vulnerability of the existing building stock and other elements at risk are rather important for near-real-time earthquake loss estimation by global systems. These data, together with information on regional peculiarities of seismic intensity attenuation and other factors, contribute greatly to the reliability of estimates of strong-event consequences made in emergency mode. Different approaches exist for developing vulnerability functions; the empirical one is most often used. It is based on analysis of the engineering consequences of past strong events, when well-documented descriptions of damage to different building types and other elements at risk are available for the earthquake-prone area under consideration. Where such data do not exist, information from macroseismic scales may be used. Any approach to developing vulnerability functions requires a proper classification of the buildings and structures under consideration. Different building classifications exist in national and international building codes, as well as in macroseismic scales. As a result, the global systems Extremum and PAGER, as well as the GEM project, make use of non-unified information on building stock distribution worldwide. The paper addresses the issues of building classification and of city models expressed in terms of these classifications. The distribution of different building types in the Extremum and PAGER/GEM systems is analyzed for earthquake-prone countries. The comparison of city models revealed significant differences which greatly influence earthquake loss estimation in emergency mode. The paper describes the practice of developing city models using satellite images and web technologies in social networks. It is proposed to use the G8 (and other) open data and transparency initiatives to improve building stock distribution and global population databases.

  12. Key geophysical indicators of seismic vulnerability in Kingston, Jamaica

    NASA Astrophysics Data System (ADS)

    Brown, L. A.; Hornbach, M. J.; Salazar, W.; Kennedy, M.

    2012-12-01

    Kingston, the major city and hub of all commercial and industrial activity in Jamaica, has a history of moderate seismic activity; however, two significant (>Mw 6) earthquakes (1692 and 1907) caused major devastation resulting in thousands of casualties. Both the 1692 and 1907 events also triggered widespread liquefaction and tsunamis within Kingston Harbor. Kingston remains vulnerable to such earthquakes today because the city sits on 200-m to 600-m thick alluvial fan deposits adjacent to the Enriquillo-Plantain Garden Fault Zone (EPGFZ), the same fault system that ruptured in the 2010 Haiti earthquake. Recent GPS results suggest the potential for a Mw 7-7.5 earthquake near Kingston along the EPGFZ, the dominant east-west trending fault through Jamaica. Whether active strands of the EPGFZ extend through downtown Kingston remains unclear; however, recent sonar mapping in Kingston harbor shows evidence of active faulting, with offshore faults connecting to proposed active on-land fault systems that extend through populated areas of the city. Seismic chirp reflection data also show evidence of multiple recent (Holocene) submarine slide deposits in the harbor that may be associated with historic tsunamis. Using recently acquired chirp data and sediment cores, we are currently studying the recurrence interval of earthquake events. We also recently performed a microtremor survey to identify areas prone to earthquake-induced ground shaking throughout the city of Kingston and St. Andrew parish. Data were collected at 200 points with a lateral spacing of 500 metres between points. Our analysis shows significant variations in the fundamental frequency across the city, and the results clearly indicate areas of potential amplification, with areas surrounding Kingston harbor (much of which has been built on reclaimed land) showing the highest potential for ground amplification.
The microtremor analysis suggests several high-density urban areas as well as key
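    The link between the thick alluvial fill and the low fundamental frequencies a microtremor survey resolves can be illustrated with the standard one-layer site-resonance formula f0 = Vs / (4H); the shear-wave velocity used below is an assumed value, not one measured in the study.

```python
def fundamental_frequency(vs_m_per_s, thickness_m):
    """One-layer site resonance: f0 = Vs / (4 * H)."""
    return vs_m_per_s / (4.0 * thickness_m)

# Assumed average Vs of 400 m/s over the 200-600 m fan deposits under Kingston
f0_thin = fundamental_frequency(400.0, 200.0)   # 0.5 Hz
f0_thick = fundamental_frequency(400.0, 600.0)  # ~0.17 Hz
```

Thicker (or softer) fill pushes the resonance to lower frequencies, which is why the fundamental frequency maps directly onto sediment thickness variations across the city.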

  13. Rapid Assessment of Seismic Vulnerability in Palestinian Refugee Camps

    NASA Astrophysics Data System (ADS)

    Al-Dabbeek, Jalal N.; El-Kelani, Radwan J.

    Studies of historical and recorded earthquakes in Palestine demonstrate that damaging earthquakes occur frequently along the Dead Sea Transform: the earthquake of 11 July 1927 (ML 6.2) and the earthquake of 11 February 2004 (ML 5.2). In order to reduce the seismic vulnerability of buildings and the losses in lives, property and infrastructure, an attempt was made to estimate the percentage of damage grades and losses at selected refugee camps: Al Ama`ri, Balata and Dhaishe. The vulnerability classes of the building structures were assessed according to the European Macroseismic Scale 1998 (EMS-98) and Federal Emergency Management Agency (FEMA) guidelines. The rapid assessment results showed that very heavy structural and non-structural damage will occur in the common buildings of the investigated refugee camps (many buildings will suffer damage of grades 4 and 5). The poor quality of the buildings in terms of design and construction, the lack of uniformity, the absence of spacing between buildings and the limited width of the roads will definitely increase the seismic vulnerability under the influence of moderate-to-strong (M 6-7) earthquakes in the future.

  14. Seismic vulnerability and risk assessment of Kolkata City, India

    NASA Astrophysics Data System (ADS)

    Nath, S. K.; Adhikari, M. D.; Devaraj, N.; Maiti, S. K.

    2015-06-01

    The city of Kolkata is one of the most urbanized and densely populated regions in the world and a major industrial and commercial hub of the eastern and northeastern region of India. In order to classify the seismic risk zones of Kolkata we used seismic hazard exposures on the vulnerability components, namely land use/land cover, population density, building typology, age and height. We microzoned the seismic hazard of the city by integrating seismological, geological and geotechnical themes in GIS, which in turn were integrated with the vulnerability components in a logic-tree framework for the estimation of both the socioeconomic and structural risk of the city. In both risk maps, three broad zones have been demarcated as "severe", "high" and "moderate"; a remaining lower-risk zone in the city is termed "low". The damage distribution in the city due to the 1934 Bihar-Nepal earthquake of Mw = 8.1 matches the demarcated risk regime satisfactorily. The design horizontal seismic coefficients for the city have been worked out for all the fundamental periods, indicating suitability for "A", "B" and "C" types of structures. The cumulative damage probabilities in terms of "none", "slight", "moderate", "extensive" and "complete" have also been assessed for the four predominant model building types, viz. RM2L, RM2M, URML and URMM, for each seismic structural risk zone in the city. Both the seismic hazard and risk maps are expected to play vital roles in earthquake disaster mitigation and management for the city of Kolkata.

  15. The influence of local mechanisms on large scale seismic vulnerability estimation of masonry building aggregates

    NASA Astrophysics Data System (ADS)

    Formisano, Antonio; Chieffo, Nicola; Milo, Bartolomeo; Fabbrocino, Francesco

    2016-12-01

    The current paper deals with the seismic vulnerability evaluation of masonry constructions grouped in aggregates through an "ad hoc" quick vulnerability form based on new assessment parameters that take local collapse mechanisms into account. First, a parametric kinematic analysis of masonry walls with different height (h) to thickness (t) ratios has been developed with the purpose of identifying the collapse load multiplier for activation of the four main first-order failure mechanisms. Subsequently, a form initially conceived for building aggregates suffering second-mode collapse mechanisms has been expanded on the basis of the achieved results. The proposed quick vulnerability technique has been applied to one case study within the territory of Arsita (Teramo, Italy) and, finally, it has also been validated by comparing its results with those deriving from application of the well-known FaMIVE procedure.
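    For the simplest first-order mechanism, rigid overturning of a monolithic wall about its base edge, the kinematic collapse load multiplier follows from moment equilibrium: the stabilizing moment W·t/2 balances the overturning moment α·W·h/2, giving α0 = t/h. This textbook special case is sketched below to show why the h/t ratio governs; the paper's parametric analysis covers four mechanisms.

```python
def overturning_multiplier(thickness_m, height_m):
    """Collapse load multiplier for rigid overturning about the base edge:
    alpha0 = t / h (weight acts at t/2 from the pivot, lateral force at h/2)."""
    return thickness_m / height_m

# The multiplier drops as walls become more slender (higher h/t)
alphas = [overturning_multiplier(1.0, h) for h in (4.0, 8.0, 12.0)]
```

A low multiplier means the mechanism activates under a weak lateral load, so slender walls are flagged as more vulnerable by the form.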

  16. Fault zone regulation, seismic hazard, and social vulnerability in Los Angeles, California: Hazard or urban amenity?

    NASA Astrophysics Data System (ADS)

    Toké, Nathan A.; Boone, Christopher G.; Arrowsmith, J. Ramón

    2014-09-01

    Public perception and regulation of environmental hazards are important factors in the development and configuration of cities. Throughout California, probabilistic seismic hazard mapping and geologic investigations of active faults have spatially quantified earthquake hazard. In Los Angeles, these analyses have informed earthquake engineering, public awareness, the insurance industry, and the government regulation of developments near faults. Understanding the impact of natural hazards regulation on the social and built geography of cities is vital for informing future science and policy directions. We constructed a relative social vulnerability index classification for Los Angeles to examine the social condition within regions of significant seismic hazard, including areas regulated as Alquist-Priolo (AP) Act earthquake fault zones. Despite hazard disclosures, social vulnerability is lowest within AP regulatory zones and vulnerability increases with distance from them. Because the AP Act requires building setbacks from active faults, newer developments in these zones are bisected by parks. Parcel-level analysis demonstrates that homes adjacent to these fault zone parks are the most valuable in their neighborhoods. At a broad scale, a Landsat-based normalized difference vegetation index shows that greenness near AP zones is greater than the rest of the metropolitan area. In the parks-poor city of Los Angeles, fault zone regulation has contributed to the construction of park space within areas of earthquake hazard, thus transforming zones of natural hazard into amenities, attracting populations of relatively high social status, and demonstrating that the distribution of social vulnerability is sometimes more strongly tied to amenities than hazards.
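    The greenness comparison rests on the normalized difference vegetation index, computed band-wise from near-infrared and red reflectance; the reflectance values below are illustrative.

```python
def ndvi(nir, red):
    """Normalized difference vegetation index: (NIR - red) / (NIR + red)."""
    return (nir - red) / (nir + red)

park = ndvi(0.5, 0.1)      # dense vegetation -> high NDVI
asphalt = ndvi(0.2, 0.18)  # built surface -> near zero
```

Averaging such per-pixel values within and outside the AP regulatory zones is what supports the claim that greenness is higher near the fault zones.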

  17. Integrated Estimation of Seismic Physical Vulnerability of Tehran Using Rule Based Granular Computing

    NASA Astrophysics Data System (ADS)

    Sheikhian, H.; Delavar, M. R.; Stein, A.

    2015-08-01

    Tehran, the capital of Iran, is surrounded by the North Tehran fault, the Mosha fault and the Rey fault. This exposes the city to possibly huge earthquakes followed by dramatic human loss and physical damage, in particular as it contains a large number of non-standard constructions and aged buildings. Estimating the likely consequences of an earthquake facilitates mitigation of these losses. Mitigation of earthquake fatalities may be achieved by promoting awareness of earthquake vulnerability and implementing seismic vulnerability reduction measures. In this research, granular computing is applied, using generality and absolute support for rule extraction and coverage and entropy for rule prioritization. The extracted rules are combined to form a granule tree that shows their order and relations. In this way the seismic physical vulnerability is assessed, integrating the effects of the three major known faults. The parameters considered in the physical seismic vulnerability assessment are slope, seismic intensity, and the height and age of the buildings. Experts were asked to predict the seismic vulnerability of 100 randomly selected samples among more than 3,000 statistical units in Tehran. The integrated experts' points of view serve as input to the granular computing. Non-redundant covering rules preserve consistency in the model, which resulted in 84% accuracy in the seismic vulnerability assessment, based on validation of the predicted test data against the expected vulnerability degree. The study concluded that granular computing is a useful method for assessing the effects of earthquakes in an earthquake-prone area.
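    The rule measures named in the abstract (generality, absolute support, coverage, entropy) can be sketched as follows; these are schematic definitions over labeled samples, and the exact formulations in the paper may differ.

```python
from collections import Counter
from math import log2

def rule_metrics(samples, condition, decision):
    """Schematic rule measures over (attributes, label) samples."""
    covered = [(a, y) for a, y in samples if condition(a)]
    generality = len(covered) / len(samples)            # how much of U the rule covers
    support = sum(1 for _, y in covered if y == decision) / len(covered)
    decided = [a for a, y in samples if y == decision]
    coverage = sum(1 for a in decided if condition(a)) / len(decided)
    counts = Counter(y for _, y in covered)
    n = len(covered)
    entropy = -sum((c / n) * log2(c / n) for c in counts.values())
    return generality, support, coverage, entropy
```

Rules with high support and coverage and low class entropy would be ranked first when assembling the granule tree.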

  18. Seismic evaluation of vulnerability for SAMA educational buildings in Tehran

    SciTech Connect

    Amini, Omid Nassiri; Amiri, Javad Vaseghi

    2008-07-08

    Earthquakes are destructive phenomena that shake different parts of the earth every year and cause much destruction. Iran is one of the most earthquake-prone parts of the world and suffers heavy financial losses and casualties every year; schools are among the most important places to be protected during such crises. There was no special oversight of the design and construction of school buildings in Tehran until the late 70's, and as Tehran sits on faults, the instability of such buildings may cause irrecoverable financial losses and, especially, loss of life; preventing this is therefore an urgent need. For this purpose, a number of schools built during 67-78, mostly with steel braced-frame structures, were selected. First, by evaluating the selected samples, gathering information and conducting visual surveys, the prepared questionnaires were filled out. Using the ARIA and SABA (Venezuela) methods, a new modified combined method for qualitative evaluation was developed and applied. Then, for quantitative evaluation, a number of the qualitatively evaluated buildings were re-evaluated using 3D computer models and nonlinear static analysis, and finally the real behavior of the structures under earthquakes was studied with nonlinear dynamic analysis. The results of the qualitative and quantitative evaluations were compared, and a suitable pattern for the seismic evaluation of educational buildings was presented. The results can also guide those in charge of retrofitting or, if necessary, rebuilding the schools.

  19. Seismic evaluation of vulnerability for SAMA educational buildings in Tehran

    NASA Astrophysics Data System (ADS)

    Amini, Omid Nassiri; Amiri, Javad Vaseghi

    2008-07-01

    Earthquakes are destructive phenomena that shake different parts of the earth every year and cause much destruction. Iran is one of the most earthquake-prone parts of the world and suffers heavy financial losses and casualties every year; schools are among the most important places to be protected during such crises. There was no special oversight of the design and construction of school buildings in Tehran until the late 70's, and as Tehran sits on faults, the instability of such buildings may cause irrecoverable financial losses and, especially, loss of life; preventing this is therefore an urgent need. For this purpose, a number of schools built during 67-78, mostly with steel braced-frame structures, were selected. First, by evaluating the selected samples, gathering information and conducting visual surveys, the prepared questionnaires were filled out. Using the ARIA and SABA (Venezuela) methods, a new modified combined method for qualitative evaluation was developed and applied. Then, for quantitative evaluation, a number of the qualitatively evaluated buildings were re-evaluated using 3D computer models and nonlinear static analysis, and finally the real behavior of the structures under earthquakes was studied with nonlinear dynamic analysis. The results of the qualitative and quantitative evaluations were compared, and a suitable pattern for the seismic evaluation of educational buildings was presented. The results can also guide those in charge of retrofitting or, if necessary, rebuilding the schools.

  20. Use of expert judgment elicitation to estimate seismic vulnerability of selected building types

    USGS Publications Warehouse

    Jaiswal, K.S.; Aspinall, W.; Perkins, D.; Wald, D.; Porter, K.A.

    2012-01-01

    Pooling engineering input on earthquake building vulnerability through an expert judgment elicitation process requires careful deliberation. This article provides an overview of expert judgment procedures including the Delphi approach and the Cooke performance-based method to estimate the seismic vulnerability of a building category.

  1. Seismic Vulnerability Assessment Rest House Building TA-16-41

    SciTech Connect

    Cuesta, Isabel; Salmon, Michael W.

    2003-10-01

    The purpose of this report is to present the results of the evaluation completed on the Rest House Facility (TA-16-4111) in support of hazard analysis for a Documented Safety Assessment (DSA). The Rest House facility has been evaluated to verify the structural response to seismic, wind, and snow loads in support of the DynEx DSA. The structural analyses consider the structure and the following systems and/or components inside the facility, as requested by facility management: cranes, lightning protection system, and fire protection system. The facility has been assigned to Natural Phenomena Hazards (NPH) Performance Category (PC)-3. The facility structure was evaluated to PC-3 criteria because it serves to confine hazardous material, and in the event of an accident, the facility cannot fail or collapse. Seismic-induced failure of the cranes, lightning, and fire-protection systems, according to DOE-STD-1021-93 (Ref. 1), “may result in adverse release consequences greater than safety-class Structures, Systems, and Components (SSC) Evaluation Guideline limits but much less than those associated with PC-4 SSC.” Therefore, these items will be evaluated to PC-3 criteria as well. This report presents the results of those analyses and suggests recommendations to improve the seismic capacity of the systems and components cited above.

  2. GIS-based seismic shaking slope vulnerability map of Sicily (Central Mediterranean)

    NASA Astrophysics Data System (ADS)

    Nigro, Fabrizio; Arisco, Giuseppe; Perricone, Marcella; Renda, Pietro; Favara, Rocco

    2010-05-01

    Earthquakes often represent very dangerous natural events in terms of human life and economic losses, and their damage effects are amplified by the synchronous occurrence of seismically induced ground-shaking failures in wide regions around the seismogenic source. In fact, the shaking associated with big earthquakes triggers extensive landsliding, sometimes at distances of more than 100 km from the epicenter. The active tectonics and the geomorphic/morphodynamic pattern of the regions affected by earthquakes contribute to the slopes' tendency toward instability. Earthquake-induced ground-motion loading activates inertial forces within slopes that, combined with the intrinsic pre-existing static forces, reduce slope stability toward failure. Basically, under zero shear-stress-reversal conditions, a catastrophic failure will take place if the earthquake-induced shear displacement degrades the undrained shear strength to a value equal to the gravitational shear stress. However, seismic stability analyses carried out for various infinite slopes using the existing Newmark-like methods reveal that estimated permanent displacements smaller than the critical value should also be regarded as dangerous for post-earthquake slope safety, in terms of use for human activities. Earthquake-induced (often high-speed) landslides are among the most destructive phenomena related to slope failure during earthquakes. In fact, damage from earthquake-induced landslides (and other ground failures) sometimes exceeds the building/infrastructure damage directly related to ground shaking or fault rupture. For this reason, several earthquake-related slope-failure methods have been developed for the evaluation of the combined hazard represented by seismically induced landslides. 
The methodologies of analysis of the engineering seismic risk related to slope instability processes are often achieved through the evaluation of the
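The Newmark-like sliding-block analysis mentioned above can be sketched in a few lines: permanent displacement accumulates only while the ground acceleration exceeds the slope's critical (yield) acceleration, or while the block is still sliding from an earlier exceedance. A minimal illustrative implementation, assuming a rigid block and a hypothetical acceleration record:

```python
def newmark_displacement(accel, dt, a_crit):
    """Permanent downslope displacement (m) of a rigid block, per the
    classic Newmark sliding-block method.

    accel  : ground acceleration time history (m/s^2)
    dt     : time step (s)
    a_crit : critical (yield) acceleration of the slope (m/s^2)
    """
    vel = 0.0   # relative sliding velocity of the block
    disp = 0.0  # accumulated permanent displacement
    for a in accel:
        if vel > 0.0 or a > a_crit:
            # block accelerates while shaking exceeds yield,
            # then decelerates; sliding cannot reverse upslope
            vel = max(vel + (a - a_crit) * dt, 0.0)
            disp += vel * dt
    return disp

# Hypothetical square pulse of 3 m/s^2 for 0.1 s, yield at 1 m/s^2:
pulse = [3.0] * 10 + [0.0] * 100
d = newmark_displacement(pulse, 0.01, 1.0)
```

Lowering `a_crit` (a weaker slope) yields a larger permanent displacement, which is the basic screening logic behind such hazard maps.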

  3. Metadata for selecting or submitting generic seismic vulnerability functions via GEM's vulnerability database

    USGS Publications Warehouse

    Jaiswal, Kishor

    2013-01-01

    This memo lays out a procedure for the GEM software to offer an available vulnerability function for any acceptable set of attributes that the user specifies for a particular building category. The memo also provides general guidelines on how to submit vulnerability or fragility functions to the GEM vulnerability repository, stipulating which attributes modelers must provide so that their vulnerability or fragility functions can be queried appropriately by the vulnerability database. An important objective is to give users guidance on the limitations and applicability of each vulnerability or fragility function by documenting its associated modeling assumptions.

  4. Seismic vulnerability assessment of school buildings in Tehran city based on AHP and GIS

    NASA Astrophysics Data System (ADS)

    Panahi, M.; Rezaie, F.; Meshkani, S. A.

    2014-04-01

    The objective of the current study is to evaluate the seismic vulnerability of school buildings in Tehran city based on the analytic hierarchy process (AHP) and geographical information system (GIS). To this end, the peak ground acceleration, slope, and soil liquefaction layers were utilized for developing a geotechnical map. Also, the construction materials of structures, age of construction, the quality, and the seismic resonance coefficient layers were defined as major factors affecting the structural vulnerability of school buildings. Then, the AHP method was applied to assess the priority rank and weight of criteria (layers) and alternatives (classes) of each criterion via pairwise comparison in all levels. Finally, the geotechnical and structural spatial layers were overlaid to develop the seismic vulnerability map of school buildings in Tehran. The results indicated that only in 72 (about 3%) out of 2125 school buildings of the study area will the destruction rate be very high and therefore their reconstruction should seriously be considered.

  5. Seismic vulnerability assessment of school buildings in Tehran city based on AHP and GIS

    NASA Astrophysics Data System (ADS)

    Panahi, M.; Rezaie, F.; Meshkani, S. A.

    2013-09-01

    The objective of the study was to evaluate the seismic vulnerability of school buildings in Tehran city based on the analytic hierarchy process (AHP) and geographic information systems (GIS). To this end, the peak ground acceleration, slope and soil liquefaction layers were used to prepare a geotechnical map. Also, the construction materials of the structures, year of construction, their quality and the seismic resonance coefficient layers were defined as the major factors affecting the structural vulnerability of schools. Then, the AHP method was applied to assess the priority rank and weight of criteria (layers) and alternatives (classes) of each criterion through pairwise comparison at all levels. Finally, the geotechnical and structural spatial layers were overlaid to prepare the seismic vulnerability map of school buildings in Tehran city. The results indicated that in only 72 schools (about 3%) out of 2125 schools in the study area is the destruction rate very high; their reconstruction should therefore be considered.

  6. A S.M.A.R.T. system for the seismic vulnerability mitigation of Cultural Heritages

    NASA Astrophysics Data System (ADS)

    Montuori, Antonio; Costanzo, Antonio; Gaudiosi, Iolanda; Vecchio, Antonio; Minasi, Mario; Falcone, Sergio; La Piana, Carmelo; Stramondo, Salvatore; Casula, Giuseppe; Giovanna Bianchi, Maria; Fabrizia Buongiorno, Maria; Musacchio, Massimo; Doumaz, Fawzi; Ilaria Pannaccione Apa, Maria

    2016-04-01

    Both the assessment and the mitigation of seismic vulnerability connected to cultural heritage monitoring are non-trivial issues, based on knowledge of the structural and environmental factors potentially impacting the cultural heritage. A holistic approach could be suitable to provide effective monitoring of cultural heritage within its surroundings at different spatial and temporal scales. On the one hand, analysis of the geometrical and structural properties of monuments is important to assess their state of conservation, their response to external stresses, and anomalies related to natural and/or anthropogenic phenomena (e.g. the aging of materials, seismic stresses, vibrational modes). On the other hand, investigation of the surrounding area is relevant to assess environmental properties and natural phenomena (e.g. landslides, earthquakes, subsidence, seismic response) as well as their related impacts on the monuments. Within such a framework, a multi-disciplinary system has been developed, and is presented here, for the monitoring of cultural heritage for seismic vulnerability assessment and mitigation purposes. It merges geophysical investigations and modeling, in situ measurements and multi-platform remote sensing sensors for the non-destructive and non-invasive multi-scale monitoring of historic buildings in a seismic-prone area. In detail, the system provides: a) long-term, regional-scale analysis of the buildings' environment through the integration of seismogenic analysis, airborne magnetic surveys, and space-borne Synthetic Aperture Radar (SAR) and multi-spectral sensors, which together describe the sub-surface fault systems, the surface deformation processes and the land-use mapping of the regional-scale area over an annual temporal span; b) short-term, basin-scale analysis of the building's neighborhood through geological setting and geotechnical surveys, airborne Light Detection And Ranging (LiDAR) and ground-based SAR sensors. They

  7. Vulnerability of populations and man-made facilities to seismic hazards

    NASA Astrophysics Data System (ADS)

    Badal, J.; Vazquez-Prada, M.; Gonzalez, A.; Chourak, M.; Samardzhieva, E.; Zhang, Z.

    2003-04-01

    Earthquakes become major societal risks when they impinge on vulnerable populations. According to the available worldwide data for the twentieth century (NEIC Catalog of Earthquakes 1980-1999), almost five hundred earthquakes resulted in more than 1,615,000 human victims. Besides human casualty levels, destructive earthquakes frequently inflict huge economic losses. An additional problem of a very different nature, but also worth considering in a damage and loss analysis, is the direct cost associated with the damage caused by a strong seismic impact. We focus our attention on the rapid quantitative assessment of both aspects, in order to lessen the earthquake disaster in areas affected by relatively strong earthquakes. Our final goal is knowledge of potential losses from earthquakes to support national programs in emergency management, thereby minimizing the loss of life due to earthquakes and aiding response and recovery tasks. For this purpose we follow a suitable and comprehensible methodology for risk-based loss analysis, and simulate the occurrence of a seismic event in densely populated areas of Spain.

  8. Review on Rapid Seismic Vulnerability Assessment for Bulk of Buildings

    NASA Astrophysics Data System (ADS)

    Nanda, R. P.; Majhi, D. R.

    2013-09-01

    This paper provides a brief overview of the rapid visual screening (RVS) procedures available in different countries, with a comparison among all the methods. Seismic evaluation guidelines from the USA, Canada, Japan, New Zealand, India, Europe, Italy and the UNDP, along with other methods, are reviewed from the perspective of their applicability to developing countries. The review shows clearly that some of the RVS procedures are unsuited for potential use in developing countries. It is expected that this comparative assessment of various evaluation schemes will help to identify the most essential components of such a procedure for use in India and other developing countries, one that is not only robust and reliable but also easy to use with available resources. It appears that the Federal Emergency Management Agency (FEMA) 154 and New Zealand Draft Code approaches can be suitably combined to develop a transparent, reasonably rigorous and generalized procedure for the seismic evaluation of buildings in developing countries.

  9. Constraints on Long-Term Seismic Hazard From Vulnerable Stalagmites

    NASA Astrophysics Data System (ADS)

    Gribovszki, Katalin; Bokelmann, Götz; Mónus, Péter; Kovács, Károly; Konecny, Pavel; Lednicka, Marketa; Bednárik, Martin; Brimich, Ladislav

    2015-04-01

    Earthquakes hit urban centers in Europe infrequently, but occasionally with disastrous effects. This raises an important issue for society: how to react to the natural hazard? Potential damages are huge, but the infrastructure costs of addressing these hazards are huge as well. Furthermore, seismic hazard is only one of the many hazards facing society. Societal means need to be distributed in a reasonable manner, to assure that all of these hazards (natural as well as societal) are addressed appropriately. Obtaining an unbiased view of seismic hazard (and risk) is therefore very important. In principle, the best way to test PSHA models is to compare them with observations that are entirely independent of the procedure used to produce the PSHA models. Arguably, the most valuable information in this context is information on long-term hazard, namely maximum intensities (or magnitudes) occurring over time intervals that are at least as long as a seismic cycle, if such exists. Such information would be very valuable, even if it concerned only a single site, namely that of a particularly sensitive infrastructure. Such a request may seem hopeless, but it is not. Long-term information can in principle be gained from intact stalagmites in natural caves. These have survived all earthquakes that have occurred over thousands of years, depending on the age of the stalagmite. Their "survival" requires that the horizontal ground acceleration has never exceeded a certain critical value within that period. We focus here on case studies in Austria, which has moderate seismicity but a well-documented history of major earthquake-induced damage, e.g., Villach in 1348 and 1690, Vienna in 1590, Leoben in 1794, and Innsbruck in 1551, 1572, and 1589. Seismic intensities have reached levels up to 10. It is clearly important to know which "worst-case" damages to expect. 
We have identified sets of particularly sensitive stalagmites in the general vicinity of two major cities in
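The "survival" condition for an intact stalagmite can be sketched with a simple mechanical bound. Idealizing the stalagmite as a uniform cylindrical cantilever loaded by a horizontal inertial force, the base bending stress reaches the tensile strength at a_crit = sigma_t * D / (4 * rho * H^2). This is a deliberately simplified model, and the parameter values below are hypothetical, not measured ones:

```python
def critical_acceleration(height_m, diameter_m, tensile_strength_pa,
                          density_kg_m3=2600.0):
    """Horizontal ground acceleration (m/s^2) at which the base bending
    stress of an idealized cylindrical stalagmite (uniform cantilever
    under a horizontal inertial load) equals its tensile strength:

        a_crit = sigma_t * D / (4 * rho * H**2)
    """
    return tensile_strength_pa * diameter_m / (4.0 * density_kg_m3 * height_m ** 2)

# A slender 2 m stalagmite, 6 cm across, tensile strength ~1 MPa (hypothetical):
a_crit = critical_acceleration(2.0, 0.06, 1.0e6)  # m/s^2
```

Taller, thinner stalagmites give lower `a_crit`, which is why slender intact specimens place the tightest upper bound on past ground motion.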

  10. SEISMIC ANALYSIS FOR PRECLOSURE SAFETY

    SciTech Connect

    E.N. Lindner

    2004-12-03

    The purpose of this seismic preclosure safety analysis is to identify the potential seismically-initiated event sequences associated with preclosure operations of the repository at Yucca Mountain and assign appropriate design bases to provide assurance of achieving the performance objectives specified in the Code of Federal Regulations (CFR) 10 CFR Part 63 for radiological consequences. This seismic preclosure safety analysis is performed in support of the License Application for the Yucca Mountain Project. In more detail, this analysis identifies the systems, structures, and components (SSCs) that are subject to seismic design bases. This analysis assigns one of two design basis ground motion (DBGM) levels, DBGM-1 or DBGM-2, to SSCs important to safety (ITS) that are credited in the prevention or mitigation of seismically-initiated event sequences. An application of seismic margins approach is also demonstrated for SSCs assigned to DBGM-2 by showing a high confidence of a low probability of failure at a higher ground acceleration value, termed a beyond-design basis ground motion (BDBGM) level. The objective of this analysis is to meet the performance requirements of 10 CFR 63.111(a) and 10 CFR 63.111(b) for offsite and worker doses. The results of this calculation are used as inputs to the following: (1) A classification analysis of SSCs ITS by identifying potential seismically-initiated failures (loss of safety function) that could lead to undesired consequences; (2) An assignment of either DBGM-1 or DBGM-2 to each SSC ITS credited in the prevention or mitigation of a seismically-initiated event sequence; and (3) A nuclear safety design basis report that will state the seismic design requirements that are credited in this analysis. The present analysis reflects the design information available as of October 2004 and is considered preliminary. The evolving design of the repository will be re-evaluated periodically to ensure that seismic hazards are properly

  11. Generalized seismic analysis

    NASA Astrophysics Data System (ADS)

    Butler, Thomas G.

    1993-09-01

    There is a constant need to be able to solve for enforced motion of structures. Spacecraft need to be qualified for acceleration inputs. Truck cargoes need to be safeguarded from road mishaps. Office buildings need to withstand earthquake shocks. Marine machinery needs to be able to withstand hull shocks. All of these kinds of enforced motions are grouped together here under the heading of seismic inputs. Attempts to cope with this problem over the years have usually ended with some limiting or compromise conditions. The crudest approach was to limit the problem to acceleration occurring only at the base of a structure, constrained to be rigid. The analyst would assign arbitrarily outsized masses to base points, then calculate the magnitude of force to apply to the base mass (or masses) in order to produce the specified acceleration. He would of necessity sacrifice the determination of stresses in the vicinity of the base, because of the artificial nature of the input forces. The author followed the lead of John M. Biggs by using relative coordinates for a rigid base in a 1975 paper, and again in a 1981 paper. This method of relative coordinates was extended and made operational as DMAP ALTER packets to rigid formats 9, 10, 11, and 12 under contract N60921-82-C-0128, and was presented at the twelfth NASTRAN Colloquium. Another analyst in the field developed a method that computed the forces from enforced motion and then applied them as a forcing to the remaining unknowns after the knowns were partitioned off. The method was translated into DMAP ALTERs but was never made operational. All of this activity jelled into the current effort. Much thought was invested in working out ways to unshackle the analysis of enforced motions from the limitations that persisted.

  12. Generalized seismic analysis

    NASA Technical Reports Server (NTRS)

    Butler, Thomas G.

    1993-01-01

    There is a constant need to be able to solve for enforced motion of structures. Spacecraft need to be qualified for acceleration inputs. Truck cargoes need to be safeguarded from road mishaps. Office buildings need to withstand earthquake shocks. Marine machinery needs to be able to withstand hull shocks. All of these kinds of enforced motions are grouped together here under the heading of seismic inputs. Attempts to cope with this problem over the years have usually ended with some limiting or compromise conditions. The crudest approach was to limit the problem to acceleration occurring only at the base of a structure, constrained to be rigid. The analyst would assign arbitrarily outsized masses to base points, then calculate the magnitude of force to apply to the base mass (or masses) in order to produce the specified acceleration. He would of necessity sacrifice the determination of stresses in the vicinity of the base, because of the artificial nature of the input forces. The author followed the lead of John M. Biggs by using relative coordinates for a rigid base in a 1975 paper, and again in a 1981 paper. This method of relative coordinates was extended and made operational as DMAP ALTER packets to rigid formats 9, 10, 11, and 12 under contract N60921-82-C-0128, and was presented at the twelfth NASTRAN Colloquium. Another analyst in the field developed a method that computed the forces from enforced motion and then applied them as a forcing to the remaining unknowns after the knowns were partitioned off. The method was translated into DMAP ALTERs but was never made operational. All of this activity jelled into the current effort. Much thought was invested in working out ways to unshackle the analysis of enforced motions from the limitations that persisted.

  13. Interactive Vulnerability Analysis Enhancement Results

    DTIC Science & Technology

    2012-12-01

    Final technical report, December 2012. ...application. Here is a screenshot of IAST finding a Cross-Site Scripting (XSS) vulnerability in a "Hello World" Scala application: Approved for Public

  14. Application of PRA to HEMP vulnerability analysis

    SciTech Connect

    Mensing, R.W.

    1985-09-01

    Vulnerability analyses of large systems, e.g., control and communication centers, aircraft, ships, are subject to many uncertainties. A basic source of uncertainty is the random variation inherent in the physical world. Thus, vulnerability is appropriately described by an estimate of the probability of survival (or failure). A second source of uncertainty that also needs to be recognized is the uncertainty associated with the analysis or estimation process itself. This uncertainty, often called modeling uncertainty, has many contributors. There are the approximations introduced by using mathematical models to describe reality. Also, the appropriate values of the model parameters are derived from several sources, e.g., based on experimental or test data, based on expert judgment and opinion. In any case, these values are subject to uncertainty. This uncertainty must be considered in the description of vulnerability. Thus, the estimate of the probability of survival is not a single value but a range of values. Probabilistic risk analysis (PRA) is a methodology which deals with these uncertainty issues. This report discusses the application of PRA to HEMP vulnerability analyses. Vulnerability analysis and PRA are briefly outlined and the need to distinguish between random variation and modeling uncertainty is discussed. Then a sequence of steps appropriate for applying PRA to vulnerability problems is outlined. Finally, methods for handling modeling uncertainty are identified and discussed.
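The report's distinction between random (aleatory) variation and modeling (epistemic) uncertainty can be illustrated with a nested Monte Carlo sketch: the outer loop samples an uncertain model parameter, the inner loop estimates survival probability given that parameter, and the spread of the outer-loop estimates expresses vulnerability as a range rather than a single value. All distributions and parameters below are hypothetical:

```python
import random
import statistics

def survival_probability_range(n_outer=200, n_inner=500, seed=1):
    """Nested Monte Carlo: epistemic uncertainty (outer loop) over an
    uncertain capacity parameter, aleatory variation (inner loop) over the
    random load.  Returns (min, median, max) of the survival-probability
    estimates across the epistemic samples."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(n_outer):
        capacity = rng.gauss(1.0, 0.1)   # epistemic: imperfectly known capacity
        survived = sum(rng.gauss(0.7, 0.2) < capacity  # aleatory: random load
                       for _ in range(n_inner))
        estimates.append(survived / n_inner)
    return min(estimates), statistics.median(estimates), max(estimates)
```

The min-to-max band is the "range of values" the report argues should replace a point estimate of survival probability.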

  15. Uncertainty Management in Seismic Vulnerability Assessment Using Granular Computing Based on Covering of Universe

    NASA Astrophysics Data System (ADS)

    Khamespanah, F.; Delavar, M. R.; Zare, M.

    2013-05-01

    An earthquake is an abrupt displacement of the earth's crust caused by the release of strain accumulated along faults or by volcanic eruptions. As a recurring natural cataclysm, earthquakes have always been a matter of concern in Tehran, the capital of Iran, a city lying on a number of known and unknown faults. Earthquakes can cause severe physical, psychological and financial damage. Consequently, procedures should be developed to assist in modelling the potential casualties and their spatial uncertainty. One of these procedures is the production of seismic vulnerability maps for taking preventive measures to mitigate the physical and financial losses of future earthquakes. Since vulnerability assessment is a multi-criteria decision-making problem depending on several parameters and experts' judgments, it is undoubtedly characterized by intrinsic uncertainties. In this study, we attempt to use a granular computing (GrC) model based on a covering of the universe to handle the spatial uncertainty. Granular computing concentrates on a general theory and methodology for problem solving and information processing by assuming multiple levels of granularity. Basic elements in granular computing are subsets, classes, and clusters of a universe, called granules. In this research, GrC is used for extracting classification rules on seismic vulnerability with minimum entropy, to handle the uncertainty in the earthquake data. Tehran was selected as the study area. In our previous research, a granular computing model based on a partition of the universe was employed. That model has limitations in defining similarity between elements of the universe and in defining granules: similarity between elements is defined through an equivalence relation, under which two objects are similar with respect to some attributes only if, for each attribute, their values are equal. In this research, a general relation for defining similarity between

  16. Using Probabilistic Seismic Hazard Analysis in Assessing Seismic Risk for Taipei City and New Taipei City

    NASA Astrophysics Data System (ADS)

    Hsu, Ming-Kai; Wang, Yu-Ju; Cheng, Chin-Tung; Ma, Kuo-Fong; Ke, Siao-Syun

    2016-04-01

    In this study, we evaluate the seismic hazard and risk for Taipei City and New Taipei City, which are important municipalities and the most populous cities in Taiwan. The evaluation of seismic risk involves the combination of three main components: a probabilistic seismic hazard model, an exposure model defining the spatial distribution of elements exposed to the hazard, and vulnerability functions capable of describing the distribution of percentage loss for a set of intensity measure levels. The seismic hazard for Taipei City and New Taipei City is presented as hazard maps in terms of ground-motion values expected to be exceeded at a 10% probability level in 50 years (return period 475 years) and at a 2% probability level in 50 years (return period 2475 years), according to the Taiwan Earthquake Model (TEM), which assesses two seismic hazard models for Taiwan. The first model adopted the source parameters of 38 seismogenic structures identified by the TEM geologists. The other model considered 33 active faults and was published by the Central Geological Survey (CGS), Taiwan, in 2010. The 500 m by 500 m grid-based building data were selected for the evaluation, as they are capable of providing detailed information about the location, value and vulnerability classification of the exposed elements. The results from this study were evaluated with the OpenQuake engine, the open-source software for seismic hazard and risk assessment developed within the Global Earthquake Model (GEM) initiative. Our intention is to make a first attempt at modeling the seismic risk from hazard on an open platform for Taiwan. An analysis through disaggregation of hazard components will also be made to prioritize the risk for further policy making.
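Under the usual Poisson (memoryless) occurrence assumption, the probability-of-exceedance statements quoted in such hazard maps convert directly to return periods via T = -t / ln(1 - p):

```python
import math

def return_period(p_exceed, t_years):
    """Return period (years) for an exceedance probability p_exceed over a
    window of t_years, assuming Poisson occurrence: T = -t / ln(1 - p)."""
    return -t_years / math.log(1.0 - p_exceed)

# 10% in 50 years -> ~475-year return period;
# 2% in 50 years  -> ~2475-year return period.
```

This is exactly the conversion behind the "475-year" and "2475-year" figures in the abstract.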

  17. Measuring Road Network Vulnerability with Sensitivity Analysis

    PubMed Central

    Jun-qiang, Leng; Long-hai, Yang; Liu, Wei-yi; Zhao, Lin

    2017-01-01

    This paper focuses on the development of a method for road network vulnerability analysis from the perspective of capacity degradation, which seeks to identify the critical infrastructures in the road network and the operational performance of the whole traffic system. This research involves defining a traffic utility index and modeling the vulnerability of road segments, routes, OD (origin-destination) pairs and the road network. Meanwhile, a sensitivity analysis method is utilized to calculate the change in the traffic utility index due to capacity degradation. This method, compared to traditional traffic assignment, improves calculation efficiency and makes the application of vulnerability analysis to large real road networks possible. Finally, all the above models and the calculation method are applied to an actual road network evaluation to verify their efficiency and utility. This approach can be used as a decision-supporting tool for evaluating the performance of a road network and identifying critical infrastructures in transportation planning and management, especially in resource allocation for mitigation and recovery. PMID:28125706
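A link-level version of the capacity-degradation idea can be sketched with the standard BPR travel-time function. The vulnerability indicator below (relative travel-time increase under a capacity loss) is an illustrative stand-in for the paper's traffic utility index, and the numbers are hypothetical:

```python
def bpr_travel_time(flow, capacity, t_free, alpha=0.15, beta=4):
    """Bureau of Public Roads link travel-time function."""
    return t_free * (1.0 + alpha * (flow / capacity) ** beta)

def capacity_vulnerability(flow, capacity, t_free, degradation=0.2):
    """Relative travel-time increase when link capacity drops by the given
    fraction -- a simple link-level sensitivity indicator in the spirit of
    the capacity-degradation approach."""
    t_base = bpr_travel_time(flow, capacity, t_free)
    t_degraded = bpr_travel_time(flow, capacity * (1.0 - degradation), t_free)
    return (t_degraded - t_base) / t_base

# A heavily loaded link is far more sensitive to the same 20% capacity loss:
v_heavy = capacity_vulnerability(900, 1000, 10.0)
v_light = capacity_vulnerability(500, 1000, 10.0)
```

Ranking links by such an indicator, rather than re-running full traffic assignment per scenario, is what makes the sensitivity approach scale to large networks.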

  18. Vulnerability

    NASA Technical Reports Server (NTRS)

    Taback, I.

    1979-01-01

    The discussion of vulnerability begins with a description of some of the electrical characteristics of fibers before defining how vulnerability calculations are done. The vulnerability results obtained to date are presented. The discussion touches on post-exposure vulnerability. After a description of some shock hazard work now underway, the discussion leads into a description of the planned effort, and some preliminary conclusions are presented.

  19. Information systems vulnerability: A systems analysis perspective

    SciTech Connect

    Wyss, G.D.; Daniel, S.L.; Schriner, H.K.; Gaylor, T.R.

    1996-07-01

    Vulnerability analyses for information systems are complicated because the systems are often geographically distributed. Sandia National Laboratories has assembled an interdisciplinary team to explore the applicability of probabilistic logic modeling (PLM) techniques (including vulnerability and vital area analysis) to examine the risks associated with networked information systems. The authors have found that the reliability and failure modes of many network technologies can be effectively assessed using fault trees and other PLM methods. The results of these models are compatible with an expanded set of vital area analysis techniques that can model both physical locations and virtual (logical) locations to identify both categories of vital areas simultaneously. These results can also be used with optimization techniques to direct the analyst toward the most cost-effective security solution.
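For independent basic events, the fault-tree side of such probabilistic logic models reduces to simple gate algebra. A minimal sketch (the event probabilities and the example system are hypothetical):

```python
def and_gate(probs):
    """Top-event probability when ALL independent basic events must occur."""
    p = 1.0
    for q in probs:
        p *= q
    return p

def or_gate(probs):
    """Top-event probability when ANY independent basic event suffices."""
    p_none = 1.0
    for q in probs:
        p_none *= (1.0 - q)
    return 1.0 - p_none

# e.g. a network node fails if (power fails AND backup fails) OR its link is cut:
p_fail = or_gate([and_gate([0.01, 0.1]), 0.005])
```

Real tools add common-cause terms and cut-set enumeration on top of these primitives, but the gate logic is the same.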

  20. Aircraft vulnerability analysis by modeling and simulation

    NASA Astrophysics Data System (ADS)

    Willers, Cornelius J.; Willers, Maria S.; de Waal, Alta

    2014-10-01

    Infrared missiles pose a significant threat to civilian and military aviation. ManPADS missiles are especially dangerous in the hands of rogue and undisciplined forces. Yet, not all the launched missiles hit their targets; the miss being either attributable to misuse of the weapon or to missile performance restrictions. This paper analyses some of the factors affecting aircraft vulnerability and demonstrates a structured analysis of the risk and aircraft vulnerability problem. The aircraft-missile engagement is a complex series of events, many of which are only partially understood. Aircraft and missile designers focus on the optimal design and performance of their respective systems, often testing only in a limited set of scenarios. Most missiles react to the contrast intensity, but the variability of the background is rarely considered. Finally, the vulnerability of the aircraft depends jointly on the missile's performance and the doctrine governing the missile's launch. These factors are considered in a holistic investigation. The view direction, altitude, time of day, sun position, latitude/longitude and terrain determine the background against which the aircraft is observed. Especially high gradients in sky radiance occur around the sun and on the horizon. This paper considers uncluttered background scenes (uniform terrain and clear sky) and presents examples of background radiance at all view angles across a sphere around the sensor. A detailed geometrical and spatially distributed radiometric model is used to model the aircraft. This model provides the signature at all possible view angles across the sphere around the aircraft. The signature is determined in absolute terms (no background) and in contrast terms (with background). It is shown that the background significantly affects the contrast signature as observed by the missile sensor. 
A simplified missile model is constructed by defining the thrust and mass profiles, maximum seeker tracking rate, maximum

  1. Seismic vulnerability of leaning masonry towers located in Emilia-Romagna region, Italy: FE analyses of four case studies

    NASA Astrophysics Data System (ADS)

    Milani, Gabriele; Shehu, Rafael; Valente, Marco

    2016-12-01

    Four inclined masonry towers are investigated in this paper in order to study their behavior under horizontal loads and the role of inclination in their seismic vulnerability. The towers are located in north-eastern Italy, a moderate-seismicity region that was recently struck by an earthquake sequence with two major events of magnitude 5.8 and 5.9. These towers date back four to nine centuries and are well representative of the towers of the region. They present a meaningful inclination accumulated over the years, which has a significant influence on their bearing capacity under lateral loads. Some retrofitting interventions were recently carried out, introducing tendons and hooping systems in order to ensure box behavior and preclude the spreading of dangerous cracks due to the insufficient tensile strength of the masonry material. The structural behavior of the towers under horizontal loads is influenced by several geometrical issues, such as slenderness, wall thickness, perforations and irregularities, but the main aim of the paper is to provide insight into the role played by inclination. The case studies are chosen to exhibit different values of slenderness in order to cover a large range of geometrical cases for the assessment of the seismic vulnerability of the towers. Numerical analyses are carried out considering the effects of the retrofitting interventions as well. As expected, pushover analyses show that inclination may increase the seismic vulnerability of masonry towers when the results obtained for the real inclined case are compared with those for the hypothetical vertical case.

  2. Vulnerability assessment using two complementary analysis tools

    SciTech Connect

    Paulus, W.K.

    1993-07-01

    To analyze the vulnerability of nuclear materials to theft or sabotage, Department of Energy facilities have been using, since 1989, a computer program called ASSESS (Analytic System and Software for Evaluation of Safeguards and Security). During the past year Sandia National Laboratories has begun using an additional program, SEES (Security Exercise Evaluation Simulation), enhancing the picture of vulnerability beyond what either program achieves alone. ASSESS analyzes all possible paths of attack on a target and, assuming that an attack occurs, ranks them by the probability that a response force of adequate size can interrupt the attack before theft or sabotage is accomplished. A Neutralization module pits a security force, collectively, against the interrupted adversary force in a firefight and calculates the probability that the adversaries are defeated. SEES examines a single scenario and simulates in detail the interactions among all combatants. Its output includes shots fired between shooter and target, and the resulting hits and kills. Whereas ASSESS gives breadth of analysis, expressed statistically and performed relatively quickly, SEES adds depth of detail, modeling tactical behavior. ASSESS finds scenarios that exploit the greatest weaknesses of a facility; SEES explores these scenarios to demonstrate in detail how various tactics to nullify the attack might work out. Without ASSESS to find the facility's weaknesses, it is difficult to focus SEES objectively on scenarios worth analyzing. Without SEES to simulate the details of the response-versus-adversary interaction, it is not possible to test tactical assumptions and hypotheses. Using both programs together, vulnerability analyses achieve both breadth and depth.
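The path-ranking logic described above can be sketched in a few lines. This is a minimal illustration, not ASSESS internals: the attack paths, the probabilities and the P(effectiveness) = P(interruption) × P(neutralization) combination are all assumptions made for the example.

```python
# Toy sketch of combining path-level interruption probability with a
# separately computed neutralization probability. All names and numbers
# below are invented for illustration.

def system_effectiveness(p_interrupt: float, p_neutralize: float) -> float:
    """Probability that an attack is both interrupted and defeated."""
    return p_interrupt * p_neutralize

# Hypothetical attack paths with per-path interruption probabilities.
paths = {
    "fence-door-vault": 0.92,
    "roof-hatch-vault": 0.60,
    "loading-dock-vault": 0.75,
}
p_neutralize = 0.85  # from a separate engagement (firefight) model

# Rank paths from weakest to strongest; the weakest path drives the
# facility's overall vulnerability.
ranked = sorted(paths.items(), key=lambda kv: kv[1])
worst_path, worst_pi = ranked[0]
print(worst_path, round(system_effectiveness(worst_pi, p_neutralize), 3))
```

The point of the sketch is the division of labor the abstract describes: a breadth tool supplies the per-path probabilities, while a depth tool (here reduced to the single `p_neutralize` constant) refines the engagement outcome.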

  3. Uncertainty Analysis and Expert Judgment in Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Klügel, Jens-Uwe

    2011-01-01

    The large uncertainty associated with the prediction of future earthquakes is usually regarded as the main reason for the increased hazard estimates that have resulted from some recent large-scale probabilistic seismic hazard analysis studies (e.g. the PEGASOS study in Switzerland and the Yucca Mountain study in the USA). It is frequently overlooked that such increased hazard estimates are characteristic of a single specific method of probabilistic seismic hazard analysis (PSHA): the traditional (Cornell-McGuire) PSHA method, which has found its highest level of sophistication in the SSHAC probability method. Based on a review of the SSHAC probability model and its application in the PEGASOS project, it is shown that the surprising results of recent PSHA studies can be explained to a large extent by the uncertainty model used in traditional PSHA, which deviates from the state of the art in mathematics and risk analysis. This uncertainty model, the Ang-Tang uncertainty model, mixes concepts of decision theory with probabilistic hazard assessment methods, leading to an overestimation of uncertainty in comparison to empirical evidence. Although expert knowledge can be a valuable source of scientific information, its incorporation into the SSHAC probability method does not resolve the issue of inflated uncertainties in PSHA results. Other, more data-driven PSHA approaches in use in some European countries are less vulnerable to this effect. The most valuable alternative to traditional PSHA is the direct probabilistic scenario-based approach, which is closely linked with emerging neo-deterministic methods based on waveform modelling.

  4. Seismic analysis of nuclear power plant structures

    NASA Technical Reports Server (NTRS)

    Go, J. C.

    1973-01-01

    Primary structures for nuclear power plants are designed to resist the expected earthquakes at the site. Two design intensities are considered: the Operating Basis Earthquake and the Design Basis Earthquake. These structures are required to accommodate these seismic loadings without loss of functional integrity; thus, no plastic yield is allowed. The application of NASTRAN in analyzing some of these seismically induced structural dynamics problems is described. NASTRAN, with some modifications, can be used to analyze most structures that are subjected to seismic loads. A brief review of the formulation of seismically induced structural dynamics is also presented. Two typical structural problems were selected to illustrate the application of the various methods of seismic structural analysis with the NASTRAN system.

  5. Method and tool for network vulnerability analysis

    DOEpatents

    Swiler, Laura Painton; Phillips, Cynthia A.

    2006-03-14

    A computer system analysis tool and method that will allow for qualitative and quantitative assessment of security attributes and vulnerabilities in systems including computer networks. The invention is based on generation of attack graphs wherein each node represents a possible attack state and each edge represents a change in state caused by a single action taken by an attacker or unwitting assistant. Edges are weighted using metrics such as attacker effort, likelihood of attack success, or time to succeed. Generation of an attack graph is accomplished by matching information about attack requirements (specified in "attack templates") to information about computer system configuration (contained in a configuration file that can be updated to reflect system changes occurring during the course of an attack) and assumed attacker capabilities (reflected in "attacker profiles"). High risk attack paths, which correspond to those considered suited to application of attack countermeasures given limited resources for applying countermeasures, are identified by finding "epsilon optimal paths."
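The weighted attack-graph idea above lends itself to a compact sketch. The graph, node names and "effort" weights below are invented for illustration; a real tool would generate the graph from attack templates and a configuration file, and would enumerate near-optimal ("epsilon optimal") paths rather than just the single cheapest one.

```python
import heapq

# Minimal sketch: rank attack paths on a weighted attack graph by total
# attacker effort. Nodes are attack states; edge weights are assumed
# effort costs.
graph = {
    "start":       [("phish-user", 2), ("scan-dmz", 1)],
    "phish-user":  [("user-shell", 1)],
    "scan-dmz":    [("exploit-web", 4)],
    "user-shell":  [("root", 2)],
    "exploit-web": [("root", 1)],
}

def min_effort(graph, src, dst):
    """Dijkstra: least total attacker effort to reach dst from src."""
    dist = {src: 0}
    pq = [(0, src)]
    while pq:
        d, node = heapq.heappop(pq)
        if node == dst:
            return d
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry
        for nxt, w in graph.get(node, []):
            nd = d + w
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                heapq.heappush(pq, (nd, nxt))
    return float("inf")

print(min_effort(graph, "start", "root"))  # phishing path: 2 + 1 + 2 = 5
```

With effort as the edge metric, low-cost paths are the high-risk ones; swapping in probability-of-success or time-to-succeed weights changes the metric but not the search.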

  6. Static analysis assisted vulnerability-oriented evolutionary fuzzing

    NASA Astrophysics Data System (ADS)

    Yang, Gang; Feng, Chao; Tang, Chaojing

    2017-03-01

    The blindness of fuzzing tests is the main reason for their inefficiency in practical vulnerability discovery. In this paper, we propose a static-analysis-assisted, vulnerability-oriented fuzzing technology. Through static analysis, the suspect vulnerable code areas can be located roughly; combined with dynamic updates based on fuzzing feedback, the vulnerable code areas can then be located more accurately. By applying the distances to the located vulnerable code areas as one of the metrics of the evolutionary fuzzing test, vulnerability-oriented test cases are generated. The experimental results showed that the method proposed in this paper can effectively improve the vulnerability discovery efficiency of fuzzing tests.
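The distance-guided fitness idea can be illustrated with a toy evolutionary loop. Everything here is an assumption made for the sketch: the suspect block ids, the control-flow distances and the one-block "execution trace"; a real fuzzer would obtain traces from instrumentation and distances from the program's control-flow graph.

```python
import random

# Toy sketch of distance-to-suspect-code fitness for evolutionary fuzzing.
random.seed(7)

# Hypothetical distances (in CFG edges) from each basic block to the
# nearest suspect area flagged by static analysis; 0 = inside the area.
SUSPECT_DISTANCE = {40: 2, 41: 1, 42: 0, 57: 0}

def run_target(data: bytes):
    """Stand-in for an instrumented run: maps an input to a block trace."""
    return [data[0] % 64]

def fitness(trace):
    """Smaller is better: closest approach to any suspect block."""
    return min(SUSPECT_DISTANCE.get(block, 99) for block in trace)

def mutate(data: bytes) -> bytes:
    i = random.randrange(len(data))
    return data[:i] + bytes([random.randrange(256)]) + data[i + 1:]

def evolve(population, generations=30):
    for _ in range(generations):
        population.sort(key=lambda d: fitness(run_target(d)))
        parents = population[: len(population) // 2]   # keep the better half
        population = parents + [mutate(p) for p in parents]
    return population

seeds = [bytes([random.randrange(256)]) for _ in range(20)]
best = min(evolve(seeds), key=lambda d: fitness(run_target(d)))
```

Selection steadily concentrates the population on inputs whose traces approach the statically flagged areas, which is the "vulnerability-oriented" bias the abstract describes.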

  7. AVVAM-1 (Armored Vehicle Vulnerability Analysis Model) and Tank Vulnerability Sensitivity Studies

    DTIC Science & Technology

    1973-01-01

    AVVAM-1 (Armored Vehicle Vulnerability Analysis Model, first version) is a conceptual model and associated digital computer program. It introduces a measure of the ballistic shielding provided by a tank component: the higher this value, the harder it is for a spall fragment to perforate the component. The sensitivity studies examine vulnerability to ballistic damage from behind-the-armor fragments and the ability of noncritical components to provide ballistic shielding.

  8. Verification the data on critical facilities inventory and vulnerability for seismic risk assessment taking into account possible accidents

    NASA Astrophysics Data System (ADS)

    Frolova, Nina; Larionov, Valery; Bonnin, Jean; Ugarov, Aleksander

    2015-04-01

    The paper contains the results of a recent study carried out by the Seismological Center of IGE, Russian Academy of Sciences, and the Extreme Situations Research Center within the Russian Academy of Sciences project "Theoretical and Methodological basis for seismic risk assessment taking into account technological accidents at local level; constructing the seismic risk maps for the Big Sochi City territory including the venue of Olympic Games facilities." A procedure for verifying the critical facilities inventory and vulnerability, which makes use of space images and web technologies in social networks, is presented. Numerical values of the criteria for accidents at fire- and chemical-hazardous facilities triggered by strong earthquakes are obtained. The seismic risk maps for the Big Sochi City territory, including the Olympic Games venue, were constructed taking into account new data on critical facilities obtained using panoramic photos of these facilities, high-resolution space images and web technologies. The obtained values of individual seismic risk that take secondary technological accidents into account exceed the values of seismic risk without secondary hazards (return period T = 500 years) by 0.5-1.0 x 10^-5 per year.

  9. Seismic Analysis of Intake Towers

    DTIC Science & Technology

    1982-10-01

    Structures Laboratory, Structural Engineering, P. O. Box 631, Vicksburg, Miss. 39180 (Research Work Unit 31588). Unclassified. ...needed for a controlled release of the reservoir to repair any seismic damage in the damming structure. The high cost associated with these criteria for a...

  10. Global analysis of urban surface water supply vulnerability

    NASA Astrophysics Data System (ADS)

    Padowski, Julie C.; Gorelick, Steven M.

    2014-10-01

    This study presents a global analysis of urban water supply vulnerability in 71 surface-water-supplied cities with populations exceeding 750 000 and lacking source water diversity. Vulnerability represents the failure of an urban supply basin to simultaneously meet the demands of human, environmental and agricultural users. We assess a baseline (2010) condition and a future scenario (2040) that considers increased demand from urban population growth and projected agricultural demand. We do not account for climate change, which can potentially exacerbate or reduce urban supply vulnerability. In 2010, 35% of large cities are vulnerable as they compete with agricultural users. By 2040, without additional measures, 45% of cities are vulnerable due to increased agricultural and urban demands. Of the vulnerable cities in 2040, the majority are river-supplied, with mean flows so low (1200 liters per person per day, l/p/d) that the cities experience ‘chronic water scarcity’ (1370 l/p/d). Reservoirs supply the majority of cities facing individual future threats, revealing that constructed storage potentially provides tenuous water security. In 2040, of the 32 vulnerable cities, 14 would reduce their vulnerability by reallocating water through reduced environmental flows, and 16 would similarly benefit by transferring water from irrigated agriculture. Approximately half remain vulnerable under either potential remedy.
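The vulnerability criterion defined above (a basin failing to meet all users simultaneously) reduces to a simple budget check. The city names and flow figures below are invented placeholders, not values from the study.

```python
# Toy sketch of the supply-basin vulnerability test: a city is vulnerable
# if combined urban, environmental and agricultural demands exceed supply.
# Units are arbitrary (e.g. 10^6 m^3/yr); all numbers are illustrative.

cities = {
    "city-a": {"supply": 900,  "urban": 300, "env": 250, "agri": 400},
    "city-b": {"supply": 1200, "urban": 350, "env": 300, "agri": 450},
}

def vulnerable(c: dict) -> bool:
    return c["urban"] + c["env"] + c["agri"] > c["supply"]

print({name: vulnerable(c) for name, c in cities.items()})
```

The study's two remedies map directly onto this check: reducing `env` (environmental reallocation) or `agri` (agricultural transfer) can flip a city from vulnerable to not.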

  11. Seismic analysis of a vacuum vessel

    SciTech Connect

    Chen, W.W.

    1993-01-01

    This paper presents the results of the seismic analysis for the preliminary design of a vacuum vessel for the ground engineering system (GES) of the SP-100 project. It describes the method of calculating the elevated seismic response spectra at various levels within the vacuum vessel using the simplified computer code developed by Weiner. A modal superposition analysis under design response spectrum loading was performed for a three-dimensional finite-element model using the general-purpose finite-element code ANSYS. The in-vessel elevated seismic response spectra at various levels in the vacuum vessel, along with the vessel mode shapes and frequencies, are presented. Also included are descriptions of the results of the modal analyses for some significant preliminary design points at various elevations of the vessel.
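The modal superposition step described above combines per-mode peak responses into a single design value; one common combination rule is SRSS (square root of the sum of squares). The frequencies, participation factors and spectral values below are invented for the sketch, and SRSS is only one of several combination rules a code like ANSYS offers.

```python
import math

# Hedged sketch of response-spectrum modal combination by SRSS.
modes = [
    # (modal participation factor, spectral acceleration in g) -- assumed
    (1.30, 0.80),
    (0.45, 1.10),
    (0.20, 0.95),
]

# Per-mode peak response, then SRSS combination across modes.
peak = math.sqrt(sum((gamma * sa) ** 2 for gamma, sa in modes))
print(f"SRSS peak response: {peak:.3f} g")
```

SRSS assumes well-separated modal frequencies; closely spaced modes would call for a rule such as CQC instead.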

  12. Transverse seismic analysis of buried pipelines

    SciTech Connect

    Mavridis, G.A.; Pitilakis, K.D.

    1995-12-31

    The objective of this study is to develop an analytical procedure for calculating upper bounds on stresses and strains for the case of transverse seismic shaking of continuous buried pipelines, taking into account soil-pipeline interaction effects. A sensitivity analysis of some critical parameters is performed. The influence of various parameters, such as the apparent propagation velocity, the frequency content of the seismic ground excitation, the dynamic soil properties and the pipe's material and size, on the ratio of pipe to ground displacement amplitudes, and consequently on the induced pipe strains, is studied parametrically.

  13. Quantitative Analysis of Seismicity in Iran

    NASA Astrophysics Data System (ADS)

    Raeesi, Mohammad; Zarifi, Zoya; Nilfouroushan, Faramarz; Boroujeni, Samar Amini; Tiampo, Kristy

    2017-03-01

    We use historical and recent major earthquakes and GPS geodetic data to compute seismic strain rate, geodetic slip deficit, static stress drop, the parameters of the magnitude-frequency distribution and geodetic strain rate in the Iranian Plateau to identify seismically mature fault segments and regions. Our analysis suggests that 11 fault segments are in the mature stage of the earthquake cycle, with the possibility of generating major earthquakes. These faults primarily are located in the north and the east of Iran. Four seismically mature regions in southern Iran with the potential for damaging strong earthquakes are also identified. We also delineate four additional fault segments in Iran that can generate major earthquakes without robust clues to their maturity. The most important fault segment in this study is the strike-slip system near the capital city of Tehran, with the potential to cause more than one million fatalities.

  15. Information Systems Vulnerability: A Systems Analysis Perspective

    DTIC Science & Technology

    1996-06-01


  16. Vulnerability analysis for design of bridge health monitoring system

    NASA Astrophysics Data System (ADS)

    Sun, L. M.; Yu, G.

    2010-03-01

    Recent engineering implementations of health monitoring systems for long-span bridges have shown how difficult it is to precisely assess structural physical condition and to reliably alarm on structural damage, even when hundreds of sensors are installed on a structure and a great amount of data is collected by the monitoring system. The allocation of sensors and the alarming algorithm remain two of the most important tasks in designing a structural health monitoring system. Vulnerability, in its original meaning, is a system's susceptibility to local damage. For a structural system, vulnerability can thus be regarded as the susceptibility of structural performance to local damage of the structure. The purpose of this study is to propose concepts and methods of structural vulnerability for determining which monitoring components are more vulnerable than others, and the corresponding warning threshold once damage occurs. The structural vulnerability to various damage scenarios depends upon the structural geometrical topology, the loading pattern on the structure and the degradation of component performance. A two-parameter structural vulnerability evaluation method is proposed in this paper. The parameters are the damage consequence and the relative magnitude of the damage scenario with respect to the structural system. Structural vulnerability to various damage scenarios can be regarded as the trade-off between these two parameters. Based on the results of the structural vulnerability analysis, the limited structural information from health monitoring can be utilized efficiently. The approach to the design of a bridge health monitoring system is illustrated for a cable-stayed bridge.
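The two-parameter trade-off can be sketched as a ranking exercise. The damage scenarios, the numbers and the consequence/magnitude ratio used as a vulnerability index are all illustrative assumptions, not the paper's actual formulation.

```python
# Toy sketch of ranking damage scenarios by the trade-off between damage
# consequence and the relative magnitude (size) of the damage. A large
# consequence caused by a small damage scenario marks a vulnerable spot
# worth instrumenting. All values are invented.

scenarios = {
    # name: (consequence C in [0, 1], relative damage magnitude D in (0, 1])
    "cable-loss-midspan": (0.70, 0.05),
    "deck-corrosion":     (0.30, 0.40),
    "tower-base-crack":   (0.90, 0.15),
}

def vulnerability_index(consequence: float, magnitude: float) -> float:
    """Higher index = disproportionate consequence from small damage."""
    return consequence / magnitude

ranked = sorted(scenarios,
                key=lambda k: vulnerability_index(*scenarios[k]),
                reverse=True)
print(ranked)  # most vulnerability-critical scenario first
```

Under this sketch, sensors and alarm thresholds would be allocated to the scenarios at the top of the ranking, which is the design logic the abstract argues for.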

  17. Seismic Vulnerability Assessment Waste Characterization Reduction and Repackaging Building, TA-50-69

    SciTech Connect

    M. W. Sullivan; J. Ruminer; I. Cuesta

    2003-02-02

    This report presents the results of the seismic structural analyses completed on the Waste Characterization Reduction and Repackaging (WCRR) Building in support of ongoing safety analyses. WCRR is designated as TA-50-69 at Los Alamos National Laboratory, Los Alamos, New Mexico. The facility has been evaluated against Department of Energy (DOE) seismic criteria for Natural Phenomena Hazards (NPH) Performance Category II (PC 2). The seismic capacities of two subsystems within the WCRR building, the material handling glove box and the lift rack immediately adjacent to the Glove Box are also documented, and the results are presented.

  18. Seismic vulnerability of the Himalayan half-dressed rubble stone masonry structures, experimental and analytical studies

    NASA Astrophysics Data System (ADS)

    Ahmad, N.; Ali, Q.; Ashraf, M.; Alam, B.; Naeem, A.

    2012-11-01

    Half-dressed rubble stone (DS) masonry structures as found in the Himalayan region are investigated through experimental and analytical studies. The experimental study included a shake-table test on a one-third-scale structural model, representative of the DS masonry structures employed for critical public facilities, e.g. school buildings, offices, health care units, etc. The aim of the experimental study was to understand the damage mechanism of the model, develop a damage scale for deformation-based assessment, and retrieve the lateral force-deformation response of the model as well as its elastic dynamic properties, i.e. fundamental vibration period and elastic damping. The analytical study included fragility analysis of building prototypes using a fully probabilistic nonlinear dynamic method. The prototypes are modeled as SDOF systems assigned the experimentally obtained lateral force-deformation constitutive law. Uncertainties in the constitutive law, i.e. in lateral stiffness, strength and deformation limits, are considered through random Monte Carlo simulation. Fifty prototype buildings are analyzed using a suite of ten natural accelerograms and an incremental dynamic analysis technique. Fragility and vulnerability functions are derived for the damageability assessment of structures and for economic loss and casualty estimation during an earthquake, given the ground shaking intensity, as required in the risk assessment of the existing stock aiming at risk mitigation and disaster risk reduction.
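The Monte Carlo fragility idea above can be sketched compactly: sample uncertain capacities, count how often a given shaking intensity exceeds them. The median capacity, dispersion and lognormal form below are assumptions for the sketch, not values from the paper, and a single scalar capacity stands in for the full nonlinear dynamic analysis.

```python
import math
import random

# Illustrative Monte Carlo fragility curve: P(damage state | PGA),
# with capacities drawn from an assumed lognormal distribution.
random.seed(1)
MEDIAN_CAPACITY = 0.35   # g, assumed median PGA capacity at a damage state
BETA = 0.4               # assumed lognormal dispersion

def fragility(pga: float, n: int = 20000) -> float:
    """Fraction of sampled capacities that the given PGA exceeds."""
    hits = sum(1 for _ in range(n)
               if MEDIAN_CAPACITY * math.exp(random.gauss(0.0, BETA)) <= pga)
    return hits / n

for pga in (0.1, 0.35, 0.7):
    print(pga, round(fragility(pga), 3))
```

By construction the curve passes through about 0.5 at the median capacity; in the paper's workflow, the sampled capacities would come from incremental dynamic analyses of the fifty prototypes rather than from a closed-form distribution.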

  19. RSEIS and RFOC: Seismic Analysis in R

    NASA Astrophysics Data System (ADS)

    Lees, J. M.

    2015-12-01

    Open software is essential for reproducible scientific exchange. R packages provide a platform for the development of seismological investigation software that can be properly documented and traced for data processing. A suite of R packages designed for a wide range of seismic analyses is currently available in the free software platform called R. R is a software platform based on the S language developed at Bell Labs decades ago. Routines in R can be run as standalone function calls or developed in object-oriented mode. R comes with a base set of routines and thousands of user-developed packages. The packages developed at UNC include subroutines and interactive codes for processing seismic data, analyzing geographic information (GIS) and inverting data involved in a variety of geophysical applications. The packages related to seismic analysis currently available on CRAN (Comprehensive R Archive Network, http://www.r-project.org/) are RSEIS, Rquake, GEOmap, RFOC, zoeppritz, RTOMO, geophys, Rwave, PEIP, hht and rFDSN. These include signal processing, data management, mapping, earthquake location, deconvolution, focal mechanisms, wavelet transforms, Hilbert-Huang transforms, tomographic inversion and Mogi deformation, among other useful functionality. All software in R packages is required to have detailed documentation, making the exchange and modification of existing software easy. In this presentation, I will focus on the packages RSEIS and RFOC, showing examples from a variety of seismic analyses. The R approach has similarities to the popular (and expensive) MATLAB platform, although R is open source and free to download.

  20. A Methodology For Flood Vulnerability Analysis In Complex Flood Scenarios

    NASA Astrophysics Data System (ADS)

    Figueiredo, R.; Martina, M. L. V.; Dottori, F.

    2015-12-01

    Nowadays, flood risk management is gaining importance as a means to mitigate and prevent flood disasters, and consequently the analysis of flood vulnerability is becoming a key research topic. In this paper, we propose a methodology for large-scale analysis of flood vulnerability. The methodology is based on a GIS-based index, which considers local topography, terrain roughness and basic information about the flood scenario to reproduce the diffusive behaviour of floodplain flow. The methodology synthesizes the spatial distribution of index values into maps and curves used to represent the vulnerability in the area of interest. Its application allows for considering different levels of complexity of flood scenarios, from localized flood defence failures to complex hazard scenarios involving river reaches. The components of the methodology are applied and tested in two floodplain areas in Northern Italy recently affected by floods. The results show that the methodology can provide original and valuable insight into flood vulnerability variables and processes.

  1. Vulnerability-attention analysis for space-related activities

    NASA Technical Reports Server (NTRS)

    Ford, Donnie; Hays, Dan; Lee, Sung Yong; Wolfsberger, John

    1988-01-01

    Techniques for representing and analyzing trouble spots in structures and processes are discussed. Identification of vulnerable areas usually depends more on particular and often detailed knowledge than on algorithmic or mathematical procedures. In some cases, machine inference can facilitate the identification. The analysis scheme proposed first establishes the geometry of the process, then marks areas that are conditionally vulnerable. This provides a basis for advice on the kinds of human attention or machine sensing and control that can make the risks tolerable.

  2. Reservoir permeability from seismic attribute analysis

    SciTech Connect

    Silin, Dmitriy; Goloshubin, G.; Silin, D.; Vingalov, V.; Takkand, G.; Latfullin, M.

    2008-02-15

    For a porous fluid-saturated medium, Biot's poroelasticity theory predicts a movement of the pore fluid relative to the skeleton during seismic wave propagation through the medium. This phenomenon opens an opportunity for investigating the flow properties of hydrocarbon-saturated reservoirs. It is well known that relative fluid movement becomes negligible at seismic frequencies if the porous material is homogeneous and well cemented; in this case the theory underestimates seismic wave velocity dispersion and attenuation. Based on Biot's theory, Helle et al. (2003) numerically demonstrated the substantial effects of heterogeneous permeability and saturation in the rocks on both velocity and attenuation. Besides the fluid-flow effect, scattering effects (Gurevich et al., 1997) play a very important role in the case of finely layered porous rocks and heterogeneous fluid saturation. We have used both fluid-flow and scattering effects to derive a frequency-dependent seismic attribute that is proportional to fluid mobility, and have applied it to the analysis of reservoir permeability.

  3. Constraints on Long-Term Seismic Hazard From Vulnerable Stalagmites for the surroundings of Katerloch cave, Austria

    NASA Astrophysics Data System (ADS)

    Gribovszki, Katalin; Bokelmann, Götz; Mónus, Péter; Kovács, Károly; Kalmár, János

    2016-04-01

    Earthquakes hit urban centers in Europe infrequently, but occasionally with disastrous effects. This raises an important issue for society: how to react to the natural hazard? Potential damages are huge, and the infrastructure costs of addressing these hazards are huge as well. Obtaining an unbiased view of seismic hazard (and risk) is therefore very important. In principle, the best way to test Probabilistic Seismic Hazard Assessments (PSHA) is to compare them with observations that are entirely independent of the procedure used to produce the PSHA models. Arguably, the most valuable information in this context is information on long-term hazard, namely maximum intensities (or magnitudes) occurring over time intervals that are at least as long as a seismic cycle. Such information would be very valuable, even if it concerned only a single site. Long-term information can in principle be gained from intact stalagmites in natural karstic caves. These have survived all earthquakes that have occurred over thousands of years, depending on the age of the stalagmite. Their "survival" requires that the horizontal ground acceleration has never exceeded a certain critical value within that period. We focus here on a case study from the Katerloch cave close to the city of Graz, Austria. A specially shaped (candlestick style: tall, slim and more or less cylindrical) intact and vulnerable stalagmite (IVSTM) in the Katerloch cave was examined in 2013 and 2014. This IVSTM is suitable for estimating an upper limit on the horizontal peak ground acceleration generated by prehistoric earthquakes. For this cave, we have extensive age information (e.g., Boch et al., 2006, 2010). The approach used in our study yields significant new constraints on seismic hazard, as the intactness of the stalagmite suggests that tectonic structures close to Katerloch cave, in particular the Mur-Mürz fault, did not generate very strong paleoearthquakes in the last few thousand years.
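The survival argument can be turned into a number with a simple beam-theory bound: treating the stalagmite as a uniform cylindrical cantilever, horizontal acceleration a produces a base bending stress of about 2*rho*a*H^2/R, so the acceleration it has demonstrably survived is bounded by a_crit = sigma_t*R/(2*rho*H^2). The tensile strength, radius and height below are illustrative round numbers, not the Katerloch measurements.

```python
# Back-of-envelope sketch: upper-bound horizontal ground acceleration a
# candlestick stalagmite can have survived, from cantilever bending of a
# uniform cylinder. All inputs are assumed example values.

def critical_acceleration(sigma_t: float, radius: float, height: float,
                          density: float = 2700.0) -> float:
    """a_crit in m/s^2 for tensile strength sigma_t (Pa), dimensions in m."""
    return sigma_t * radius / (2.0 * density * height ** 2)

a_crit = critical_acceleration(sigma_t=1.0e6, radius=0.03, height=3.0)
print(f"{a_crit:.2f} m/s^2 (~{a_crit / 9.81:.3f} g)")
```

The quadratic height dependence is why tall, slim ("candlestick") stalagmites are the sensitive recorders: doubling the height quarters the acceleration needed to break them.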

  4. GIS modelling of seismic vulnerability of residential fabrics considering geotechnical, structural, social and physical distance indicators in Tehran city using multi-criteria decision-making (MCDM) techniques

    NASA Astrophysics Data System (ADS)

    Rezaie, F.; Panahi, M.

    2014-09-01

    The main issue in determining seismic vulnerability is having a comprehensive view of all probable damages related to earthquake occurrence. Therefore, taking factors such as peak ground acceleration (PGA) at the time of earthquake occurrence, the type of structures, population distribution among different age groups, level of education and the physical distance to hospitals (or medical care centers) into account, and categorizing them under the four indicators of geotechnical, structural, social and physical distance to needed facilities and distance from dangerous ones, will provide a better and more exact outcome. To this end, in this paper the analytic hierarchy process (AHP) is used to determine the importance of the criteria or alternatives, and a geographical information system (GIS) is used to study the vulnerability of the Tehran metropolis to an earthquake. This study focuses on the fact that Tehran is surrounded by three active and major faults: the Mosha, North Tehran and Rey faults. In order to comprehensively determine the vulnerability, three scenarios are developed. In each scenario, the seismic vulnerability of different areas in Tehran city is analysed and classified into four levels: high, medium, low and safe. The results show that, regarding seismic vulnerability, the faults of Mosha, North Tehran and Rey make, respectively, 6, 16 and 10% of the Tehran area highly vulnerable, while 34, 14 and 27% are safe.

  5. GIS modeling of seismic vulnerability of residential fabrics considering geotechnical, structural, social and physical distance indicators in Tehran using multi-criteria decision-making techniques

    NASA Astrophysics Data System (ADS)

    Rezaie, F.; Panahi, M.

    2015-03-01

    The main issue in determining seismic vulnerability is having a comprehensive view of all probable damages related to earthquake occurrence. Therefore, taking into account factors such as peak ground acceleration at the time of earthquake occurrence, the type of structures, population distribution among different age groups, level of education and the physical distance to hospitals (or medical care centers) and categorizing them into four indicators of geotechnical, structural, social and physical distance to needed facilities and from dangerous ones will provide us with a better and more exact outcome. To this end, this paper uses the analytic hierarchy process to study the importance of criteria or alternatives and uses the geographical information system to study the vulnerability of Tehran to an earthquake. This study focuses on the fact that Tehran is surrounded by three active and major faults: Mosha, North Tehran and Rey. In order to comprehensively determine the vulnerability, three scenarios are developed. In each scenario, seismic vulnerability of different areas in Tehran is analyzed and classified into four levels: high, medium, low and safe. The results show that, regarding seismic vulnerability, the faults of Mosha, North Tehran and Rey make, respectively, 6, 16 and 10% of Tehran highly vulnerable, while 34, 14 and 27% is safe.
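The AHP step used in both versions of this study can be sketched in a few lines: derive criterion weights from a pairwise comparison matrix, here via the common geometric-mean approximation to the principal eigenvector. The comparison matrix over the paper's four indicators is invented for illustration and is not the authors' matrix.

```python
import math

# Minimal AHP sketch: criterion weights from a Saaty-scale pairwise
# comparison matrix using the geometric-mean (row) approximation.
criteria = ["geotechnical", "structural", "social", "distance"]

# pairwise[i][j] = assumed importance of criterion i relative to j.
pairwise = [
    [1,     2,     4,   3],
    [1/2,   1,     3,   2],
    [1/4,   1/3,   1,   1/2],
    [1/3,   1/2,   2,   1],
]

row_gm = [math.prod(row) ** (1 / len(row)) for row in pairwise]
total = sum(row_gm)
weights = dict(zip(criteria, (g / total for g in row_gm)))

for name, w in weights.items():
    print(f"{name:13s} {w:.3f}")
```

In the GIS step, each map layer's score would then be multiplied by its weight and summed per zone to yield the high/medium/low/safe classification.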

  6. WHE-PAGER Project: A new initiative in estimating global building inventory and its seismic vulnerability

    USGS Publications Warehouse

    Porter, K.A.; Jaiswal, K.S.; Wald, D.J.; Greene, M.; Comartin, Craig

    2008-01-01

    The U.S. Geological Survey’s Prompt Assessment of Global Earthquakes for Response (PAGER) Project and the Earthquake Engineering Research Institute’s World Housing Encyclopedia (WHE) are creating a global database of building stocks and their earthquake vulnerability. The WHE already represents a growing, community-developed public database of global housing and its detailed structural characteristics. It currently contains more than 135 reports on particular housing types in 40 countries. The WHE-PAGER effort extends the WHE in several ways: (1) by addressing non-residential construction; (2) by quantifying the prevalence of each building type in both rural and urban areas; (3) by addressing day and night occupancy patterns; (4) by adding quantitative vulnerability estimates from judgment or statistical observation; and (5) by analytically deriving alternative vulnerability estimates using, in part, laboratory testing.

  7. Analysis of seismic events in and near Kuwait

    SciTech Connect

    Harris, D B; Mayeda, K M; Rodgers, A J; Ruppert, S D

    1999-05-11

    Seismic data for events in and around Kuwait were collected and analyzed. The authors estimated event moment, focal mechanism and depth by waveform modeling. Results showed that reliable seismic source parameters for events in and near Kuwait can be estimated from a single broadband three-component seismic station. This analysis will advance understanding of earthquake hazard in Kuwait.

  8. Pattern dynamics analysis of seismic catalogs

    NASA Astrophysics Data System (ADS)

    Tiampo, K.; Rundle, J.; Klein, W.; McGinnis, S.; Posadas, A.; Fernández, J.; Luzón, F.

    2003-04-01

    The historical earthquake record, while not complete, spans hundreds to thousands of years of human history. As a result, large, extended fault systems such as those in California are known to demonstrate complex space-time seismicity patterns, which include, but are not limited to, repetitive events, precursory activity and quiescence, and aftershock sequences (Mogi, 1969; Keilis-Borok et al., 1980; Kanamori, 1981; Kagan and Jackson, 1992; Saleur et al., 1996; Ellsworth and Cole, 1997; Pollitz and Sacks, 1997; Bowman et al., 1998; Nanjo et al., 1998; Wyss and Wiemer, 1999). Although the characteristics of these patterns can be qualitatively described, a systematic quantitative analysis remains elusive (Kanamori, 1981; Turcotte, 1991; Geller et al., 1997). Here we describe a new technique, based on recent developments in the physical and theoretical understanding of these complex, nonlinear fault systems, that isolates emergent regions of coherent, correlated seismicity (Bak and Tang, 1989; Rundle, 1989; Sornette and Sornette, 1989; Rundle and Klein, 1995; Sammis et al., 1996; 1997; Fisher et al., 1997; Jaume and Sykes, 1999; Rundle et al., 1999; Tiampo et al., 2002). Analysis of data taken prior to large events reveals that the appearance of the coherent correlated regions is often associated with the future occurrence of major earthquakes in the same areas or other tectonic mechanisms such as aseismic slip events (Tiampo et al., 2002). We proceed to detail this pattern dynamics methodology and then identify systematic space-time variations in the seismicity from several tectonic regions.
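
    A much-simplified stand-in for isolating correlated seismicity can be sketched as below: compare the rate histories of grid cells and flag strongly correlated pairs. The full method uses an eigen-decomposition of the correlation operator; this sketch only illustrates the correlation step, and all yearly counts are synthetic.

```python
# Schematic sketch: flag grid cells whose seismicity-rate histories are
# strongly correlated, a simplified stand-in for isolating coherent,
# correlated regions. The yearly event counts below are synthetic.
from math import sqrt

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / sqrt(vx * vy)

rates = {
    "A": [3, 5, 2, 8, 7, 4, 9, 6],
    "B": [2, 6, 1, 9, 6, 5, 8, 7],   # tracks cell A closely
    "C": [7, 2, 8, 1, 3, 9, 2, 4],   # roughly anti-correlated with A
    "D": [4, 4, 5, 4, 5, 4, 5, 4],   # nearly flat background
}

def coherent_pairs(rates, threshold=0.8):
    cells = sorted(rates)
    return [(a, b)
            for i, a in enumerate(cells)
            for b in cells[i + 1:]
            if pearson(rates[a], rates[b]) >= threshold]

print(coherent_pairs(rates))
```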

  9. Seismic Vulnerability Evaluations Within The Structural And Functional Survey Activities Of The COM Bases In Italy

    SciTech Connect

    Zuccaro, G.; Cacace, F.; Albanese, V.; Mercuri, C.; Papa, F.; Pizza, A. G.; Sergio, S.; Severino, M.

    2008-07-08

    The paper describes technical and functional surveys on COM buildings (Mixed Operative Centre). This activity started since 2005, with the contribution of both Italian Civil Protection Department and the Regions involved. The project aims to evaluate the efficiency of COM buildings, checking not only structural, architectonic and functional characteristics but also paying attention to surrounding real estate vulnerability, road network, railways, harbours, airports, area morphological and hydro-geological characteristics, hazardous activities, etc. The first survey was performed in eastern Sicily, before the European Civil Protection Exercise 'EUROSOT 2005'. Then, since 2006, a new survey campaign started in Abruzzo, Molise, Calabria and Puglia Regions. The more important issue of the activity was the vulnerability assessment. So this paper deals with a more refined vulnerability evaluation technique by means of the SAVE methodology, developed in the 1st task of SAVE project within the GNDT-DPC programme 2000-2002 (Zuccaro, 2005); the SAVE methodology has been already successfully employed in previous studies (i.e. school buildings intervention programme at national scale; list of strategic public buildings in Campania, Sicilia and Basilicata). In this paper, data elaborated by SAVE methodology are compared with expert evaluations derived from the direct inspections on COM buildings. This represents a useful exercise for the improvement either of the survey forms or of the methodology for the quick assessment of the vulnerability.

  10. Vulnerability Analysis Considerations for the Transportation of Special Nuclear Material

    SciTech Connect

    Nicholson, Lary G.; Purvis, James W.

    1999-07-21

    The vulnerability analysis methodology developed for fixed nuclear material sites has proven to be extremely effective in assessing associated transportation issues. The basic methods and techniques used are directly applicable to conducting a transportation vulnerability analysis. The purpose of this paper is to illustrate that the same physical protection elements (detection, delay, and response) are present, although the response force plays a dominant role in preventing the theft or sabotage of material. Transportation systems are continuously exposed to the general public, whereas a fixed site by its very nature restricts general public access.
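
    The detection-delay-response interplay named above can be illustrated with a minimal interruption-probability model: an adversary is interrupted only if detected and if the response force arrives before the remaining delay is exhausted. All probabilities and times below are hypothetical, not values from any real assessment.

```python
# Minimal sketch of the detection-delay-response interplay: interruption
# requires detection AND a response that beats the remaining delay time.
# All numbers are hypothetical illustrations.
from statistics import NormalDist

def p_interruption(p_detect, delay_after_detection_s, rt_mean_s, rt_std_s):
    # P(interrupt) = P(detect) * P(response time < remaining delay),
    # with response time modeled as normally distributed.
    p_response_in_time = NormalDist(rt_mean_s, rt_std_s).cdf(delay_after_detection_s)
    return p_detect * p_response_in_time

# Fixed site: barriers leave ample delay after detection.
site = p_interruption(0.9, delay_after_detection_s=600, rt_mean_s=300, rt_std_s=90)
# In transit: little delay remains once an attack is detected.
transit = p_interruption(0.9, delay_after_detection_s=240, rt_mean_s=300, rt_std_s=90)
print(round(site, 3), round(transit, 3))
```

    The comparison mirrors the abstract's point: with little delay available in transit, the response force dominates the outcome.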

  11. An approach to extend seismic vulnerability relationships for large diameter pipelines

    SciTech Connect

    Honegger, D.G.

    1995-12-31

    The most common approach to determining vulnerability is to rely solely upon damage data from past earthquakes as a predictor of future performance. Relying upon past damage data is not an option when data do not exist for a particular type of pipeline. An option discussed in this paper, and recently implemented for large-diameter water supply pipelines, relies upon engineering characterization of the relative strength of pipelines to extend existing damage data.
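
    The idea can be sketched as scaling an empirical fragility relation by a relative-strength factor. The baseline coefficient and strength factors below are illustrative placeholders, not the author's actual values.

```python
# Hedged sketch of extending an empirical fragility relation by a relative
# strength factor, as the abstract describes. The baseline coefficient and
# factors are illustrative placeholders, not the paper's values.

def repair_rate(pgv_cm_s, strength_factor=1.0, baseline_coeff=0.002):
    """Expected repairs per km: a linear PGV relation fit to damage data,
    scaled by the pipeline's engineered strength relative to that data set."""
    return baseline_coeff * pgv_cm_s * strength_factor

reference = repair_rate(40.0)                            # pipeline class in the data
large_diameter = repair_rate(40.0, strength_factor=0.3)  # engineered ~3x stronger
print(reference, large_diameter)
```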

  12. New Codes for Ambient Seismic Noise Analysis

    NASA Astrophysics Data System (ADS)

    Duret, F.; Mooney, W. D.; Detweiler, S.

    2007-12-01

    In order to determine a velocity model of the crust, scientists generally use earthquakes recorded by seismic stations. However, earthquakes do not occur continuously, and most are too weak to be useful. When no event is recorded, a waveform is generally considered to be noise. This noise, however, is not useless and carries a wealth of information. Thus, ambient seismic noise analysis is an inverse method of investigating the Earth's interior. Until recently, this technique was quite difficult to apply, as it requires significant computing capacity. In early 2007, however, a team led by G.D. Bensen and Michael Ritzwoller at CU Boulder published a paper describing a new method for extracting group and phase velocities from these waveforms. The analysis, which recovers Green's functions between pairs of stations, is composed of four steps: 1) single station data preparation, 2) cross-correlation and stacking, 3) quality control and data selection, and 4) dispersion measurements. At the USGS, we developed a set of ready-to-use computing codes for analyzing waveforms to run the ambient noise analysis of Bensen et al. (2007). Our main contribution to the analysis technique was to fully automate the process. The computation codes were written in Fortran 90 and the automation scripts were written in Perl; some operations were run with SAC. Our choice of programming languages offers an opportunity to adapt our codes to the major platforms. The codes were developed under Linux but are meant to be adapted to Mac OS X and Windows platforms. The codes have been tested on Southern California data, and our results compare nicely with those from the CU Boulder team. Next, we plan to apply our codes to Indonesian data, so that we might take advantage of newly upgraded seismic stations in that region.
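
    The cross-correlation step (step 2) can be sketched with synthetic data: a common signal buried in noise at two stations produces a correlation peak at the inter-station travel time. This is an illustration of the principle only, not the USGS Fortran/Perl codes themselves.

```python
# Schematic sketch of the cross-correlation step of ambient noise analysis:
# a shared source signal recorded with a delay at a second station yields a
# correlation peak at that delay. Synthetic data, single "day", no stacking.
import numpy as np

rng = np.random.default_rng(0)
n, lag_true = 2048, 37                 # samples; "travel time" in samples

source = rng.normal(size=n + lag_true)
sta1 = source[:n] + 0.5 * rng.normal(size=n)                    # station 1
sta2 = source[lag_true:lag_true + n] + 0.5 * rng.normal(size=n)  # station 2, advanced

# Frequency-domain circular cross-correlation; real workflows stack many days.
spec = np.fft.rfft(sta1) * np.conj(np.fft.rfft(sta2))
xcorr = np.fft.irfft(spec, n)

lag_est = int(np.argmax(xcorr))        # peak recovers the inter-station lag
print(lag_est)
```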

  13. A Preliminary Tsunami Vulnerability Analysis for Yenikapi Region in Istanbul

    NASA Astrophysics Data System (ADS)

    Ceren Cankaya, Zeynep; Suzen, Lutfi; Cevdet Yalciner, Ahmet; Kolat, Cagil; Aytore, Betul; Zaytsev, Andrey

    2015-04-01

    One of the main requirements during post-disaster recovery operations is to maintain proper transportation and fluent communication in the disaster areas. Ports and harbors are the main transportation hubs, which must perform properly at all times, especially after disasters. Resilience of coastal utilities after earthquakes and tsunamis has major importance for efficient and proper rescue and recovery operations soon after the disasters. Istanbul is a mega city with various coastal utilities located on the north coast of the Sea of Marmara. In the Yenikapi region of Istanbul, there are critical coastal utilities and vulnerable coastal structures, and critical activities occur daily. Fishery ports, commercial ports, small craft harbors, passenger terminals of intercity maritime transportation, and waterfront commercial and/or recreational structures are some examples of coastal utilization which are vulnerable to marine disasters. Therefore, the vulnerability of the Yenikapi region of Istanbul to tsunamis and other marine hazards is an important issue. In this study, a methodology of vulnerability analysis under tsunami attack is proposed, with application to the Yenikapi region. The high resolution (1 m) GIS database of the Istanbul Metropolitan Municipality (IMM) is used and analyzed in a GIS implementation. The bathymetry and topography database and the vector dataset containing all buildings/structures/infrastructure in the study area are obtained for tsunami numerical modeling of the study area. A GIS-based tsunami vulnerability assessment is conducted by applying multi-criteria decision making analysis (MCDA). The tsunami parameters for deterministically defined worst-case scenarios are computed from simulations using the tsunami numerical model NAMI DANCE. The vulnerability parameters in the region due to two different classifications i) vulnerability of buildings/structures and ii) vulnerability of (human) evacuation
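
    The GIS-based MCDA step can be sketched as a weighted overlay: each cell receives a weighted sum of normalized criterion layers, which is then binned into classes. The layers, weights, and thresholds below are hypothetical, not the study's calibrated values.

```python
# Hedged sketch of a GIS-style MCDA overlay: each grid cell's vulnerability
# is a weighted sum of normalized criterion layers, binned into classes.
# Layers, weights, and thresholds are hypothetical.

weights = {"flow_depth": 0.5, "building_fragility": 0.3, "evacuation_distance": 0.2}

# Normalized (0-1) criterion values for three example cells.
cells = {
    "waterfront": {"flow_depth": 0.9, "building_fragility": 0.7, "evacuation_distance": 0.8},
    "midtown":    {"flow_depth": 0.4, "building_fragility": 0.5, "evacuation_distance": 0.3},
    "inland":     {"flow_depth": 0.0, "building_fragility": 0.2, "evacuation_distance": 0.1},
}

def classify(score):
    if score >= 0.66:
        return "high"
    if score >= 0.33:
        return "medium"
    if score > 0.1:
        return "low"
    return "safe"

result = {name: classify(sum(weights[k] * v for k, v in layers.items()))
          for name, layers in cells.items()}
print(result)
```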

  14. Cascade vulnerability for risk analysis of water infrastructure.

    PubMed

    Sitzenfrei, R; Mair, M; Möderl, M; Rauch, W

    2011-01-01

    One of the major tasks in urban water management is failure-free operation most of the time. Accordingly, the reliability of the network systems in urban water management has a crucial role. The failure of a component in these systems impacts potable water distribution and urban drainage. Therefore, water distribution and urban drainage systems are categorized as critical infrastructure. Vulnerability is the degree to which a system is likely to experience harm induced by perturbation or stress. For risk assessment, however, we usually assume that events and failures are singular and independent, i.e. several simultaneous events and cascading events are not considered. Although failures can be causally linked, their simultaneous consideration in risk analysis is rare. To close this gap, this work introduces the term cascade vulnerability for water infrastructure. Cascade vulnerability accounts for cascading and simultaneous events. Following this definition, cascade risk maps are a merger of hazard and cascade vulnerability maps. In this work, cascade vulnerability maps for water distribution systems and urban drainage systems based on the 'Achilles-Approach' are introduced and discussed. It is shown that neglecting cascading effects results in significant underestimation of risk scenarios.
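
    A toy illustration of why cascading effects matter: in a branched distribution network, failing one pipe cuts off every node supplied through it, so the impact of a "single" failure cascades downstream. The network topology below is hypothetical, unrelated to the Achilles-Approach tooling.

```python
# Toy sketch of cascading impact in a branched water network: failing one
# node disconnects everything downstream of it from the source.
# Topology is a hypothetical illustration.
from collections import deque

# Adjacency list of an illustrative distribution network fed from "source".
pipes = {
    "source": ["a"],
    "a": ["b", "c"],
    "b": ["d"],
    "c": [],
    "d": [],
}

def supplied(network, failed_node):
    """Nodes still reachable from the source after one node fails."""
    seen, queue = set(), deque(["source"])
    while queue:
        node = queue.popleft()
        if node in seen or node == failed_node:
            continue
        seen.add(node)
        queue.extend(network.get(node, []))
    return seen - {"source"}

all_nodes = {"a", "b", "c", "d"}
for failure in sorted(all_nodes):
    lost = all_nodes - supplied(pipes, failure) - {failure}
    print(f"failing {failure} additionally cuts off {sorted(lost)}")
```

    A singular-failure analysis would score every node equally; the cascade view shows that losing "a" also cuts off three other nodes.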

  15. Betweenness as a Tool of Vulnerability Analysis of Power System

    NASA Astrophysics Data System (ADS)

    Rout, Gyanendra Kumar; Chowdhury, Tamalika; Chanda, Chandan Kumar

    2016-12-01

    Complex network theory finds application in the analysis of power grids, as both share some common characteristics. Using this theory, critical elements in a power network can be identified. As the vulnerabilities of the elements of the network determine the vulnerability of the total network, in this paper the vulnerability of each element is studied using two complex network indices: betweenness centrality and extended betweenness. Betweenness centrality considers only the topological structure of the power system, whereas extended betweenness is based on both topological and physical properties of the system. In the latter case, electrical properties such as electrical distance, line flow limits, transmission capacities of lines and the power transfer distribution factor (PTDF) matrix are included. The standard IEEE 57-bus system has been studied using the above-mentioned indices, and conclusions are discussed.
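
    The purely topological index (betweenness centrality) can be computed with Brandes' algorithm, sketched below on a toy undirected graph rather than the IEEE 57-bus system; extended betweenness would additionally weight paths by electrical quantities.

```python
# Sketch of topological betweenness centrality via Brandes' algorithm on a
# tiny undirected graph: two clusters joined by a bridge node "e", which
# consequently carries all inter-cluster shortest paths.
from collections import deque

def betweenness(graph):
    bc = {v: 0.0 for v in graph}
    for s in graph:
        # BFS from s, counting shortest paths (sigma) and predecessors.
        dist = {s: 0}
        sigma = {v: 0.0 for v in graph}
        sigma[s] = 1.0
        preds = {v: [] for v in graph}
        order = []
        queue = deque([s])
        while queue:
            v = queue.popleft()
            order.append(v)
            for w in graph[v]:
                if w not in dist:
                    dist[w] = dist[v] + 1
                    queue.append(w)
                if dist[w] == dist[v] + 1:
                    sigma[w] += sigma[v]
                    preds[w].append(v)
        # Back-propagate pair dependencies.
        delta = {v: 0.0 for v in graph}
        for w in reversed(order):
            for v in preds[w]:
                delta[v] += sigma[v] / sigma[w] * (1.0 + delta[w])
            if w != s:
                bc[w] += delta[w]
    return {v: c / 2.0 for v, c in bc.items()}  # undirected: halve double count

g = {"a": ["b", "e"], "b": ["a", "e"], "e": ["a", "b", "c", "d"],
     "c": ["e", "d"], "d": ["e", "c"]}
scores = betweenness(g)
print(max(scores, key=scores.get))
```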

  16. FORTRAN computer program for seismic risk analysis

    USGS Publications Warehouse

    McGuire, Robin K.

    1976-01-01

    A program for seismic risk analysis is described which combines generality of application, efficiency and accuracy of operation, and the advantage of small storage requirements. The theoretical basis for the program is first reviewed, and the computational algorithms used to apply this theory are described. The information required for running the program is listed. Published attenuation functions describing the variation with earthquake magnitude and distance of expected values for various ground motion parameters are summarized for reference by the program user. Finally, suggestions for use of the program are made, an example problem is described (along with example problem input and output) and the program is listed.
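
    The type of computation such a program performs can be sketched as the classical hazard sum: the annual rate of exceeding a ground motion level is the sum over scenarios of occurrence rate times the probability of exceedance given the scenario. The attenuation function and source rates below are toy values, not the program's published relations.

```python
# Hedged sketch of a classical probabilistic seismic hazard calculation:
# lambda(a) = sum over (rate, magnitude, distance) scenarios of
# rate * P(ground motion > a | m, r). Toy attenuation and rates only.
from math import exp, log, sqrt, erf

def p_exceed(a_target, median, sigma_ln=0.5):
    """Lognormal probability that ground motion exceeds a_target."""
    z = (log(a_target) - log(median)) / sigma_ln
    return 0.5 * (1 - erf(z / sqrt(2)))

def toy_median_pga(m, r_km):
    """Toy attenuation: grows with magnitude, decays with distance (in g)."""
    return exp(-3.5 + 0.9 * m - 1.2 * log(r_km + 10))

def exceedance_rate(a_target, sources):
    """sources: list of (annual_rate, magnitude, distance_km) scenarios."""
    return sum(rate * p_exceed(a_target, toy_median_pga(m, r))
               for rate, m, r in sources)

scenarios = [(0.1, 5.5, 20), (0.02, 6.5, 20), (0.004, 7.5, 20)]
for a in (0.05, 0.1, 0.2, 0.4):
    print(f"PGA > {a:.2f} g: {exceedance_rate(a, scenarios):.5f} /yr")
```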

  17. A seismic hazard uncertainty analysis for the New Madrid seismic zone

    USGS Publications Warehouse

    Cramer, C.H.

    2001-01-01

    A review of the scientific issues relevant to characterizing earthquake sources in the New Madrid seismic zone has led to the development of a logic tree of possible alternative parameters. A variability analysis, using Monte Carlo sampling of this consensus logic tree, is presented and discussed. The analysis shows that for 2%-exceedance-in-50-year hazard, the best-estimate seismic hazard map is similar to previously published seismic hazard maps for the area. For peak ground acceleration (PGA) and spectral acceleration at 0.2 and 1.0 s (0.2 and 1.0 s Sa), the coefficient of variation (COV) representing the knowledge-based uncertainty in seismic hazard can exceed 0.6 over the New Madrid seismic zone and diminishes to about 0.1 away from areas of seismic activity. Sensitivity analyses show that the largest contributor to PGA, 0.2 and 1.0 s Sa seismic hazard variability is the uncertainty in the location of future 1811-1812 New Madrid sized earthquakes. This is followed by the variability due to the choice of ground motion attenuation relation, the magnitude for the 1811-1812 New Madrid earthquakes, and the recurrence interval for M>6.5 events. Seismic hazard is not very sensitive to the variability in seismogenic width and length. Published by Elsevier Science B.V.
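
    The Monte Carlo logic-tree sampling can be sketched as follows: each sample draws one branch per epistemic choice, the hazard is evaluated for that combination, and the spread of results yields the COV. Branch values, weights, and the hazard proxy below are illustrative only, not the study's inputs.

```python
# Hedged sketch of Monte Carlo sampling of a hazard logic tree: each draw
# picks one weighted branch per node, and the resulting distribution of
# hazard estimates gives a coefficient of variation (COV) expressing
# knowledge-based uncertainty. Values and weights are illustrative.
import random
from statistics import mean, pstdev

random.seed(42)

# (value, weight) branches for two epistemic choices.
attenuation_factor = [(1.0, 0.5), (1.4, 0.3), (0.7, 0.2)]
char_magnitude = [(7.5, 0.4), (7.8, 0.4), (8.1, 0.2)]

def draw(branches):
    values, weights = zip(*branches)
    return random.choices(values, weights=weights)[0]

def hazard_proxy(atten, mag):
    # Toy stand-in for a full hazard calculation.
    return atten * 10 ** (0.3 * (mag - 7.5))

samples = [hazard_proxy(draw(attenuation_factor), draw(char_magnitude))
           for _ in range(20000)]
cov = pstdev(samples) / mean(samples)
print(round(cov, 2))
```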

  18. K-means cluster analysis and seismicity partitioning for Pakistan

    NASA Astrophysics Data System (ADS)

    Rehman, Khaista; Burton, Paul W.; Weatherill, Graeme A.

    2014-07-01

    Pakistan and the western Himalaya form a region of high seismic activity located at the triple junction between the Arabian, Eurasian and Indian plates. Four devastating earthquakes have resulted in significant numbers of fatalities in Pakistan and the surrounding region in the past century (Quetta, 1935; Makran, 1945; Pattan, 1974; and the 2005 Kashmir earthquake). It is therefore necessary to develop an understanding of the spatial distribution of seismicity and the potential seismogenic sources across the region. This forms an important basis for the calculation of seismic hazard, a crucial input to the seismic design codes needed to effectively mitigate the high earthquake risk in Pakistan. The development of seismogenic source zones for seismic hazard analysis is driven by both geological and seismotectonic inputs. Despite the many developments in seismic hazard analysis in recent decades, the manner in which seismotectonic information feeds the definition of the seismic source can, in many parts of the world including Pakistan and the surrounding regions, remain a subjective process driven primarily by expert judgment. Whilst much research is ongoing to map and characterise active faults in Pakistan, knowledge of the seismogenic properties of the active faults is still incomplete in much of the region. Consequently, seismicity, both historical and instrumental, remains a primary guide to the seismogenic sources of Pakistan. This study utilises a cluster analysis approach to identify spatial differences in seismicity, which can form a basis for delineating seismogenic source regions. An effort is made to examine seismicity partitioning for Pakistan with respect to the earthquake database, seismic cluster analysis and seismic partitions in a seismic hazard context. A magnitude-homogeneous earthquake catalogue has been compiled using various available earthquake data.
The earthquake catalogue covers a time span from 1930 to 2007 and
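
    The clustering step can be sketched with a plain k-means implementation on epicentre coordinates. The coordinates below are synthetic, loosely spanning a Pakistan-like lon/lat range, and the implementation is a generic textbook version, not the study's code.

```python
# Sketch of k-means partitioning of epicentres into spatial clusters, the
# core idea of the seismicity-partitioning approach. Synthetic coordinates.
import random

random.seed(1)

# Two synthetic seismicity concentrations.
epicentres = ([(67 + random.gauss(0, 0.5), 25 + random.gauss(0, 0.5)) for _ in range(50)] +
              [(73 + random.gauss(0, 0.5), 35 + random.gauss(0, 0.5)) for _ in range(50)])

def kmeans(points, k, iters=50):
    centres = random.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest centre.
        groups = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda i: (p[0] - centres[i][0]) ** 2 +
                                        (p[1] - centres[i][1]) ** 2)
            groups[nearest].append(p)
        # Move each centre to its group mean (keep it if the group emptied).
        centres = [(sum(x for x, _ in g) / len(g), sum(y for _, y in g) / len(g))
                   if g else centres[i]
                   for i, g in enumerate(groups)]
    return centres, groups

centres, groups = kmeans(epicentres, k=2)
print(sorted((round(x, 1), round(y, 1)) for x, y in centres))
```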

  19. Analytical and Experimental Assessment of Seismic Vulnerability of Beam-Column Joints without Transverse Reinforcement in Concrete Buildings

    NASA Astrophysics Data System (ADS)

    Hassan, Wael Mohammed

    Beam-column joints in concrete buildings are key components for ensuring the structural integrity of building performance under seismic loading. Earthquake reconnaissance has reported the substantial damage that can result from inadequate beam-column joints. In some cases, failure of older-type corner joints appears to have led to building collapse. Since the 1960s, many advances have been made to improve the seismic performance of building components, including beam-column joints. New design and detailing approaches are expected to produce new construction that will perform satisfactorily during strong earthquake shaking. Much less attention has been focused on beam-column joints of older construction that may be seismically vulnerable. Concrete buildings constructed prior to the development of details for ductility in the 1970s normally lack joint transverse reinforcement. The available literature concerning the performance of such joints is relatively limited, but concerns about performance exist. The current study aimed to improve understanding and assessment of the seismic performance of unconfined exterior and corner beam-column joints in existing buildings. An extensive literature survey was performed, leading to the development of a database of about one hundred tests. Study of the data enabled identification of the most important parameters and the effect of each parameter on seismic performance. The available analytical models and guidelines for strength and deformability assessment of unconfined joints were surveyed and evaluated. In particular, the ASCE 41 existing-building document proved to be substantially conservative in joint shear strength estimation. Upon identifying deficiencies in these models, two new joint shear strength models, a bond capacity model, and two axial capacity models designed and tailored specifically for unconfined beam-column joints were developed. The proposed models correlate strongly with previous test results.
In the laboratory testing phase of

  20. Central Anatolian Seismic Network: Initial Analysis of the Seismicity and Earth Structure

    NASA Astrophysics Data System (ADS)

    Arda Özacar, A.; Abgarmi, Bizhan; Delph, Jonathan; Beck, Susan L.; Sandvol, Eric; Türkelli, Niyazi; Kalafat, Doğan; Kahraman, Metin; Teoman, Uğur

    2015-04-01

    The Anatolian microplate provides many of the clues to understanding the geodynamic processes leading to continental collision, plateau formation, slab tearing/break-off and the development of escape tectonics. During the last decades, the tectonic evolution and dynamics of Anatolia have been the prime target of numerous research efforts employing a wide spectrum of disciplines. However, the Anatolian interior, which is characterized by large-magnitude lateral and vertical displacements, widespread Neogene volcanism and a complex tectonic history, is still under much debate and requires a joint multidisciplinary approach to investigate the mantle-to-surface dynamics. In order to identify the crust and mantle structure beneath Central Anatolia and the related seismicity, a dense seismic array consisting of 70 broadband seismic stations was deployed temporarily in 2013 as part of the Central Anatolian Tectonics (CAT) project on continental dynamics. A year of seismic records has been processed, and part of it has been analyzed using various seismic methods. The distribution of preliminary earthquake locations supports the presence of seismic activity partly localized along major tectonic structures across the region. According to ambient noise tomography results, upper crustal seismic velocity variations correlate well with surface geology, while slow shear wave velocities dominate the lower crust, indicating a weaker crustal rheology at depth. Furthermore, analysis of teleseismic P wave receiver functions revealed the presence of crustal low velocity zones associated with Neogene volcanism and sharp Moho variations near tectonic sutures and faults. By combining this new dataset with seismic data recorded by previous seismic deployments and national networks, we will have complete seismic coverage of the entire region, allowing researchers to image beneath Anatolia from mantle to surface with high resolution.

  1. A Preliminary Tsunami vulnerability analysis for Bakirkoy district in Istanbul

    NASA Astrophysics Data System (ADS)

    Tufekci, Duygu; Lutfi Suzen, M.; Cevdet Yalciner, Ahmet; Zaytsev, Andrey

    2016-04-01

    Resilience of coastal utilities after earthquakes and tsunamis has major importance for efficient and proper rescue and recovery operations soon after the disasters. Vulnerability assessment of coastal areas under extreme events has major importance for preparedness and the development of mitigation strategies. The Sea of Marmara has experienced numerous earthquakes as well as associated tsunamis. There is a variety of coastal facilities such as ports, small craft harbors, terminals for maritime transportation, waterfront roads and business centers, mainly along the north coast of the Sea of Marmara in the megacity of Istanbul. A detailed vulnerability analysis for the Yenikapi region and a detailed resilience analysis for the Haydarpasa port in Istanbul were previously studied by Cankaya et al. (2015) and Aytore et al. (2015) in the SATREPS project. In this study, the methodology of vulnerability analysis under tsunami attack given in Cankaya et al. (2015) is modified and applied to the Bakirkoy district of Istanbul. The Bakirkoy district is located in the western part of Istanbul and faces the north coast of the Sea of Marmara from 28.77°E to 28.89°E. The high resolution spatial dataset of the Istanbul Metropolitan Municipality (IMM) is used and analyzed. The bathymetry and topography database and the spatial dataset containing all buildings/structures/infrastructure in the district are collated and utilized for tsunami numerical modeling and the subsequent vulnerability analysis. The tsunami parameters for deterministically defined worst-case scenarios are computed from simulations using the tsunami numerical model NAMI DANCE. The vulnerability assessment parameters in the district are defined in terms of vulnerability and resilience, and are scored by implementing a GIS-based tsunami vulnerability assessment (TVA) with appropriate multi-criteria decision analysis (MCDA) methods. The risk level is computed using tsunami intensity (level of flow depth from simulations) and the TVA results at every location in the Bakirkoy district. The preliminary results are presented and discussed.

  2. Seismic Isolation Working Meeting Gap Analysis Report

    SciTech Connect

    Coleman, Justin; Sabharwall, Piyush

    2014-09-01

    The ultimate goal in nuclear facility and nuclear power plant (NPP) operations is safety during normal operations and maintaining core cooling capabilities during off-normal events, including external hazards. Understanding the impact that external hazards, such as flooding and earthquakes, have on nuclear facilities and NPPs is critical to deciding how to manage these hazards to acceptable levels of risk. From a seismic perspective, the goal is to manage seismic risk. Seismic risk is determined by convolving the seismic hazard with seismic fragilities (the capacities of systems, structures, and components (SSCs)). There are large uncertainties associated with the evolving nature of seismic hazard curves. Additionally, there are requirements within DOE, and potential requirements within NRC, to reconsider updated seismic hazard curves every 10 years. Therefore, opportunity exists for engineered solutions to manage this seismic uncertainty. One engineered solution is seismic isolation. Current seismic isolation (SI) designs (used in commercial industry) reduce horizontal earthquake loads and protect critical infrastructure from the potentially destructive effects of large earthquakes. The benefit of SI application in the nuclear industry is being recognized, and SI systems have been proposed in the American Society of Civil Engineers (ASCE) 4 standard, to be released in 2014, for Light Water Reactor (LWR) facilities using commercially available technology. However, there is a lack of application in the nuclear industry and uncertainty in implementing the procedures outlined in ASCE 4. Opportunity exists to determine the barriers associated with implementation of the current ASCE 4 standard language.
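
    The statement that seismic risk convolves the hazard with fragilities can be sketched numerically: the annual failure rate sums, over acceleration bins, the occurrence rate in each bin times the fragility at that level. The hazard curve and fragility parameters below are toy values, and isolation is modeled simply as a higher fragility median (greater capacity margin).

```python
# Hedged sketch of "seismic risk = hazard convolved with fragility":
# annual failure rate = sum over PGA bins of (rate of PGA in bin) *
# P(failure | PGA). Toy curves; isolation modeled as a raised median.
from math import log, sqrt, erf

def fragility(a, median, beta=0.4):
    """Lognormal P(failure | PGA = a)."""
    return 0.5 * (1 + erf((log(a) - log(median)) / (beta * sqrt(2))))

def annual_rate(a):
    """Toy hazard curve: annual rate of exceeding PGA a (in g)."""
    return 4e-4 * a ** -2

def failure_rate(median):
    grid = [0.05 * i for i in range(1, 60)]           # 0.05 g to 2.95 g
    total = 0.0
    for lo, hi in zip(grid, grid[1:]):
        occ = annual_rate(lo) - annual_rate(hi)       # rate of PGA in [lo, hi)
        total += occ * fragility((lo + hi) / 2, median)
    return total

fixed_base = failure_rate(median=0.6)
isolated = failure_rate(median=1.2)   # isolation raises effective capacity
print(f"{fixed_base:.2e} vs {isolated:.2e}")
```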

  3. Key parameter optimization and analysis of stochastic seismic inversion

    NASA Astrophysics Data System (ADS)

    Huang, Zhe-Yuan; Gan, Li-Deng; Dai, Xiao-Feng; Li, Ling-Gao; Wang, Jun

    2012-03-01

    Stochastic seismic inversion is the combination of geostatistics and seismic inversion technology, integrating information from seismic records, well logs, and geostatistics into a posterior probability density function (PDF) of subsurface models. The Markov chain Monte Carlo (MCMC) method is used to sample the posterior PDF, and the subsurface model characteristics can be inferred by analyzing a set of posterior PDF samples. In this paper, we first introduce stochastic seismic inversion theory, then discuss and analyze four key parameters: seismic data signal-to-noise ratio (S/N), the variogram, the number of posterior PDF samples, and well density, and propose optimum selections of these parameters. The analysis results show that the seismic data S/N adjusts the compromise between the influence of the seismic data and of geostatistics on the inversion results, the variogram controls the smoothness of the inversion results, the number of posterior PDF samples determines the reliability of the statistical characteristics derived from the samples, and well density influences the inversion uncertainty. Finally, a comparison between stochastic seismic inversion and deterministic model-based seismic inversion indicates that stochastic seismic inversion can provide more reliable information about the subsurface character.
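
    The MCMC sampling at the core of the method can be sketched with a one-parameter Metropolis random walk. The posterior below is an illustrative toy (a data-misfit likelihood times a geostatistical prior over a single velocity-like parameter), not a real inversion; it also illustrates why the number of posterior samples matters for reliable statistics.

```python
# Hedged toy sketch of Metropolis MCMC sampling of a posterior PDF that
# combines a data-misfit likelihood with a geostatistical prior. The
# one-parameter "model" and all constants are illustrative.
import random
from math import exp
from statistics import mean

random.seed(7)

def log_posterior(v):
    # Likelihood: pseudo seismic misfit centred on v = 2500 (e.g. m/s);
    # prior: geostatistical expectation centred on 2400.
    log_like = -0.5 * ((v - 2500) / 100) ** 2
    log_prior = -0.5 * ((v - 2400) / 200) ** 2
    return log_like + log_prior

def metropolis(n_samples, start=2000.0, step=50.0):
    samples, v = [], start
    lp = log_posterior(v)
    for _ in range(n_samples):
        cand = v + random.gauss(0, step)
        lp_cand = log_posterior(cand)
        # Accept uphill moves always, downhill moves with Metropolis ratio.
        if lp_cand >= lp or random.random() < exp(lp_cand - lp):
            v, lp = cand, lp_cand
        samples.append(v)
    return samples[n_samples // 5:]    # discard burn-in

posterior = metropolis(20000)
print(round(mean(posterior)))
```

    For this Gaussian toy the posterior mean has the closed form (2500/100² + 2400/200²)/(1/100² + 1/200²) = 2480, which the sampler should approach as the sample count grows.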

  4. Temperature-based Instanton Analysis: Identifying Vulnerability in Transmission Networks

    SciTech Connect

    Kersulis, Jonas; Hiskens, Ian; Chertkov, Michael; Backhaus, Scott N.; Bienstock, Daniel

    2015-04-08

    A time-coupled instanton method for characterizing transmission network vulnerability to wind generation fluctuation is presented. To extend prior instanton work to multiple-time-step analysis, line constraints are specified in terms of temperature rather than current. An optimization formulation is developed to express the minimum wind forecast deviation such that at least one line is driven to its thermal limit. Results are shown for an IEEE RTS-96 system with several wind farms.

  5. Vulnerability Analysis of the Player Command and Control Protocol

    DTIC Science & Technology

    2012-06-14

    The views expressed in this material do not reflect the official policy or position of the United States Air Force, the Department of Defense, or the United States Government. This material is declared a work of the U.S. Government and is not subject to copyright protection in the United States. A Player server listens on TCP port 6665 for incoming client connections [GSV00]. The server provides

  6. Nonlinear Seismic Analysis of Morrow Point Dam

    SciTech Connect

    Noble, C R; Nuss, L K

    2004-02-20

    This research and development project was sponsored by the United States Bureau of Reclamation (USBR), which is best known for the dams, power plants, and canals it constructed in the 17 western states. The mission statement of the USBR's Dam Safety Office, located in Denver, Colorado, is 'to ensure Reclamation dams do not present unacceptable risk to people, property, and the environment.' The Dam Safety Office does this by quickly identifying the dams which pose an increased threat to the public, and quickly completing the related analyses in order to make decisions that will safeguard the public and associated resources. The research study described in this report constitutes one element of USBR's research and development work to advance their computational and analysis capabilities for studying the response of dams to strong earthquake motions. This project focused on the seismic response of Morrow Point Dam, which is located 263 km southwest of Denver, Colorado.

  7. Multi-waveform classification for seismic facies analysis

    NASA Astrophysics Data System (ADS)

    Song, Chengyun; Liu, Zhining; Wang, Yaojun; Li, Xingming; Hu, Guangmin

    2017-04-01

    Seismic facies analysis provides an effective way to delineate heterogeneity and compartments within a reservoir. The traditional method uses a single waveform to classify seismic facies, which does not consider stratigraphic continuity, and the final facies map may be affected by noise. Therefore, by defining the waveforms in a 3D window as a multi-waveform, we developed a new seismic facies analysis algorithm, multi-waveform classification (MWFC), that combines multilinear subspace learning with self-organizing map (SOM) clustering techniques. In addition, we utilize a multi-window dip search algorithm to extract multi-waveforms, which reduces the uncertainty of facies maps at the boundaries. Testing the proposed method on synthetic data with different S/N, we confirm that our MWFC approach is more robust to noise than the conventional waveform classification (WFC) method. Application to real seismic data from the F3 block in the Netherlands shows that our approach is an effective tool for seismic facies analysis.

  8. Grandiose and vulnerable narcissism: a nomological network analysis.

    PubMed

    Miller, Joshua D; Hoffman, Brian J; Gaughan, Eric T; Gentile, Brittany; Maples, Jessica; Keith Campbell, W

    2011-10-01

    Evidence has accrued to suggest that there are 2 distinct dimensions of narcissism, which are often labeled grandiose and vulnerable narcissism. Although individuals high on either of these dimensions interact with others in an antagonistic manner, they differ on other central constructs (e.g., Neuroticism, Extraversion). In the current study, we conducted an exploratory factor analysis of 3 prominent self-report measures of narcissism (N=858) to examine the convergent and discriminant validity of the resultant factors. A 2-factor structure was found, which supported the notion that these scales include content consistent with 2 relatively distinct constructs: grandiose and vulnerable narcissism. We then compared the similarity of the nomological networks of these dimensions in relation to indices of personality, interpersonal behavior, and psychopathology in a sample of undergraduates (n=238). Overall, the nomological networks of vulnerable and grandiose narcissism were unrelated. The current results support the need for a more explicit parsing of the narcissism construct at the level of conceptualization and assessment.

  9. Annotated bibliography, seismicity of and near the island of Hawaii and seismic hazard analysis of the East Rift of Kilauea

    SciTech Connect

    Klein, F.W.

    1994-03-28

    This bibliography is divided into the following four sections: Seismicity of Hawaii and Kilauea Volcano; Occurrence, locations and accelerations from large historical Hawaiian earthquakes; Seismic hazards of Hawaii; and Methods of seismic hazard analysis. It contains 62 references, most of which are accompanied by short abstracts.

  10. An Analysis of the Mt. Meron Seismic Array

    SciTech Connect

    Pasyanos, M E; Ryall, F

    2008-01-10

    We have performed a quick analysis of the Mt. Meron seismic array to monitor regional seismic events in the Middle East. The Meron array is the only current array in the Levant and Arabian Peninsula and, as such, might be useful in contributing to event location, identification, and other analysis. Here, we provide a brief description of the array and a review of the travel time and array analysis done to assess its performance.

  11. A transferable approach towards rapid inventory data capturing for seismic vulnerability assessment using open-source geospatial technologies

    NASA Astrophysics Data System (ADS)

    Wieland, M.; Pittore, M.; Parolai, S.; Zschau, J.

    2012-04-01

    Geospatial technologies are increasingly being used in pre-disaster vulnerability assessment and post-disaster impact assessment for different types of hazards. The use of remote sensing data in particular has been strongly promoted in recent years due to its capability of providing up-to-date information over large areas at comparatively low cost, with increasingly high spatial, temporal and spectral resolution. Despite its clear potential, a purely remote-sensing-based approach has its limitations in that it can only provide a bird's-eye view of the objects of interest. Omnidirectional imaging can additionally provide the necessary street view, which furthermore allows a rapid visual screening of a building's façade. In this context, we propose an integrated approach to rapid inventory data capturing for the assessment of the structural vulnerability of buildings in case of an earthquake. Globally available low-cost data sources are preferred, and the tools are developed on an open-source basis to allow for a high degree of transferability and usability. On a neighbourhood scale, satellite images of medium spatial but high temporal and spectral resolution are analysed to outline areas of homogeneous urban structure. Following a proportional allocation scheme, representative sample areas are selected for each urban structure type for a more detailed analysis of the building stock with high-resolution image data. On a building-by-building scale, a ground-based rapid visual survey is performed using an omnidirectional imaging system mounted on a car and driven around inside the identified sample areas. Processing of the acquired images allows for the extraction of vulnerability-related features of single buildings (e.g. building height, detection of soft storeys). An analysis of high-resolution satellite images provides further inventory features (e.g. footprint area, shape irregularity). 
Since we are dealing with information coming from
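
    The proportional allocation step can be sketched as follows (a generic largest-remainder scheme; the function name and inputs are illustrative, not taken from the paper):

```python
import numpy as np

def proportional_allocation(stratum_sizes, n_samples):
    """Allocate survey sample areas to urban-structure strata in
    proportion to their size, using largest-remainder rounding."""
    sizes = np.asarray(stratum_sizes, dtype=float)
    quota = n_samples * sizes / sizes.sum()
    alloc = np.floor(quota).astype(int)
    # Hand the remaining samples to the largest fractional remainders.
    remainder = quota - alloc
    for i in np.argsort(remainder)[::-1][: n_samples - alloc.sum()]:
        alloc[i] += 1
    return alloc
```

    For example, allocating 10 sample areas across strata covering 50%, 30% and 20% of the built-up area yields 5, 3 and 2 samples respectively.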

  12. Quantitative risk analysis of oil storage facilities in seismic areas.

    PubMed

    Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto

    2005-08-31

    Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to quantitatively evaluate the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade ('domino') effects. The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative case study of an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined for both building-like and non-building-like industrial components, have been crossed with the outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in southern Italy. Once the seismic failure probabilities were quantified, consequence analysis was performed for those events which may be triggered by loss of containment following seismic action. Results are combined, by means of a specifically developed code, into local risk contour plots, i.e. the contour lines for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with the QRA obtained by considering only process-related top events is reported for reference.
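
    The crossing of fragility curves with PSHA output can be sketched numerically (illustrative lognormal fragility and power-law hazard curve, not the paper's actual data or code):

```python
import numpy as np
from scipy import stats
from scipy.integrate import trapezoid

def annual_failure_rate(pga, hazard_rate, median, beta):
    """Annual rate of seismically induced failure obtained by crossing a
    lognormal fragility curve with a seismic hazard curve.

    pga         : increasing grid of peak ground accelerations (g)
    hazard_rate : annual exceedance rate of each PGA level
    median, beta: fragility median (g) and lognormal standard deviation
    """
    fragility = stats.norm.cdf(np.log(pga / median) / beta)
    # Occurrence density of ground motions = minus the hazard-curve slope.
    occurrence = -np.gradient(hazard_rate, pga)
    return trapezoid(fragility * occurrence, pga)
```

    For the small annual rates typical of PSHA, the resulting rate is numerically close to the annual failure probability 1 - exp(-rate).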

  13. The SeIsmic monitoring and vulneraBilitY framework for civiL protection (SIBYL) Project: An overview and preliminary results

    NASA Astrophysics Data System (ADS)

    Fleming, Kevin; Parolai, Stefano; Iervolino, Iunio; Pitilakis, Kyriazis; Petryna, Yuriy

    2016-04-01

    The SIBYL project sets out to enhance the capacity of Civil Protection (CP) authorities to rapidly and cost-effectively assess the seismic vulnerability of the built environment. The need arises from the occurrence of seismic swarms or foreshocks, which requires CP authorities to rapidly assess the threatened area's vulnerability. This is especially important for regions where there is a dearth of up-to-date and reliable information. The result will be a multi-faceted framework, made up of methodologies and software tools, that provides information to advise decision makers as to the most appropriate preventative actions to be taken. It will cover cases where there is a need for short-notice vulnerability assessment in a pre-event situation, and the monitoring of the built environment's dynamic vulnerability during a seismic sequence. Coupled with this will be the ability to stimulate long-term management plans, independent of the hazard or disaster of concern. The monitoring itself will involve low-cost sensing units which may be easily installed in critical infrastructure. The framework will be flexible enough to be employed over multiple spatial scales, and it will be developed with a modular structure to ease its applicability to other natural hazard types. Likewise, it will be adaptable to the needs of CP authorities in different countries within their own hazard context. This presentation therefore provides an overview of the aims and expected outcomes of SIBYL, explaining the tools currently being developed and refined, as well as preliminary results of several field campaigns.

  14. Stochastic seismic analysis in the Messina strait area

    SciTech Connect

    Cacciola, P.; Maugeri, N.; Muscolino, G.

    2008-07-08

    Since the 1908 Messina earthquake, significant progress has been made in the field of earthquake engineering. Seismic action is usually represented via the so-called elastic response spectrum or, alternatively, by time histories of ground motion acceleration. Owing to the random nature of seismic action, alternative representations model it as a zero-mean Gaussian process fully defined by its power spectral density function. The aim of this paper is a comparative study of the response of linearly behaving structures under these representations of seismic action, using earthquakes recorded in the Messina strait area. In this regard, a handy method for determining the power spectral density function of recorded earthquakes is proposed. Numerical examples conducted on the existing space truss located in Torre Faro (Messina) show the effectiveness of the stochastic approach for the seismic analysis of structures.
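
    As a generic illustration of estimating a ground-motion power spectral density (the paper proposes its own method, which is not reproduced here), Welch's method applied to a synthetic accelerogram looks like this; all signal parameters are invented:

```python
import numpy as np
from scipy import signal

fs = 100.0                      # sampling rate (Hz)
t = np.arange(0, 60, 1 / fs)
# Synthetic "recorded" accelerogram: a 2 Hz harmonic buried in white noise.
rng = np.random.default_rng(1)
acc = np.sin(2 * np.pi * 2.0 * t) + 0.3 * rng.standard_normal(t.size)

# Welch estimate of the power spectral density of the ground motion:
# averaged periodograms of overlapping, windowed segments.
freq, psd = signal.welch(acc, fs=fs, nperseg=1024)
f_peak = freq[np.argmax(psd)]
```

    Segment averaging trades frequency resolution for a lower-variance estimate, which matters for short, non-stationary strong-motion records.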

  15. Stochastic seismic analysis in the Messina strait area

    NASA Astrophysics Data System (ADS)

    Cacciola, P.; Maugeri, N.; Muscolino, G.

    2008-07-01

    Since the 1908 Messina earthquake, significant progress has been made in the field of earthquake engineering. Seismic action is usually represented via the so-called elastic response spectrum or, alternatively, by time histories of ground motion acceleration. Owing to the random nature of seismic action, alternative representations model it as a zero-mean Gaussian process fully defined by its power spectral density function. The aim of this paper is a comparative study of the response of linearly behaving structures under these representations of seismic action, using earthquakes recorded in the Messina strait area. In this regard, a handy method for determining the power spectral density function of recorded earthquakes is proposed. Numerical examples conducted on the existing space truss located in Torre Faro (Messina) show the effectiveness of the stochastic approach for the seismic analysis of structures.

  16. Analysis of Brazilian data for seismic hazard analysis

    NASA Astrophysics Data System (ADS)

    Drouet, S.; Assumpção, M.

    2013-05-01

    Seismic hazard analysis in Brazil is going to be re-assessed in the framework of the Global Earthquake Model (GEM) project. Since the last worldwide Global Seismic Hazard Assessment Program (GSHAP) there has been no specific study in this field in Brazil. Brazil is a stable continental region characterized by low seismic activity. In this type of region, seismic hazard assessment is a very hard task due to the limited amount of data available on seismic sources, the earthquake catalogue and ground-motion amplitudes, and the associated uncertainties are very large. This study focuses on data recorded in southeastern Brazil, where broadband stations belonging to two networks are installed: the network managed by the seismology group at IAG-USP in São Paulo, which has existed for about 20 years, and the network managed by the Observatorio Nacional in Rio de Janeiro, which has just been set up. The two networks are now integrated into the national network RSB (Rede Sismográfica Brasileira), which will also include stations in the rest of Brazil currently being installed by the Universities of Brasilia and Natal. A number of events with magnitude greater than 3 have been recorded at these very sensitive stations, usually at rather large distances. At first sight these data may appear meaningless in the context of seismic hazard, but they can help to improve different parts of the process. Analysis of the S-wave Fourier spectra can help to better resolve source, path and site effects in Brazil. For instance, moment magnitudes can be computed from the flat part of the Fourier spectra. These magnitudes are of utmost importance in order to build a homogeneous catalogue in terms of moment magnitude. At the moment, only body-wave magnitudes (or some equivalent scale) are routinely determined for events in Brazil. Attenuation and site effects, especially the high-frequency attenuation known as the kappa effect, will also help to
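
    Computing a moment magnitude from the flat low-frequency part of the S-wave displacement spectrum can be sketched as follows (Brune-type point-source model; the density, shear velocity and radiation coefficient are illustrative values, not the study's):

```python
import numpy as np

def moment_magnitude(omega0, distance, rho=2700.0, beta=3500.0, radiation=0.55):
    """Moment magnitude from the low-frequency plateau of the S-wave
    displacement spectrum, assuming simple geometrical spreading.

    omega0   : spectral plateau (m*s)
    distance : hypocentral distance (m)
    rho, beta: crustal density (kg/m^3) and shear velocity (m/s)
    """
    # Seismic moment from the plateau (Brune-type model).
    m0 = 4 * np.pi * rho * beta ** 3 * distance * omega0 / radiation
    # Kanamori moment-magnitude relation, with M0 in N*m.
    return 2.0 / 3.0 * (np.log10(m0) - 9.1)
```

    An event with M0 = 1e15 N*m, for example, corresponds to Mw of about 3.9 regardless of the spectral parameters used to recover M0.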

  17. Multidimensional seismic data reconstruction using tensor analysis

    NASA Astrophysics Data System (ADS)

    Kreimer, Nadia

    Exploration seismology utilizes the seismic wavefield for prospecting for oil and gas. The seismic reflection experiment consists of deploying sources and receivers on the surface of an area of interest. When the sources are activated, the receivers measure the wavefield that is reflected from different subsurface interfaces and store the information as time series called traces or seismograms. The seismic data depend on two source coordinates, two receiver coordinates and time (a 5D volume). Obstacles in the field and logistical and economic factors constrain seismic data acquisition; therefore, the wavefield sampling is incomplete in the four spatial dimensions. Seismic data undergo different processes. In particular, the reconstruction process is responsible for correcting sampling irregularities of the seismic wavefield. This thesis focuses on the development of new methodologies for the reconstruction of multidimensional seismic data. It examines techniques based on tensor algebra and proposes three methods that exploit the tensor nature of the seismic data. The fully sampled volume is low-rank in the frequency-space domain; the rank increases when there are missing traces and/or noise. The proposed methods perform rank reduction on frequency slices of the 4D spatial volume. The first method employs the higher-order singular value decomposition (HOSVD) immersed in an iterative algorithm that reinserts weighted observations. The second method uses a sequential truncated SVD on the unfoldings of the tensor slices (SEQ-SVD). The third method formulates the rank reduction problem as a convex optimization problem: the measure of the rank is replaced by the nuclear norm of the tensor, and the alternating direction method of multipliers (ADMM) minimizes the cost function. All three methods have the interesting property that they are robust to curvature of the reflections, unlike many reconstruction methods. Finally, we present a comparison between the methods
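
    The core rank-reduction operation can be sketched on a single unfolding of one frequency slice (a simplified stand-in for one SEQ-SVD step; the thesis methods iterate over several unfoldings and reinsert observations):

```python
import numpy as np

def rank_reduce(freq_slice, rank):
    """Truncated SVD on a matrix unfolding of one frequency slice.

    freq_slice: complex 4-D spatial volume (sx, sy, rx, ry) at one
    frequency. Noise and missing traces raise the rank of its
    unfoldings, so truncation suppresses them.
    """
    shape = freq_slice.shape
    # Unfold the tensor into a (sources x receivers) matrix.
    mat = freq_slice.reshape(shape[0] * shape[1], shape[2] * shape[3])
    u, s, vh = np.linalg.svd(mat, full_matrices=False)
    s[rank:] = 0.0                    # keep only the strongest components
    return ((u * s) @ vh).reshape(shape)
```

    Which unfolding is used matters in practice; SEQ-SVD cycles through them sequentially rather than relying on a single one as above.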

  18. Investigation of techniques for the development of seismic design basis using the probabilistic seismic hazard analysis

    SciTech Connect

    Bernreuter, D.L.; Boissonnade, A.C.; Short, C.M.

    1998-04-01

    The Nuclear Regulatory Commission asked Lawrence Livermore National Laboratory to form a group of experts to assist them in revising the seismic and geologic siting criteria for nuclear power plants, Appendix A to 10 CFR Part 100. This document describes a deterministic approach for determining a Safe Shutdown Earthquake (SSE) Ground Motion for a nuclear power plant site. One disadvantage of this approach is the difficulty of integrating differences of opinions and differing interpretations into seismic hazard characterization. In answer to this, probabilistic seismic hazard assessment methodologies incorporate differences of opinion and interpretations among earth science experts. For this reason, probabilistic hazard methods were selected for determining SSEs for the revised regulation, 10 CFR Part 100.23. However, because these methodologies provide a composite analysis of all possible earthquakes that may occur, they do not provide the familiar link between seismic design loading requirements and engineering design practice. Therefore, approaches used to characterize seismic events (magnitude and distance) which best represent the ground motion level determined with the probabilistic hazard analysis were investigated. This report summarizes investigations conducted at 69 nuclear reactor sites in the central and eastern U.S. for determining SSEs using probabilistic analyses. Alternative techniques are presented along with justification for key choices. 16 refs., 32 figs., 60 tabs.

  19. Detecting seismic activity with a covariance matrix analysis of data recorded on seismic arrays

    NASA Astrophysics Data System (ADS)

    Seydoux, L.; Shapiro, N. M.; de Rosny, J.; Brenguier, F.; Landès, M.

    2016-03-01

    Modern seismic networks are recording the ground motion continuously at the Earth's surface, providing dense spatial samples of the seismic wavefield. The aim of our study is to analyse these records with statistical array-based approaches to identify coherent time-series as a function of time and frequency. Using ideas mainly brought from the random matrix theory, we analyse the spatial coherence of the seismic wavefield from the width of the covariance matrix eigenvalue distribution. We propose a robust detection method that could be used for the analysis of weak and emergent signals embedded in background noise, such as the volcanic or tectonic tremors and local microseismicity, without any prior knowledge about the studied wavefields. We apply our algorithm to the records of the seismic monitoring network of the Piton de la Fournaise volcano located at La Réunion Island and composed of 21 receivers with an aperture of ˜15 km. This array recorded many teleseismic earthquakes as well as seismovolcanic events during the year 2010. We show that the analysis of the wavefield at frequencies smaller than ˜0.1 Hz results in detection of the majority of teleseismic events from the Global Centroid Moment Tensor database. The seismic activity related to the Piton de la Fournaise volcano is well detected at frequencies above 1 Hz.
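
    The width of the covariance-matrix eigenvalue distribution can be computed as follows (a schematic coherence indicator; the paper's exact estimator and normalization may differ):

```python
import numpy as np

def spectral_width(records):
    """Width of the array covariance-matrix eigenvalue distribution.

    records: (n_receivers, n_samples) array of recordings.
    Returns ~0 when one eigenvalue dominates (a spatially coherent
    wavefield) and a larger value for incoherent, noise-like fields.
    """
    cov = records @ records.conj().T / records.shape[1]
    lam = np.sort(np.abs(np.linalg.eigvalsh(cov)))[::-1]
    lam = lam / lam.sum()
    idx = np.arange(len(lam))
    return (idx * lam).sum()   # first moment of the normalized eigenvalues
```

    No prior signal model is needed: a teleseism or tremor crossing the array concentrates energy into few eigenvalues, shrinking the width relative to background noise.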

  20. HANFORD DOUBLE SHELL TANK THERMAL AND SEISMIC PROJECT SEISMIC ANALYSIS OF HANFORD DOUBLE SHELL TANKS

    SciTech Connect

    MACKEY TC; RINKER MW; CARPENTER BG; HENDRIX C; ABATT FG

    2009-01-15

    M&D Professional Services, Inc. (M&D) is under subcontract to Pacific Northwest National Laboratories (PNNL) to perform seismic analysis of the Hanford Site Double-Shell Tanks (DSTs) in support of a project entitled Double-Shell Tank (DST) Integrity Project - DST Thermal and Seismic Analyses. The original scope of the project was to complete an up-to-date comprehensive analysis of record of the DST System at Hanford in support of Tri-Party Agreement Milestone M-48-14. The work described herein was performed in support of the seismic analysis of the DSTs. The thermal and operating loads analysis of the DSTs is documented in Rinker et al. (2004). Although Milestone M-48-14 has been met, Revision 1 is being issued to address external review comments, with emphasis on changes in the modeling of the anchor bolts connecting the concrete dome and the steel primary tank. The work statement provided to M&D (PNNL 2003) required that a nonlinear soil-structure interaction (SSI) analysis be performed on the DSTs. The analysis is required to include the effects of sliding interfaces and fluid sloshing (fluid-structure interaction). SSI analysis has traditionally been treated by frequency-domain computer codes such as SHAKE (Schnabel et al. 1972) and SASSI (Lysmer et al. 1999a). Such frequency-domain programs are limited to the analysis of linear systems. Because of the contact surfaces, the response of the DSTs to a seismic event is inherently nonlinear and consequently outside the range of applicability of the linear frequency-domain programs. That is, the nonlinear response of the DSTs to seismic excitation requires the use of a time-domain code. The capabilities and limitations of the commercial time-domain codes ANSYS® and MSC Dytran® for performing seismic SSI analysis of the DSTs, and the methodology required to perform the detailed seismic analysis of the DSTs, have been addressed in Rinker et al. (2006a). On the basis of the results reported in Rinker et al

  1. Rethinking vulnerability analysis and governance with emphasis on a participatory approach.

    PubMed

    Rossignol, Nicolas; Delvenne, Pierre; Turcanu, Catrinel

    2015-01-01

    This article draws on vulnerability analysis as it emerged as a complement to classical risk analysis, and it aims at exploring its ability to nurture risk and vulnerability governance actions. An analysis of the literature on vulnerability analysis allows us to formulate a three-fold critique. First, vulnerability analysis has been treated separately in the natural and technological hazards fields. This separation prevents vulnerability analysis from unleashing its full potential, as it constrains appraisals into artificial categories and thus closes down the outcomes of the analysis prematurely. Second, vulnerability analysis has focused on assessment tools that are mainly quantitative, whereas qualitative appraisal is key to assessing vulnerability in a comprehensive way and to informing policy making. Third, a systematic literature review of case studies reporting on participatory approaches to vulnerability analysis allows us to argue that participation has been important in addressing the above, but it remains too closed down in its approach and would benefit from embracing a more open, encompassing perspective. Therefore, we suggest rethinking vulnerability analysis as one part of a dynamic process between opening-up and closing-down strategies, in order to support a vulnerability governance framework.

  2. A preliminary analysis of quantifying computer security vulnerability data in "the wild"

    NASA Astrophysics Data System (ADS)

    Farris, Katheryn A.; McNamara, Sean R.; Goldstein, Adam; Cybenko, George

    2016-05-01

    A system of computers, networks and software has some level of vulnerability exposure that puts it at risk to criminal hackers. Presently, most vulnerability research uses data from software vendors, and the National Vulnerability Database (NVD). We propose an alternative path forward through grounding our analysis in data from the operational information security community, i.e. vulnerability data from "the wild". In this paper, we propose a vulnerability data parsing algorithm and an in-depth univariate and multivariate analysis of the vulnerability arrival and deletion process (also referred to as the vulnerability birth-death process). We find that vulnerability arrivals are best characterized by the log-normal distribution and vulnerability deletions are best characterized by the exponential distribution. These distributions can serve as prior probabilities for future Bayesian analysis. We also find that over 22% of the deleted vulnerability data have a rate of zero, and that the arrival vulnerability data is always greater than zero. Finally, we quantify and visualize the dependencies between vulnerability arrivals and deletions through a bivariate scatterplot and statistical observations.
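
    The distributional findings can be reproduced in outline with maximum-likelihood fits on simulated data (SciPy's parameterization; the counts below are synthetic stand-ins, not the study's vulnerability data):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Synthetic stand-ins for vulnerability arrival and deletion observations.
arrivals = rng.lognormal(mean=2.0, sigma=0.5, size=2000)
deletions = rng.exponential(scale=3.0, size=2000)

# Fit the candidate distributions by maximum likelihood (location fixed at 0).
shape, loc, scale = stats.lognorm.fit(arrivals, floc=0)
sigma_hat, mu_hat = shape, np.log(scale)   # lognormal parameters
loc_e, scale_e = stats.expon.fit(deletions, floc=0)
```

    The fitted parameters can then serve directly as the prior probabilities the authors mention for subsequent Bayesian analysis.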

  3. Seismic Hazard Analysis as a Controlling Technique of Induced Seismicity in Geothermal Systems

    NASA Astrophysics Data System (ADS)

    Convertito, V.; Sharma, N.; Maercklin, N.; Emolo, A.; Zollo, A.

    2011-12-01

    The effects of induced seismicity in geothermal systems during stimulation and fluid circulation can cover a wide range, from light and unfelt to severe and damaging. For the design of a modern geothermal system to achieve the greatest efficiency while remaining acceptable from the social point of view, the system must be manageable in a way that reduces possible impacts in advance. In this framework, automatic control of the seismic response of the stimulated reservoir is nowadays mandatory, particularly in proximity to densely populated areas. Recently, techniques have been proposed for this purpose, mainly based on the traffic-light concept. This system provides a tool for deciding the stimulation rate based on real-time analysis of the induced seismicity and the ongoing ground-motion values. However, in some cases the induced effects can be delayed with respect to the time when the reservoir is stimulated. Thus, a controlling technique able to estimate ground-motion levels over different time scales can help to better manage the geothermal system. Here we present an adaptation of classical probabilistic seismic hazard analysis to the case where the seismicity rate, as well as the propagation medium properties, are not constant with time. For modeling purposes we use a non-homogeneous seismicity model, in which the seismicity rate and the b-value of the recurrence relationship change with time. Additionally, as a further controlling procedure, we propose a moving-time-window analysis of the recorded peak ground-motion values aimed at monitoring changes in the propagation medium. In fact, for the same set of magnitude values recorded at the same stations, we expect that on average the peak ground-motion values attenuate in the same way. As a consequence, the residual differences can reasonably be ascribed to changes in medium properties. These changes can be modeled and directly introduced in the hazard integral. 
We applied the proposed
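
    The time-dependent hazard idea can be sketched with a non-homogeneous Poisson model (a minimal illustration, not the authors' full adaptation of the hazard integral; the exceedance probability per event would come from a GMPE):

```python
import numpy as np
from scipy.integrate import trapezoid

def exceedance_probability(times, event_rate, p_exceed_given_event):
    """Probability of at least one ground-motion exceedance in a time
    window, under a non-homogeneous Poisson occurrence model.

    times                : time grid (days)
    event_rate           : time-varying seismicity rate (events/day)
    p_exceed_given_event : P(PGA > threshold | event)
    """
    # Integrated rate of exceedance-causing events over the window.
    big_lambda = trapezoid(event_rate * p_exceed_given_event, times)
    return 1.0 - np.exp(-big_lambda)
```

    Because the rate enters through an integral, the same formula handles a swelling swarm (rising rate) and a decaying sequence without modification.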

  4. Seismic refraction analysis: the path forward

    USGS Publications Warehouse

    Haines, Seth S.; Zelt, Colin; Doll, William

    2012-01-01

    Seismic Refraction Methods: Unleashing the Potential and Understanding the Limitations; Tucson, Arizona, 29 March 2012. A workshop focused on seismic refraction methods took place on 29 March 2012, in association with the 2012 Symposium on the Application of Geophysics to Engineering and Environmental Problems. This workshop was convened to assess the current state of the science and discuss paths forward, with a primary focus on near-surface problems but with an eye on all applications. The agenda included talks on these topics from a number of experts, interspersed with discussion, and a dedicated discussion period to finish the day. Discussion proved lively at times, and workshop participants delved into many topics central to seismic refraction work.

  5. Probabilistic Seismic Hazard Disaggregation Analysis for the South of Portugal

    NASA Astrophysics Data System (ADS)

    Rodrigues, I.; Sousa, M.; Teves-Costa, P.

    2010-12-01

    A probabilistic seismic hazard disaggregation analysis was performed and seismic scenarios were identified for Southern Mainland Portugal. This region's seismicity is characterized by small and moderate magnitude events and by the sporadic occurrence of large earthquakes (e.g. the 1755 Lisbon earthquake). Thus, the Portuguese Civil Protection Agency (ANPC) sponsored a collaborative research project for the study of seismic and tsunami risks in the Algarve (project ERSTA). In the framework of this project, a series of new developments were obtained, namely the revision of the seismic catalogue (IM, 2008), the delineation of new seismogenic zones affecting the Algarve region, which reflects the growing knowledge of this region's seismotectonic context, the derivation of new spectral attenuation laws (Carvalho and Campos Costa, 2008) and the revision of the probabilistic seismic hazard (Sousa et al. 2008). Seismic hazard was disaggregated considering different spaces of random variables, namely bivariate conditional hazard distributions of X-Y (seismic source latitude and longitude) and multivariate 4D conditional hazard distributions of M-(X-Y)-ɛ (ɛ being the deviation of ground motion from the median value predicted by an attenuation model). These procedures were performed for peak ground acceleration (PGA) and for the 5% damped 1.0 and 2.5 Hz spectral acceleration levels for three return periods: 95, 475 and 975 years. The seismic scenarios controlling the hazard at a given ground motion level were identified as the modal values of the 4D disaggregation analysis for each of the 84 parishes of the Algarve region. Those scenarios, based on a probabilistic analysis, are meant to be used in emergency planning as a complement to the historical scenarios that severely affected this region. The seismic scenarios share a small number of geographical locations across all return periods. Moreover, the seismic hazard of most Algarve parishes is dominated by the seismicity located
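
    The modal-scenario idea behind disaggregation can be sketched as follows (a toy 2-D magnitude-distance disaggregation with invented rates and median ground motions, not the ERSTA results, which disaggregate over M, location and ɛ):

```python
import numpy as np
from scipy import stats

def modal_scenario(a_target, mags, dists, rates, ln_med, sigma=0.6):
    """Modal (M, R) scenario from a seismic hazard disaggregation.

    rates[i, j] : annual rate of events with magnitude mags[i] at
                  distance dists[j]
    ln_med[i, j]: median ln ground motion for that bin (from a GMPE)
    Each bin contributes rate * P(PGA > a_target | m, r) to the hazard;
    the controlling scenario is the largest contributor.
    """
    p_exceed = stats.norm.sf((np.log(a_target) - ln_med) / sigma)
    contrib = rates * p_exceed
    i, j = np.unravel_index(np.argmax(contrib), contrib.shape)
    return mags[i], dists[j]
```

    The modal bin, rather than the mean, is reported because it corresponds to a physically realizable single earthquake usable in emergency planning.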

  6. Information Assurance Technology Analysis Center. Information Assurance Tools Report. Vulnerability Analysis

    DTIC Science & Technology

    1998-01-01

    Information Assurance Tools Report: Vulnerability Analysis. Prepared by IATAC (funding number SPO700-97-R-0603). The report covers tool collection, tool classification, tool sources, database structure, and tool selection criteria, and summarizes vulnerability analysis tools, including Ballista by Secure Networks Inc. (http://www.secnet.com/nav1b.html) and tools archived at http://www.giga.or.at/pub/hacker/unix.

  7. Seismic analysis for translational failure of landfills with retaining walls.

    PubMed

    Feng, Shi-Jin; Gao, Li-Ya

    2010-11-01

    In seismic impact zones, seismic force can be a major triggering mechanism for translational failures of landfills. The scope of this paper is to develop a three-part wedge method for seismic analysis of translational failures of landfills with retaining walls. An approximate solution for the factor of safety can be calculated. Unlike previous conventional limit equilibrium methods, the new method is capable of revealing the effects of both the solid waste shear strength and the retaining wall on translational failures of landfills during an earthquake. Parameter studies of the developed method show that the factor of safety decreases as the seismic coefficient increases, while it increases quickly with the minimum friction angle beneath the waste mass for various horizontal seismic coefficients. Increasing the minimum friction angle beneath the waste mass appears to be more effective than changing any other parameter for increasing the factor of safety under the considered conditions. Thus, selecting liner materials with a higher friction angle will considerably reduce the potential for translational failures of landfills during an earthquake. The factor of safety gradually increases with the height of the retaining wall for various horizontal seismic coefficients; a higher retaining wall is beneficial to the seismic stability of the landfill, and simply ignoring the retaining wall will lead to serious underestimation of the factor of safety. Finally, an approximate solution for the yield acceleration coefficient of the landfill is also presented based on the developed method.
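
    The pseudo-static trends reported above can be illustrated with a single-wedge simplification (not the paper's three-part wedge formulation; cohesion is neglected and all parameter values are arbitrary):

```python
import numpy as np

def pseudo_static_fs(phi_deg, slope_deg, kh):
    """Pseudo-static factor of safety for a mass sliding on a planar
    liner interface (single-wedge simplification).

    phi_deg  : friction angle of the critical liner interface (deg)
    slope_deg: inclination of the sliding base (deg)
    kh       : horizontal seismic coefficient
    """
    phi, beta = np.radians(phi_deg), np.radians(slope_deg)
    driving = np.sin(beta) + kh * np.cos(beta)            # gravity + inertia
    resisting = np.tan(phi) * (np.cos(beta) - kh * np.sin(beta))
    return resisting / driving
```

    Even this simplification reproduces the reported trends: the factor of safety falls as kh grows and rises with the interface friction angle.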

  8. Joint analysis of the seismic data and velocity gravity model

    NASA Astrophysics Data System (ADS)

    Belyakov, A. S.; Lavrov, V. S.; Muchamedov, V. A.; Nikolaev, A. V.

    2016-03-01

    We performed a joint analysis of the seismic noise recorded at the Japanese Ogasawara station (OSW), located on Titijima Island in the Philippine Sea, using an STS-2 seismograph during the winter period of January 1-15, 2015, against the background of a velocity gravity model. The graphs prove the existence of a cause-and-effect relation between the seismic noise and gravity and allow us to consider the noise as a desired signal.

  9. The Algerian Seismic Network: Performance from data quality analysis

    NASA Astrophysics Data System (ADS)

    Yelles, Abdelkarim; Allili, Toufik; Alili, Azouaou

    2013-04-01

    Seismic monitoring in Algeria has seen a great change since the Boumerdes earthquake of May 21st, 2003. Indeed, the installation of a new digital seismic network (ADSN) drastically upgraded the previous analog telemetry network. During the last four years, the number of stations in operation has greatly increased to 66 stations, with 15 broadband, 02 very broadband, 47 short-period and 21 accelerometer sensors connected in real time using various modes of transmission (VSAT, ADSL, GSM, ...) and managed by Antelope software. The spatial distribution of these stations covers most of northern Algeria from east to west. Since the network became operational, a significant number of local, regional and teleseismic events have been located by the automatic processing, revised and archived in databases. This new set of data is characterized by the accuracy of the automatic location of local seismicity and the ability to determine its focal mechanisms. Periodically, recorded data, including earthquakes, calibration pulses and cultural noise, are checked using power spectral density (PSD) analysis to determine the noise level. ADSN broadband station data quality is controlled in quasi real time using the PQLX software by computing PDFs and PSDs of the recordings. Other tools and programs allow monitoring and maintenance of the entire electronic system, for example checking the power state of the system, the mass position of the sensors and the environmental conditions (temperature, humidity, air pressure) inside the vaults. The new design of the network allows management of many aspects of real-time seismology: seismic monitoring, rapid earthquake determination, alert messages, moment tensor estimation, seismic source determination, shakemap calculation, etc. Compliance with international standards permits contributions to regional seismic monitoring and the Mediterranean warning system. The next two years, with the acquisition of new seismic equipment to reach 50 new BB stations, led to

  10. Seismic Hazard analysis of Adjaria Region in Georgia

    NASA Astrophysics Data System (ADS)

    Jorjiashvili, Nato; Elashvili, Mikheil

    2014-05-01

    The most commonly used approach to determining seismic-design loads for engineering projects is probabilistic seismic-hazard analysis (PSHA). The primary output from a PSHA is a hazard curve showing the variation of a selected ground-motion parameter, such as peak ground acceleration (PGA) or spectral acceleration (SA), against the annual frequency of exceedance (or its reciprocal, the return period). The design value is the ground-motion level that corresponds to a preselected design return period. For many engineering projects, such as standard buildings and typical bridges, the seismic loading is taken from the appropriate seismic-design code, the basis of which is usually a PSHA. For more important engineering projects, where the consequences of failure are more serious, such as dams and chemical plants, it is more usual to obtain the seismic-design loads from a site-specific PSHA, in general using much longer return periods than those governing code-based design. The probabilistic seismic hazard calculation was performed using the software CRISIS2007 by Ordaz, M., Aguilar, A., and Arboleda, J., Instituto de Ingeniería, UNAM, Mexico. CRISIS implements a classical probabilistic seismic hazard methodology in which seismic sources can be modelled as points, lines and areas. In the case of area sources, the software offers an integration procedure that takes advantage of a triangulation algorithm used for seismic source discretization. This solution improves calculation efficiency while maintaining a reliable description of source geometry and seismicity. Additionally, supplementary filters (e.g. a fixed site-source distance beyond which sources are excluded from the calculation) allow the program to balance precision and efficiency during the hazard calculation. Earthquake temporal occurrence is assumed to follow a Poisson process, and the code supports two types of magnitude-frequency distributions: a truncated exponential Gutenberg-Richter [1944] magnitude distribution and a characteristic magnitude
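
    Under the Poisson occurrence assumption, the annual frequency of exceedance, the return period and the probability of exceedance over a design life are interchangeable. A minimal sketch of the conversions (the function names are my own, not from CRISIS2007):

```python
import math

def poe_in_t_years(annual_rate, t=50.0):
    """Probability of at least one exceedance in t years (Poisson model)."""
    return 1.0 - math.exp(-annual_rate * t)

def rate_for_poe(poe, t=50.0):
    """Annual exceedance rate matching a target POE over t years."""
    return -math.log(1.0 - poe) / t

def return_period(annual_rate):
    """Return period is the reciprocal of the annual exceedance rate."""
    return 1.0 / annual_rate

# the common design case: 10% probability of exceedance in 50 years
rate = rate_for_poe(0.10, 50.0)
print(round(return_period(rate)))  # → 475 (the familiar 475-year return period)
```

    The same arithmetic gives the roughly 2,475-year return period for the 2%-in-50-years hazard level used elsewhere in this collection.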

  11. A new passive seismic method based on seismic interferometry and multichannel analysis of surface waves

    NASA Astrophysics Data System (ADS)

    Cheng, Feng; Xia, Jianghai; Xu, Yixian; Xu, Zongbo; Pan, Yudi

    2015-06-01

    We proposed a new passive seismic method (PSM) based on seismic interferometry and multichannel analysis of surface waves (MASW) to meet the demand for increased investigation depth by acquiring surface-wave data in a low-frequency range (1 Hz ≤ f ≤ 10 Hz). We utilize seismic interferometry to sort common virtual source gathers (CVSGs) from ambient noise and analyze the obtained CVSGs to construct a 2D shear-wave velocity (Vs) map using the MASW. Standard ambient noise processing procedures were applied to the computation of cross-correlations. To enhance the signal-to-noise ratio (SNR) of the empirical Green's functions, a new weighted stacking method was implemented. In addition, we proposed a bidirectional shot mode based on the virtual source method to sort CVSGs repeatedly. The PSM was applied to two field data examples. For the test along the Han River levee, the results of the PSM were compared with the improved roadside passive MASW and the spatial autocorrelation method (SPAC). For the test in the Western Junggar Basin, the PSM was applied to a 70-km-long linear survey array with a prominent directional urban noise source, and a 60-km-long Vs profile reaching 1.5 km in depth was mapped. Further, a comparison of the dispersion measurements was made between the PSM and the frequency-time analysis (FTAN) technique to assess the accuracy of the PSM. These examples and comparisons demonstrate that the new method is efficient, flexible, and capable of studying near-surface velocity structures from seismic ambient noise.
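
    The interferometric step rests on the fact that cross-correlating ambient noise recorded at two stations yields an estimate of the inter-station Green's function, with the travel time appearing as the lag of the correlation peak. A toy pure-Python sketch of that idea (real workflows add spectral whitening, temporal normalization and stacking, none of which is shown here):

```python
import random

def xcorr(a, b, max_lag):
    """Cross-correlation of traces a and b for lags in [-max_lag, max_lag]."""
    n = len(a)
    out = []
    for lag in range(-max_lag, max_lag + 1):
        s = sum(a[i] * b[i + lag] for i in range(n) if 0 <= i + lag < len(b))
        out.append((lag, s))
    return out

# Toy test: the same random "noise" reaches station B 5 samples later
# than station A, so the correlation peaks at a 5-sample lag.
random.seed(0)
src = [random.gauss(0.0, 1.0) for _ in range(500)]
sta_a = src[5:405]   # station A recording
sta_b = src[:400]    # station B sees the same wavefield 5 samples later
best_lag = max(xcorr(sta_a, sta_b, 20), key=lambda t: t[1])[0]
print(best_lag)  # → 5
```

    In the virtual-source picture, station A acts as the "shot" and the lag of the peak is the surface-wave travel time to station B.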

  12. HANFORD DOUBLE SHELL TANK (DST) THERMAL & SEISMIC PROJECT SEISMIC ANALYSIS OF HANFORD DOUBLE SHELL TANKS

    SciTech Connect

    MACKEY, T.C.

    2006-03-17

    M&D Professional Services, Inc. (M&D) is under subcontract to Pacific Northwest National Laboratory (PNNL) to perform seismic analysis of the Hanford Site double-shell tanks (DSTs) in support of a project entitled ''Double-Shell Tank (DST) Integrity Project - DST Thermal and Seismic Analyses''. The overall scope of the project is to complete an up-to-date comprehensive analysis of record of the DST system at Hanford in support of Tri-Party Agreement Milestone M-48-14. The work described herein was performed in support of the seismic analysis of the DSTs. The thermal and operating loads analysis of the DSTs is documented in Rinker et al. (2004). The work statement provided to M&D (PNNL 2003) required that the seismic analysis of the DSTs assess the impacts of potentially non-conservative assumptions in previous analyses and account for the additional soil mass due to the as-found soil density increase, the effects of material degradation, additional thermal profiles applied to the full structure including the soil-structure response with the footings, the non-rigid (low-frequency) response of the tank roof, the asymmetric seismic-induced soil loading, the structural discontinuity between the concrete tank wall and the support footing, and the sloshing of the tank waste. The seismic analysis considers the interaction of the tank with the surrounding soil and the effects of the primary tank contents. The DSTs and the surrounding soil are modeled as a system of finite elements. The depth and width of the soil incorporated into the analysis model are sufficient to obtain appropriately accurate analytical results.
The analyses required to support the work statement differ from previous analyses of the DSTs in that the soil-structure interaction (SSI) model includes several (nonlinear) contact surfaces in the tank structure, and the contained waste must be modeled explicitly in order to capture the fluid-structure interaction behavior between the primary tank and contained

  13. Weighted network analysis of earthquake seismic data

    NASA Astrophysics Data System (ADS)

    Chakraborty, Abhijit; Mukherjee, G.; Manna, S. S.

    2015-09-01

    Three different earthquake seismic data sets are used to construct earthquake networks following the prescription of Abe and Suzuki (2004). It has been observed that different links of these networks appear with highly different strengths. This prompted us to extend the study of earthquake networks by treating them as weighted networks. Several properties of the weighted networks are found to differ markedly from those of their unweighted counterparts.

  14. Seismic hazards in Thailand: a compilation and updated probabilistic analysis

    NASA Astrophysics Data System (ADS)

    Pailoplee, Santi; Charusiri, Punya

    2016-06-01

    A probabilistic seismic hazard analysis (PSHA) for Thailand was performed and compared to those of previous works. This PSHA was based upon (1) the most up-to-date paleoseismological data (slip rates), (2) the seismic source zones, (3) the seismicity parameters ( a and b values), and (4) the strong ground-motion attenuation models suggested as being suitable models for Thailand. For the PSHA mapping, both the ground shaking and probability of exceedance (POE) were analyzed and mapped using various methods of presentation. In addition, site-specific PSHAs were demonstrated for ten major provinces within Thailand. For instance, a 2 and 10 % POE in the next 50 years of a 0.1-0.4 g and 0.1-0.2 g ground shaking, respectively, was found for western Thailand, defining this area as the most earthquake-prone region evaluated in Thailand. In a comparison between the ten selected specific provinces within Thailand, the Kanchanaburi and Tak provinces had comparatively high seismic hazards, and therefore, effective mitigation plans for these areas should be made. Although Bangkok was defined as being within a low seismic hazard in this PSHA, a further study of seismic wave amplification due to the soft soil beneath Bangkok is required.

  15. A robust polynomial principal component analysis for seismic noise attenuation

    NASA Astrophysics Data System (ADS)

    Wang, Yuchen; Lu, Wenkai; Wang, Benfeng; Liu, Lei

    2016-12-01

    Random and coherent noise attenuation is a significant aspect of seismic data processing, especially for pre-stack seismic data flattened by normal moveout correction or migration. Signal extraction is widely used for pre-stack seismic noise attenuation. Principle component analysis (PCA), one of the multi-channel filters, is a common tool to extract seismic signals, which can be realized by singular value decomposition (SVD). However, when applying the traditional PCA filter to seismic signal extraction, the result is unsatisfactory with some artifacts when the seismic data is contaminated by random and coherent noise. In order to directly extract the desired signal and fix those artifacts at the same time, we take into consideration the amplitude variation with offset (AVO) property and thus propose a robust polynomial PCA algorithm. In this algorithm, a polynomial constraint is used to optimize the coefficient matrix. In order to simplify this complicated problem, a series of sub-optimal problems are designed and solved iteratively. After that, the random and coherent noise can be effectively attenuated simultaneously. Applications on synthetic and real data sets note that our proposed algorithm can better suppress random and coherent noise and have a better performance on protecting the desired signals, compared with the local polynomial fitting, conventional PCA and a L1-norm based PCA method.

  16. State of art of seismic design and seismic hazard analysis for oil and gas pipeline system

    NASA Astrophysics Data System (ADS)

    Liu, Aiwen; Chen, Kun; Wu, Jian

    2010-06-01

    The purpose of this paper is to adopt the uniform confidence method in both water pipeline design and oil-gas pipeline design. Based on the importance of pipeline and consequence of its failure, oil and gas pipeline can be classified into three pipe classes, with exceeding probabilities over 50 years of 2%, 5% and 10%, respectively. Performance-based design requires more information about ground motion, which should be obtained by evaluating seismic safety for pipeline engineering site. Different from a city’s water pipeline network, the long-distance oil and gas pipeline system is a spatially linearly distributed system. For the uniform confidence of seismic safety, a long-distance oil and pipeline formed with pump stations and different-class pipe segments should be considered as a whole system when analyzing seismic risk. Considering the uncertainty of earthquake magnitude, the design-basis fault displacements corresponding to the different pipeline classes are proposed to improve deterministic seismic hazard analysis (DSHA). A new empirical relationship between the maximum fault displacement and the surface-wave magnitude is obtained with the supplemented earthquake data in East Asia. The estimation of fault displacement for a refined oil pipeline in Wenchuan M S8.0 earthquake is introduced as an example in this paper.

  17. A graph-based network-vulnerability analysis system

    SciTech Connect

    Swiler, L.P.; Phillips, C.; Gaylor, T.

    1998-01-01

    This report presents a graph-based approach to network vulnerability analysis. The method is flexible, allowing analysis of attacks from both outside and inside the network. It can analyze risks to a specific network asset, or examine the universe of possible consequences following a successful attack. The analysis system requires as input a database of common attacks, broken into atomic steps, specific network configuration and topology information, and an attacker profile. The attack information is matched with the network configuration information and an attacker profile to create a superset attack graph. Nodes identify a stage of attack, for example the class of machines the attacker has accessed and the user privilege level he or she has compromised. The arcs in the attack graph represent attacks or stages of attacks. By assigning probabilities of success on the arcs or costs representing level-of-effort for the attacker, various graph algorithms such as shortest-path algorithms can identify the attack paths with the highest probability of success.

  18. Structural Identification And Seismic Analysis Of An Existing Masonry Building

    SciTech Connect

    Del Monte, Emanuele; Galano, Luciano; Ortolani, Barbara; Vignoli, Andrea

    2008-07-08

    The paper presents the diagnostic investigation and the seismic analysis performed on an ancient masonry building in Florence. The building has historical interest and is subjected to conservative restrictions. The investigation involves a preliminary phase concerning the research of the historic documents and a second phase of execution of in situ and laboratory tests to detect the mechanical characteristics of the masonry. This investigation was conceived in order to obtain the 'LC2 Knowledge Level' and to perform the non-linear pushover analysis according to the new Italian Standards for seismic upgrading of existing masonry buildings.

  19. Development and implementation of a GIS-based tool for spatial modeling of seismic vulnerability of Tehran

    NASA Astrophysics Data System (ADS)

    Hashemi, M.; Alesheikh, A. A.

    2012-12-01

    Achieving sustainable development in countries prone to earthquakes is possible with taking effective measures to reduce vulnerability to earthquakes. In this context, damage assessment of hypothetical earthquakes and planning for disaster management are important issues. Having a computer tool capable of estimating structural and human losses from earthquakes in a specific region may facilitate the decision-making process before and during disasters. Interoperability of this tool with wide-spread spatial analysis frameworks will expedite the data transferring process. In this study, the earthquake damage assessment (EDA) software tool is developed as an embedded extension within a GIS (geographic information system) environment for the city of Tehran, Iran. This GIS-based extension provides users with a familiar environment to estimate and observe the probable damages and fatalities of a deterministic earthquake scenario. The productivity of this tool is later demonstrated for southern Karoon parish, Region 10, Tehran. Three case studies for three active faults in the area and a comparison of the results with other research substantiated the reliability of this tool for additional earthquake scenarios.

  20. CORSSA: The Community Online Resource for Statistical Seismicity Analysis

    USGS Publications Warehouse

    Michael, Andrew J.; Wiemer, Stefan

    2010-01-01

    Statistical seismology is the application of rigorous statistical methods to earthquake science with the goal of improving our knowledge of how the earth works. Within statistical seismology there is a strong emphasis on the analysis of seismicity data in order to improve our scientific understanding of earthquakes and to improve the evaluation and testing of earthquake forecasts, earthquake early warning, and seismic hazards assessments. Given the societal importance of these applications, statistical seismology must be done well. Unfortunately, a lack of educational resources and available software tools make it difficult for students and new practitioners to learn about this discipline. The goal of the Community Online Resource for Statistical Seismicity Analysis (CORSSA) is to promote excellence in statistical seismology by providing the knowledge and resources necessary to understand and implement the best practices, so that the reader can apply these methods to their own research. This introduction describes the motivation for and vision of CORRSA. It also describes its structure and contents.

  1. A power flow based model for the analysis of vulnerability in power networks

    NASA Astrophysics Data System (ADS)

    Wang, Zhuoyang; Chen, Guo; Hill, David J.; Dong, Zhao Yang

    2016-10-01

    An innovative model which considers power flow, one of the most important characteristics in a power system, is proposed for the analysis of power grid vulnerability. Moreover, based on the complex network theory and the Max-Flow theorem, a new vulnerability index is presented to identify the vulnerable lines in a power grid. In addition, comparative simulations between the power flow based model and existing models are investigated on the IEEE 118-bus system. The simulation results demonstrate that the proposed model and the index are more effective in power grid vulnerability analysis.

  2. A Decision Analysis Tool for Climate Impacts, Adaptations, and Vulnerabilities

    SciTech Connect

    Omitaomu, Olufemi A; Parish, Esther S; Nugent, Philip J

    2016-01-01

    Climate change related extreme events (such as flooding, storms, and drought) are already impacting millions of people globally at a cost of billions of dollars annually. Hence, there are urgent needs for urban areas to develop adaptation strategies that will alleviate the impacts of these extreme events. However, lack of appropriate decision support tools that match local applications is limiting local planning efforts. In this paper, we present a quantitative analysis and optimization system with customized decision support modules built on geographic information system (GIS) platform to bridge this gap. This platform is called Urban Climate Adaptation Tool (Urban-CAT). For all Urban-CAT models, we divide a city into a grid with tens of thousands of cells; then compute a list of metrics for each cell from the GIS data. These metrics are used as independent variables to predict climate impacts, compute vulnerability score, and evaluate adaptation options. Overall, the Urban-CAT system has three layers: data layer (that contains spatial data, socio-economic and environmental data, and analytic data), middle layer (that handles data processing, model management, and GIS operation), and application layer (that provides climate impacts forecast, adaptation optimization, and site evaluation). The Urban-CAT platform can guide city and county governments in identifying and planning for effective climate change adaptation strategies.

  3. Analysis and Defense of Vulnerabilities in Binary Code

    DTIC Science & Technology

    2008-09-29

    additional web browser vulnerabilities. The client-end web - browser vulnerabilities come from the Month of Browser Bugs (MOBB) website [7]. These bugs...program written as part of a web page. Modern web - browsers allow for various scripting languages, such as JScript, JavaScript, and VBScript. Scripting...languages extend basic HTML with the ability to call native methods on the browser’s computer, e.g., ActiveX controls. When the web - browser renders a

  4. Principal Component Analysis for pattern recognition in volcano seismic spectra

    NASA Astrophysics Data System (ADS)

    Unglert, Katharina; Jellinek, A. Mark

    2016-04-01

    Variations in the spectral content of volcano seismicity can relate to changes in volcanic activity. Low-frequency seismic signals often precede or accompany volcanic eruptions. However, they are commonly manually identified in spectra or spectrograms, and their definition in spectral space differs from one volcanic setting to the next. Increasingly long time series of monitoring data at volcano observatories require automated tools to facilitate rapid processing and aid with pattern identification related to impending eruptions. Furthermore, knowledge transfer between volcanic settings is difficult if the methods to identify and analyze the characteristics of seismic signals differ. To address these challenges we have developed a pattern recognition technique based on a combination of Principal Component Analysis and hierarchical clustering applied to volcano seismic spectra. This technique can be used to characterize the dominant spectral components of volcano seismicity without the need for any a priori knowledge of different signal classes. Preliminary results from applying our method to volcanic tremor from a range of volcanoes including K¯ı lauea, Okmok, Pavlof, and Redoubt suggest that spectral patterns from K¯ı lauea and Okmok are similar, whereas at Pavlof and Redoubt spectra have their own, distinct patterns.

  5. Seismic Response Analysis and Design of Structure with Base Isolation

    SciTech Connect

    Rosko, Peter

    2010-05-21

    The paper reports the study on seismic response and energy distribution of a multi-story civil structure. The nonlinear analysis used the 2003 Bam earthquake acceleration record as the excitation input to the structural model. The displacement response was analyzed in time domain and in frequency domain. The displacement and its derivatives result energy components. The energy distribution in each story provides useful information for the structural upgrade with help of added devices. The objective is the structural displacement response minimization. The application of the structural seismic response research is presented in base-isolation example.

  6. Elastic structure and seismicity of Donegal (Ireland): insights from passive seismic analysis

    NASA Astrophysics Data System (ADS)

    Piana Agostinetti, Nicola

    2016-04-01

    Ireland's crust is the result of a complex geological history, which began in the Palaeozoic with the oblique closure of the Iapetus Ocean and, probably, it is still on-going. In the northwestern portion of the island, the geology of Donegal has been the subject of detailed geological investigation by many workers in the last century. The most widely represented rock types in Donegal are metasediments of Dalradian and Moinian age, invaded by several granites of Caledonian age (so called Donegal granite). Smaller and separate intrusions are present (e.g. Fanad Head). On the contrary, it is widely accepted that the the deep crustal structure of the northern portion of Ireland has been re-worked in more recent time. The several phases of lithospheric stretching associated to the opening of the Atlantic ocean interested such portion of Ireland, with the extrusion of flood basalts. Moreover, the presence of a hot, low-density asthenospheric plume spreading from Iceland has been suggested, with the formation of a thick high-velocity layer of magmatic underplated material at the base of the crust. Oddly, at present, Donegal is the only seismically active area in Ireland, with an average rate of one Mw=2-3 event every 3-4 years. In the last three years, passive seismic data have been recorded at 12 seismic stations deployed across the most seismically active area in Co. Donegal, with the aim of reconstructing the seismic structure down to the upper-mantle depth and of locating the microseismic activity within investigating volume. Both local and teleseismic events were recorded giving the opportunity of integrating results form different techniques for seismic data analysis, and jointly interpret them together with surface geology and mapped fault traces. Local events have been used to define constrain faulting volumes, focal mechanisms and to reconstruct a low-resolution 3D Vp and VpVs velocity models. Teleseismic events have been used to compute receiver function data

  7. Appalachian Play Fairway Analysis Seismic Hazards Supporting Data

    SciTech Connect

    Frank Horowitz

    2016-07-20

    These are the data used in estimating the seismic hazards (both natural and induced) for candidate direct use geothermal locations in the Appalachian Basin Play Fairway Analysis by Jordan et al. (2015). xMin,yMin -83.1407,36.7461 : xMax,yMax -71.5175,45.1729

  8. Challenges to Seismic Hazard Analysis of Critical Infrastructures

    NASA Astrophysics Data System (ADS)

    Klügel, J.

    2005-12-01

    Based on the background of the review of a large scale probabilistic seismic hazard analysis (PSHA) performed in Switzerland for the sites of Swiss nuclear power plants- the PEGASOS project (2000-2004) - challenges to seismic hazard analysis of critical infrastructures from the perspective of a professional safety analyst are discussed. The PEGASOS study was performed to provide a meaningful input for the update of the plant specific PRAs (Probabilistic Risk Assessment) of Swiss nuclear power plants. Earlier experience had shown that the results of these studies to a large extend are driven by the results of the seismic hazard analysis. The PEGASOS-study was performed in full compliance with the procedures developed by the Senior Seismic Hazard Analysis Committee (SSHAC) of U.S.A (SSHAC, 1997) developed for the treatment of uncertainties by the use of a structured expert elicitation process. The preliminary results derived from the project did show an unexpected amount of uncertainty and were regarded as not suitable for direct application. A detailed review of the SSHAC-methodology revealed a number of critical issues with respect to the treatment of uncertainties and the mathematical models applied, which will be presented in the paper. The most important issued to be discussed are: * The ambiguous solution of PSHA-logic trees * The inadequate mathematical treatment of the results of expert elicitations based on the assumption of bias free expert estimates * The problems associated with the "think model" of the separation of epistemic and aleatory uncertainties * The consequences of the ergodic assumption used to justify the transfer of attenuation equations of other regions to the region of interest. Based on these observations methodological questions with respect to the development of a risk-consistent design basis for new nuclear power plants as required by the U.S. NRC RG 1.165 will be evaluated. As an principal alternative for the development of a

  9. Subsalt risk reduction using seismic sequence stratigraphic analysis

    SciTech Connect

    Wornardt, W.W. Jr.

    1994-12-31

    Several recent projects involving detailed seismic sequence stratigraphic analysis of existing wells near subsalt prospects in the south additions of the offshore Louisiana area in the Gulf of Mexico have demonstrated the utility of using seismic sequence stratigraphic analysis to reduce risk when drilling subsalt plays. First, the thick section of sedimentary rocks that was though to be above and below the salt was penetrated in the area away from the salt. These sedimentary rocks were accurately dated using maximum flooding surface first occurrence downhole of important bioevent, condensed sections, abundance and diversity histograms, and high-resolution biostratigraphy while the wells were being drilled. Potential reservoir sandstones within specific Vail sequences in these wells were projected using seismic data up to the subsalt and non-subsalt sediment interface. The systems tract above and below the maximum flooding surface and the type of reservoir sandstones that were to be encounterd were predictable based on the paleobathymetry, increase and decrease of fauna and flora, recognition of the bottom-set turbidite, slope fan and basin floor fan condensed sections, and superpositional relationship of the Vail sequences and systems tracts to provide a detailed sequence stratigraphic analysis of the well. Subsequently, wells drilled through the salt could be accurately correlated with Vail sequences and systems tracts in wells that were previously correlated away from the salt layer with seismic reflection profiles.

  10. Subsalt risk reduction using seismic sequence-stratigraphic analysis

    SciTech Connect

    Wornardt, W.W. Jr.

    1994-09-01

    Several recent projects involving detailed seismic-sequence stratigraphic analysis of existing wells near subsalt prospects in the south additions of the offshore Louisiana area in the Gulf of Mexico have demonstrated the utility of using seismic sequence-stratigraphic analysis to reduce risk when drilling subsalt plays. First, the thick section of sediments that was thought to be above and below the salt was penetrated in the area away from the salt. These sediments were accurately dated using maximum flooding surface first occurrence downhole of important bioevent, condensed sections, abundance and diversity histograms, and high-resolution biostratigraphy while the wells were being drilled. Potential reservoir sands within specific Vail sequences in these wells were projected on seismic up to the subsalt and non-subsalt sediment interface. The systems tract above and below the maximum flooding surface and the type of reservoir sands that were to be encountered were predictable based on the paleobathymetry, increase and decrease of fauna and flora abundance, recognition of the bottom-set turbidite, slope fan and basin floor fan condensed sections, and superpositional relationship of the Vail sequences and systems tracts to provide a detailed sequence-stratigraphic analysis of the well in question. Subsequently, the wells drilled through the salt could be accurately correlated with the Vail sequences and systems tracts in wells that were previously correlated with seismic reflection profiles away from the salt layer.

  11. Probabilistic Seismic Hazard Analysis of Injection-Induced Seismicity Utilizing Physics-Based Simulation

    NASA Astrophysics Data System (ADS)

    Johnson, S.; Foxall, W.; Savy, J. B.; Hutchings, L. J.

    2012-12-01

    Risk associated with induced seismicity is a significant factor in the design, permitting and operation of enhanced geothermal, geological CO2 sequestration, wastewater disposal, and other fluid injection projects. The conventional probabilistic seismic hazard analysis (PSHA) approach provides a framework for estimation of induced seismicity hazard but requires adaptation to address the particular occurrence characteristics of induced earthquakes and to estimation of the ground motions they generate. The assumption often made in conventional PSHA of Poissonian earthquake occurrence in both space and time is clearly violated by seismicity induced by an evolving pore pressure field. Our project focuses on analyzing hazard at the pre-injection design and permitting stage, before an induced earthquake catalog can be recorded. In order to accommodate the commensurate lack of pre-existing data, we have adopted a numerical physics-based approach to synthesizing and estimating earthquake frequency-magnitude distributions. Induced earthquake sequences are generated using the program RSQSIM (Dieterich and Richards-Dinger, PAGEOPH, 2010) augmented to simulate pressure-induced shear failure on faults and fractures embedded in a 3D geological structure under steady-state tectonic shear loading. The model uses available site-specific data on rock properties and in-situ stress, and generic values of frictional properties appropriate to the shallow reservoir depths at which induced events usually occur. The space- and time-evolving pore pressure field is coupled into the simulation from a multi-phase flow model. In addition to potentially damaging ground motions, induced seismicity poses a risk of perceived nuisance in nearby communities caused by relatively frequent, low magnitude earthquakes. Including these shallow local earthquakes in the hazard analysis requires extending the magnitude range considered to as low as M2 and the frequency band to include the short

  12. Variability and Uncertainty in Probabilistic Seismic Hazard Analysis for the Island of Montreal

    NASA Astrophysics Data System (ADS)

    Elkady, Ahmed Mohamed Ahmed

    The current seismic design process for structures in Montreal is based on the 2005 edition of the National Building Code of Canada (NBCC 2005) which is based on a hazard level corresponding to a probability of exceedence of 2% in 50 years. The code is based on the Uniform Hazard Spectrum (UHS) and deaggregation values obtained by Geological Survey of Canada (GSC) modified version of F-RISK software and were obtained by a process that did not formally consider epistemic uncertainty. Epistemic uncertainty is related to the uncertainty in model formulation. A seismological model consists of seismic sources (source geometry, source location, recurrence rate, magnitude distribution, and maximum magnitude) and a Ground-Motion Prediction Equation (GMPE). In general, and particularly Montreal, GMPEs are the main source of epistemic uncertainty with respect to other variables of seismological the model. The objective of this thesis is to use CRISIS software to investigate the effect of epistemic uncertainty on probabilistic seismic hazard analysis (PSHA) products like the UHS and deaggregation values by incorporating different new GMPEs. The epsilon "epsilon" parameter is also discussed which represents the departure of the target ground motion from that predicted by the GMPE as it is not very well documented in Eastern Canada. A method is proposed to calculate epsilon values for Montreal relative to a given GMPE and to calculate robust weighted modal epsilon values when epistemic uncertainty is considered. Epsilon values are commonly used in seismic performance evaluations for identifying design events and selecting ground motion records for vulnerability and liquefaction studies. A brief overview of record epsilons is also presented which accounts for the spectral shape of the ground motion time history is also presented.

  13. The hazard in using probabilistic seismic hazard analysis

    SciTech Connect

    Krinitzsky, E.L. . Geotechnical Lab.)

    1993-11-01

    Earthquake experts rely on probabilistic seismic hazard analysis for everything from emergency-response planning to development of building codes. Unfortunately, says the author, the analysis is defective for the large earthquakes that pose the greatest risks. Structures have short lifetimes, and the distances over which earthquakes cause damage are relatively small. Exceptions serve to prove the rule. To be useful in engineering, earthquake hazard assessment must focus narrowly in both time and space.

  14. Real Option Cost Vulnerability Analysis of Electrical Infrastructure

    NASA Astrophysics Data System (ADS)

    Prime, Thomas; Knight, Phil

    2015-04-01

    Critical infrastructure such as electricity substations are vulnerable to various geo-hazards that arise from climate change, ranging from increased vegetation growth to increased temperatures and flood inundation. Of all the identified geo-hazards, coastal flooding has the greatest impact but has, to date, had a low probability of occurring. In the face of climate change, however, coastal flooding is likely to occur more often because extreme water levels will be experienced more frequently as a result of sea-level rise (SLR). Knowing what impact coastal flooding will have, now and in the future, on critical infrastructure such as electrical substations is important for long-term management. Using a flood inundation model, present day and future flood events have been simulated, from 1 in 1 year events up to 1 in 10,000 year events. The modelling makes an integrated assessment of impact by using sea-level and surge to simulate a storm tide. The geographical area the model covers is part of the Northwest UK coastline, with a range of urban and rural areas. The ensemble of flood maps generated allows the identification of critical infrastructure exposed to coastal flooding. Vulnerability has been assessed using an Estimated Annual Damage (EAD) value. Sampling SLR annual probability distributions produces a projected "pathway" for SLR up to 2100. EAD is then calculated using a relationship derived from the flood model. Repeating the sampling process produces a distribution of EAD up to 2100. These values are discounted to present-day values using an appropriate discount rate. If the cost of building and maintaining defences is then subtracted, a Net Present Value (NPV) of building the defences can be calculated. This distribution of NPV can be used as part of a cost modelling process involving Real Options: a real option is the right, but not the obligation, to undertake investment decisions. 
In terms of investment in critical infrastructure resilience this
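The sampling-and-discounting chain described above (SLR pathway, EAD from a damage relation, discounting, NPV of defences) can be sketched as a Monte Carlo loop. Every number below, the SLR range, the linear damage relation, the discount rate and the defence cost, is a hypothetical placeholder, not a value from the study.

```python
import random

def npv_of_defences(n_samples=1000, horizon=2100, base_year=2025,
                    discount_rate=0.035, defence_cost=5e6,
                    ead_per_m_slr=2e6, seed=42):
    """Monte Carlo sketch: sample an SLR pathway, convert it to Estimated
    Annual Damage (EAD) avoided by the defences, discount to present value,
    and subtract the cost of building the defences."""
    rng = random.Random(seed)
    npvs = []
    for _ in range(n_samples):
        # Hypothetical SLR pathway: linear rise to a random 2100 end-point (m)
        slr_2100 = rng.uniform(0.3, 1.0)
        pv_benefit = 0.0
        for year in range(base_year, horizon + 1):
            slr = slr_2100 * (year - base_year) / (horizon - base_year)
            ead = ead_per_m_slr * slr          # assumed linear damage relation
            pv_benefit += ead / (1 + discount_rate) ** (year - base_year)
        npvs.append(pv_benefit - defence_cost)
    return npvs
```

The resulting NPV distribution, rather than a single NPV figure, is what feeds a Real Options analysis of when (not just whether) to invest.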

  15. The application of seismic risk-benefit analysis to land use planning in Taipei City.

    PubMed

    Hung, Hung-Chih; Chen, Liang-Chun

    2007-09-01

    In the developing countries of Asia local authorities rarely use risk analysis instruments as a decision-making support mechanism during planning and development procedures. The main purpose of this paper is to provide a methodology to enable planners to undertake such analyses. We illustrate a case study of seismic risk-benefit analysis for the city of Taipei, Taiwan, using available land use maps and surveys as well as a new tool developed by the National Science Council in Taiwan--the HAZ-Taiwan earthquake loss estimation system. We use three hypothetical earthquakes to estimate casualties and total and annualised direct economic losses, and to show their spatial distribution. We also characterise the distribution of vulnerability over the study area using cluster analysis. A risk-benefit ratio is calculated to express the levels of seismic risk attached to alternative land use plans. This paper suggests ways to perform earthquake risk evaluations and the authors intend to assist city planners to evaluate the appropriateness of their planning decisions.

  16. An integrated analysis of controlled- and passive source seismic data

    NASA Astrophysics Data System (ADS)

    Rumpfhuber, Eva-Maria

    This dissertation consists of two parts, one using passive source seismic data and one using the dataset from a large-scale refraction/wide-angle reflection seismic experiment, as the basis for an integrated analysis. The goal of the dissertation is the integration of the two different datasets and a combined interpretation of the results of the "Continental Dynamics of the Rocky Mountains" (CD-ROM) 1999 seismic experiment. I have determined the crustal structure using four different receiver function methods applied to data collected from the northern transect of the CD-ROM passive seismic experiment. The resulting migrated image and crustal thickness determinations confirm and define prior crustal thickness measurements based on the CD-ROM and Deep Probe datasets. The new results show a very strong lower crustal layer (LCL) with variable thickness beneath the Wyoming Province. In addition, I was able to show that it terminates at 42° latitude and to provide a seismic tie between the CD-ROM and Deep Probe seismic experiments, so that they represent a continuous N-S transect extending from New Mexico into Alberta, Canada. This new tie is particularly important because it occurs close to a major tectonic boundary, the Cheyenne belt, between an Archean craton and a Proterozoic terrane. The controlled-source seismic dataset was analyzed with the aid of forward modeling and inversion to establish a two-dimensional velocity and interface model of the area. I have developed a picking strategy that helps identify the seismic phases and improves the quality and quantity of the picks. In addition, I was able to pick and identify S-wave phases, which allowed me to establish an independent S-wave model and hence the Poisson's and Vp/Vs ratios. The final velocity and interface model was compared to prior results, and the results were jointly interpreted with the receiver function results. 
Thanks to the integration of the controlled-source and receiver function

  17. A Bayesian Seismic Hazard Analysis for the city of Naples

    NASA Astrophysics Data System (ADS)

    Faenza, Licia; Pierdominici, Simona; Hainzl, Sebastian; Cinti, Francesca R.; Sandri, Laura; Selva, Jacopo; Tonini, Roberto; Perfetti, Paolo

    2016-04-01

    In recent years many studies have focused on the determination and definition of the seismic, volcanic and tsunamigenic hazard in the city of Naples, because Naples and its neighboring area form one of the most densely populated places in Italy. In addition, the risk is increased by the type and condition of buildings and monuments in the city. It is therefore crucial to assess which active faults in Naples and the surrounding area could trigger an earthquake able to shake and damage the urban area. We collect data from the most reliable and complete databases of macroseismic intensity records (from 79 AD to present), and an active tectonic structure has been associated with each seismic event. Furthermore, a set of active faults around the study area, well known from geological investigations and capable of shaking the city but not associated with any recorded earthquake, has also been taken into account. This geological framework is the starting point for our Bayesian seismic hazard analysis for the city of Naples. We show the feasibility of formulating the hazard assessment procedure to include the information of past earthquakes in the probabilistic seismic hazard analysis. This strategy allows us, on the one hand, to enlarge the information used in the evaluation of the hazard, from alternative models for the earthquake generation process to past shaking, and on the other hand, to explicitly account for all kinds of information and their uncertainties. The Bayesian scheme we propose is applied to evaluate the seismic hazard of Naples. We implement five different spatio-temporal models to parameterize the occurrence of earthquakes potentially dangerous for Naples, and subsequently combine these hazard curves with ShakeMaps of past earthquakes that have been felt in Naples. 
The results are posterior hazard assessments for three exposure times, i.e., 50, 10 and 5 years, on a dense grid that covers the municipality of Naples, considering bedrock soil
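A minimal illustration of folding past earthquake observations into a hazard estimate, in the spirit of the Bayesian scheme described above, is a conjugate Gamma-Poisson update of the annual rate of damaging shaking. The conjugate model and all parameter values are illustrative assumptions, not the authors' five spatio-temporal models.

```python
def posterior_exceedance_prob(alpha, beta, n_events, years_observed, t):
    """Gamma-Poisson update: a Gamma(alpha, beta) prior on the annual rate
    of damaging shaking is updated with n_events observed in years_observed
    years of macroseismic records. Returns the posterior predictive
    probability of at least one exceedance in an exposure time of t years."""
    a_post = alpha + n_events
    b_post = beta + years_observed
    # P(N >= 1 in t years) under the negative-binomial predictive distribution
    return 1.0 - (b_post / (b_post + t)) ** a_post

# Hypothetical: 12 damaging shakings in 300 years of records, weak prior
p_50yr = posterior_exceedance_prob(1.0, 10.0, 12, 300, t=50)
p_5yr = posterior_exceedance_prob(1.0, 10.0, 12, 300, t=5)
```

The same posterior can be evaluated for each exposure time of interest (50, 10, 5 years) at each grid cell.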

  18. Seismic performance analysis of Tendaho earth fill dam, Ethiopia.

    NASA Astrophysics Data System (ADS)

    Berhe, T.; Wu, W.

    2009-04-01

    The Tendaho dam is located in the Afar regional state in the northeastern part of Ethiopia, within an area known as the 'Tendaho Graben', which forms the center of the Afar triangle, a low-lying area of land where the East African, Red Sea and Gulf of Aden rift systems converge. The dam is an earthfill dam with a volume of about 4 million cubic meters and a mixed clay core. The geological setting of the dam site, the geotechnical properties of the dam materials and the seismicity of the region are reviewed. Based on this review, the foundation materials and dam body include some liquefiable granular soils. Moreover, the active East African Rift Valley fault, which can generate an earthquake of magnitude greater than 6, passes through the dam body; this rift valley is the primary seismic source contributing to the hazard at the Tendaho dam site. The presence of liquefiable materials beneath and within the dam body and of the active fault crossing the dam site demands a thorough seismic analysis of the dam. The peak ground acceleration (PGA) is selected as a measure of ground motion severity, following the guidelines of the International Commission on Large Dams (ICOLD). Based on the criteria set by ICOLD, the dam is analyzed for two different earthquake levels, the Maximum Credible Earthquake (MCE) and the Operating Basis Earthquake (OBE). Numerical codes are useful tools for investigating the safety of dams in seismically active areas. In this paper, the FLAC3D numerical tool is used to investigate the performance of the dam under dynamic loading, and its seismic performance is assessed on the basis of this numerical analysis.

  19. Probabilistic seismic demand analysis using advanced ground motion intensity measures

    USGS Publications Warehouse

    Tothong, P.; Luco, N.

    2007-01-01

    One of the objectives in performance-based earthquake engineering is to quantify the seismic reliability of a structure at a site. For that purpose, probabilistic seismic demand analysis (PSDA) is used as a tool to estimate the mean annual frequency of exceeding a specified value of a structural demand parameter (e.g. interstorey drift). This paper compares and contrasts the use, in PSDA, of certain advanced scalar versus vector and conventional scalar ground motion intensity measures (IMs). One of the benefits of using a well-chosen IM is that more accurate evaluations of seismic performance are achieved without the need to perform detailed ground motion record selection for the nonlinear dynamic structural analyses involved in PSDA (e.g. record selection with respect to seismic parameters such as earthquake magnitude, source-to-site distance, and ground motion epsilon). For structural demands that are dominated by a first mode of vibration, using inelastic spectral displacement (Sdi) can be advantageous relative to the conventionally used elastic spectral acceleration (Sa) and the vector IM consisting of Sa and epsilon (ε). This paper demonstrates that this is true for ordinary and for near-source pulse-like earthquake records. The latter ground motions cannot be adequately characterized by either Sa alone or the vector of Sa and ε. For structural demands with significant higher-mode contributions (under either of the two types of ground motions), even Sdi (alone) is not sufficient, so an advanced scalar IM that additionally incorporates higher modes is used.
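The core PSDA quantity, the mean annual frequency of exceeding a demand value, can be sketched as a discrete integration of a probabilistic demand model over a ground motion hazard curve. The lognormal demand model and the hazard-curve points below are illustrative assumptions, not data from the paper.

```python
import math

def lognorm_cdf(x, median, beta):
    """CDF of a lognormal distribution with given median and log-dispersion."""
    return 0.5 * (1 + math.erf(math.log(x / median) / (beta * math.sqrt(2))))

def mean_annual_freq_exceedance(z, hazard_curve, demand_median, demand_beta):
    """Discrete PSDA integration:
        lambda(EDP > z) = sum over IM levels of P(EDP > z | IM) * |d lambda(IM)|.
    hazard_curve: list of (im, annual_freq_of_exceedance), im ascending.
    demand_median(im): median demand given IM (a hypothetical model);
    demand_beta: lognormal dispersion of demand given IM."""
    total = 0.0
    for (im0, lam0), (im1, lam1) in zip(hazard_curve, hazard_curve[1:]):
        im_mid = 0.5 * (im0 + im1)
        p_exceed = 1.0 - lognorm_cdf(z, demand_median(im_mid), demand_beta)
        total += p_exceed * abs(lam0 - lam1)
    return total
```

Swapping the IM used on the hazard-curve axis (Sa versus Sdi versus a vector IM) changes the demand model's dispersion, which is exactly where a well-chosen IM pays off.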

  20. Vulnerabilities, Influences and Interaction Paths: Failure Data for Integrated System Risk Analysis

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Fleming, Land

    2006-01-01

    We describe graph-based analysis methods for identifying and analyzing cross-subsystem interaction risks from subsystem connectivity information. By discovering external and remote influences that would otherwise be unexpected, these methods can support better communication among subsystem designers at points of potential conflict and support the design of more dependable and diagnosable systems. These methods identify hazard causes that can impact vulnerable functions or entities if propagated across interaction paths from the hazard source to the vulnerable target. The analysis can also assess combined impacts of And-Or trees of disabling influences, and can use ratings of hazards and vulnerabilities to calculate cumulative measures of severity and importance. Identification of cross-subsystem hazard-vulnerability pairs and propagation paths across subsystems will increase coverage of hazard and risk analysis and can indicate risk control and protection strategies.
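The propagation-path identification described above can be sketched as a breadth-first search over a subsystem connectivity graph, reporting each hazard-vulnerability pair together with the interaction path that links them. The graph representation and the subsystem names are illustrative, not drawn from the paper.

```python
from collections import deque

def propagation_paths(edges, hazard_sources, vulnerable_targets):
    """Breadth-first search over a directed subsystem-connectivity graph.
    Returns (source, target, path) triples for every vulnerable target
    reachable from a hazard source via interaction edges."""
    graph = {}
    for src, dst in edges:
        graph.setdefault(src, []).append(dst)
    pairs = []
    for source in hazard_sources:
        queue = deque([(source, [source])])
        seen = {source}
        while queue:
            node, path = queue.popleft()
            if node in vulnerable_targets and node != source:
                pairs.append((source, node, path))
            for nxt in graph.get(node, []):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, path + [nxt]))
    return pairs
```

Severity and importance ratings could then be accumulated along each returned path; And-Or combinations of influences would need an extended node model rather than this plain reachability search.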

  1. [Human vulnerability under cosmetic surgery. A bioethic analysis].

    PubMed

    Ramos-Rocha de Viesca, Mariablanca

    2012-01-01

    Cosmetic surgery is one of the best examples of the current health empowerment. Aesthetic surgical interventions have been criticized because they expose a healthy individual to unnecessary risk. In modern society the body has turned into a beauty depository with a commercial value. Published bioethics analyses of cosmetic surgery have focused their attention on freedom, autonomy and distributive justice. Mexico occupies fifth place worldwide in the number of cosmetic surgeries. Vulnerability is an inherent condition of human existence and marks the limit of human dignity, and UNESCO recognizes that some populations are more prone to vulnerability. The aim of this work is to demonstrate that those who wish to make a physical change have given in to social coercion and psychological problems.

  2. Seismic and hydroacoustic analysis relevant to MH370

    SciTech Connect

    Stead, Richard J.

    2014-07-03

    The vicinity of the Indian Ocean is searched for open and readily available seismic and/or hydroacoustic stations that might have recorded a possible impact of MH370 with the ocean surface. Only three stations are identified: the IMS hydrophone arrays H01 and H08, and the Geoscope seismic station AIS. Analysis of the data from these stations shows an interesting arrival on H01 that has some interference from an Antarctic ice event, large amplitude repeating signals at H08 that obscure any possible arrivals, and large amplitude chaotic noise at AIS that precludes any analysis at the higher frequencies of interest. The results are therefore rather inconclusive but may point to a more southerly impact location within the overall Indian Ocean search region. The results would be more useful if they could be combined with other data that are not readily available.

  3. Advanced computational tools for 3-D seismic analysis

    SciTech Connect

    Barhen, J.; Glover, C.W.; Protopopescu, V.A.

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis, and test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 1993-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the innovative computational techniques that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  4. Noise analysis of the seismic system employed in the northern and southern California seismic nets

    USGS Publications Warehouse

    Eaton, J.P.

    1984-01-01

    The seismic networks have been designed and operated to support recording on Develocorders (less than 40 dB dynamic range) and analog magnetic tape (about 50 dB dynamic range). The principal analysis of the records has been based on Develocorder films, and background earth noise levels have been adjusted to be about 1 to 2 mm p-p on the film readers. Since the traces are separated by only 10 to 12 mm on the reader screen, they become hopelessly tangled when signal amplitudes on several adjacent traces exceed 10 to 20 mm p-p. Thus, the background noise level is hardly more than 20 dB below the level of the largest readable signals. The situation is somewhat better on tape playbacks, but the high level of background noise, set to accommodate processing from film records, effectively limits the range of maximum-signal to background-earth-noise on high gain channels to a little more than 30 dB. Introduction of the PDP 11/44 seismic data acquisition system has increased the potential dynamic range of recorded network signals to more than 60 dB. To make use of this increased dynamic range we must evaluate the characteristics and performance of the seismic system. In particular, we must determine whether the electronic noise in the system is, or can be made, sufficiently low that background earth noise levels can be lowered significantly to take advantage of the increased dynamic range of the digital recording system. To come to grips with the complex problem of system noise, we have carried out a number of measurements and experiments to evaluate critical components of the system as well as to determine the noise characteristics of the system as a whole.
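The dynamic-range figures quoted above follow from the standard decibel relation, 20 log10 of an amplitude ratio. A small sketch (the 12-bit digitizer below is an illustrative assumption, not a statement about the PDP 11/44 system):

```python
import math

def dynamic_range_db(max_amplitude, noise_amplitude):
    """Dynamic range in decibels between the largest readable signal
    and the background noise level."""
    return 20.0 * math.log10(max_amplitude / noise_amplitude)

# Film-reader example from the text: ~20 mm p-p maximum vs ~2 mm p-p noise
film_range = dynamic_range_db(20.0, 2.0)   # 20 dB

def digitizer_range_db(bits):
    """Approximate dynamic range of an ideal n-bit digitizer."""
    return 20.0 * math.log10(2 ** bits)
```

On this relation, exceeding 60 dB requires more than 10 bits of effective resolution, which is why lowering electronic and background noise matters before the extra digital range can be exploited.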

  5. US Industrial Base Dependence/Vulnerability. Phase 2. Analysis

    DTIC Science & Technology

    1987-11-01

    assessing the degree of foreign vulnerability, and 3) suggesting some methods for dealing with identified foreign vulnerabilities. The Nature of the...on recovery. It is the last method—creating a buffer stock—which this paper uses as an estimate of the upper bound of the cost of insuring...supplies. Similar methods were used for other problem solutions where appropriate. Findings Although foreign-sourced parts constitute only one to

  6. CORSSA: Community Online Resource for Statistical Seismicity Analysis

    NASA Astrophysics Data System (ADS)

    Zechar, J. D.; Hardebeck, J. L.; Michael, A. J.; Naylor, M.; Steacy, S.; Wiemer, S.; Zhuang, J.

    2011-12-01

    Statistical seismology is critical to the understanding of seismicity, the evaluation of proposed earthquake prediction and forecasting methods, and the assessment of seismic hazard. Unfortunately, despite its importance to seismology-especially to those aspects with great impact on public policy-statistical seismology is mostly ignored in the education of seismologists, and there is no central repository for the existing open-source software tools. To remedy these deficiencies, and with the broader goal to enhance the quality of statistical seismology research, we have begun building the Community Online Resource for Statistical Seismicity Analysis (CORSSA, www.corssa.org). We anticipate that the users of CORSSA will range from beginning graduate students to experienced researchers. More than 20 scientists from around the world met for a week in Zurich in May 2010 to kick-start the creation of CORSSA: the format and initial table of contents were defined; a governing structure was organized; and workshop participants began drafting articles. CORSSA materials are organized with respect to six themes, each of which will contain between four and eight articles. CORSSA now includes seven articles with an additional six in draft form, along with forums for discussion, a glossary, and news about upcoming meetings, special issues, and recent papers. Each article is peer-reviewed and presents a balanced discussion, including illustrative examples and code snippets. Topics in the initial set of articles include: introductions to both CORSSA and statistical seismology; basic statistical tests and their role in seismology; understanding seismicity catalogs and their problems; basic techniques for modeling seismicity; and methods for testing earthquake predictability hypotheses. We have also begun curating a collection of statistical seismology software packages.

  7. Building the Community Online Resource for Statistical Seismicity Analysis (CORSSA)

    NASA Astrophysics Data System (ADS)

    Michael, A. J.; Wiemer, S.; Zechar, J. D.; Hardebeck, J. L.; Naylor, M.; Zhuang, J.; Steacy, S.; Corssa Executive Committee

    2010-12-01

    Statistical seismology is critical to the understanding of seismicity, the testing of proposed earthquake prediction and forecasting methods, and the assessment of seismic hazard. Unfortunately, despite its importance to seismology - especially to those aspects with great impact on public policy - statistical seismology is mostly ignored in the education of seismologists, and there is no central repository for the existing open-source software tools. To remedy these deficiencies, and with the broader goal to enhance the quality of statistical seismology research, we have begun building the Community Online Resource for Statistical Seismicity Analysis (CORSSA). CORSSA is a web-based educational platform that is authoritative, up-to-date, prominent, and user-friendly. We anticipate that the users of CORSSA will range from beginning graduate students to experienced researchers. More than 20 scientists from around the world met for a week in Zurich in May 2010 to kick-start the creation of CORSSA: the format and initial table of contents were defined; a governing structure was organized; and workshop participants began drafting articles. CORSSA materials are organized with respect to six themes, each containing between four and eight articles. The CORSSA web page, www.corssa.org, was officially unveiled on September 6, 2010, debuting with an initial set of approximately 10 to 15 articles available online for viewing and commenting, with additional articles to be added over the coming months. Each article will be peer-reviewed and will present a balanced discussion, including illustrative examples and code snippets. Topics in the initial set of articles will include: introductions to both CORSSA and statistical seismology; basic statistical tests and their role in seismology; understanding seismicity catalogs and their problems; basic techniques for modeling seismicity; and methods for testing earthquake predictability hypotheses. A special article will compare and review

  8. Policies on Protecting Vulnerable People During Disasters in Iran: A Document Analysis

    PubMed Central

    Abbasi Dolatabadi, Zahra; Seyedin, Hesam; Aryankhesal, Aidin

    2016-01-01

    Context Developing official protection policies for disasters is a main strategy in protecting vulnerable people. The aim of this study was to analyze official documents concerning policies on protecting vulnerable people during disasters. Evidence Acquisition This study was conducted by the qualitative document analysis method. Documents were gathered by searching websites and referring to the organizations involved in disaster management. The documents were assessed by a researcher-made data collection form. A directed content analysis approach was used to analyze the retrieved documents regarding the protection policies and legislation for vulnerable people. Results A total of 22 documents were included in the final analysis. Most of the documents referred to women, children, elderly people, the poor, and villagers as vulnerable people. Moreover, the documents did not provide information regarding official measures for protecting vulnerable people during different phases of disaster management. Conclusions A clear and comprehensive definition of “vulnerable people” needs to be formulated, together with official policies to protect them. Given the high prevalence of disasters in Iran, policy makers need to develop effective context-based policies to protect vulnerable people during disasters. PMID:27921019

  9. Four-dimensional seismic analysis of the Hibernia oil field, Grand Banks, Canada

    NASA Astrophysics Data System (ADS)

    Wright, Richard James

    2004-12-01

    The seismic reflection method, traditionally a geologic structural imaging tool, is increasingly being utilized for petroleum reservoir monitoring purposes. Time-lapse, or four-dimensional (4D), seismic reservoir monitoring is the process by which repeated 3D seismic surveys are acquired over a common area during the production of a petroleum reservoir in an effort to spatially image production-related changes. While this method, if successful, can have a significant impact on an oil field's development plan, the sometimes subtle nature of 4D seismic signals restricts the universal application of 4D seismic methods across reservoirs and operating environments. To examine the potential use of 4D seismic on Canada's Grand Banks, this thesis conducts a 4D seismic analysis of the Hibernia oil field, the first example of 4D seismic technology on the Grand Banks. Because Hibernia presents a challenging environment (seismic and reservoir) for 4D seismic success, rock physics modeling predicts a subtle 4D seismic response for areas of both water and gas injection. To equalize the 4D seismic datasets, specialized poststack cross equalization, including a volume event warping process, is applied to two 3D poststack seismic datasets from the Hibernia oil field: a pre-production "legacy" survey acquired in 1991, and a 2001 survey. The cross equalization processing improves the repeatability of non-reservoir events fieldwide and enhances reservoir anomalies in some areas of the field. While the data contain a fair degree of noise, 4D seismic anomalies above the noise level can be imaged in areas of both water and gas injection. Through interpretation, some of these anomalies are shown to be consistent with modeled responses to water and gas injection. In addition, there is evidence that some of the seismic anomalies may be due to pore pressure changes in the reservoir. 
The results of the Hibernia 4D seismic analysis are then used as background for a feasibility analysis for
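Repeatability of cross-equalized time-lapse datasets is commonly quantified with the normalized RMS (NRMS) difference between baseline and monitor traces. This is a standard 4D metric offered as a sketch; the thesis's exact repeatability measures are not specified here.

```python
import math

def nrms(a, b):
    """Normalized RMS difference between two traces, in percent:
    0 for identical traces, around 200 for anti-correlated ones."""
    rms = lambda x: math.sqrt(sum(v * v for v in x) / len(x))
    diff = [u - v for u, v in zip(a, b)]
    return 200.0 * rms(diff) / (rms(a) + rms(b))
```

Computed window-by-window over non-reservoir events, a drop in NRMS after cross equalization is direct evidence that the processing improved fieldwide repeatability.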

  10. EBR-2 (Experimental Breeder Reactor-2) containment seismic analysis

    SciTech Connect

    Gale, J.G.; Lehto, W.K.

    1990-01-01

    The Experimental Breeder Reactor-2 (EBR-2) is a liquid metal reactor located at the Argonne National Laboratory near Idaho Falls, Idaho. At the time the EBR-2 was designed and constructed, there were no engineering society or federal guidelines specifically directed toward the seismic design of reactor containment structures; hence, static analysis techniques were used in the design. With the increased focus on safety of reactor and fuel reprocessing facilities, Argonne has initiated a program to analyze its existing facilities for seismic integrity using current Department of Energy guidelines and industry consensus standards. A seismic analysis of the EBR-2 containment building has been performed using finite-element analysis techniques. The containment building is essentially a vertical right cylindrical steel shell with heads on both ends. The structure is unique in that the interior of the steel shell is lined with reinforced concrete. The actual containment function of the building is served by the steel shell, whereas the concrete liner serves as a missile shield and a thermal insulating shield to protect the steel containment shell from internally generated missiles and fires. Model development and structural evaluation of the EBR-2 containment building are discussed in this paper. 7 refs., 8 figs.

  11. Utilizing Semantic Big Data for realizing a National-scale Infrastructure Vulnerability Analysis System

    SciTech Connect

    Chinthavali, Supriya; Shankar, Mallikarjun

    2016-01-01

    Critical Infrastructure systems (CIs) such as energy, water, transportation and communication are highly interconnected and mutually dependent in complex ways. Robust modeling of CI interconnections is crucial to identify vulnerabilities in the CIs. We present here a national-scale Infrastructure Vulnerability Analysis System (IVAS) vision leveraging Semantic Big Data (SBD) tools, Big Data, and Geographical Information Systems (GIS) tools. We survey existing approaches on vulnerability analysis of critical infrastructures and discuss relevant systems and tools aligned with our vision. Next, we present a generic system architecture and discuss challenges including: (1) constructing and managing a CI network-of-networks graph, (2) performing analytic operations at scale, and (3) interactive visualization of analytic output to generate meaningful insights. We argue that this architecture acts as a baseline to realize a national-scale network based vulnerability analysis system.

  12. Seismic fragility analysis of highway bridges considering multi-dimensional performance limit state

    NASA Astrophysics Data System (ADS)

    Wang, Qi'ang; Wu, Ziyan; Liu, Shukui

    2012-03-01

    Fragility analysis for highway bridges has become increasingly important in the risk assessment of highway transportation networks exposed to seismic hazards. This study introduces a methodology to calculate fragility that considers multi-dimensional performance limit state parameters and makes a first attempt to develop fragility curves for a multispan continuous (MSC) concrete girder bridge considering two performance limit state parameters: column ductility and transverse deformation in the abutments. The main purpose of this paper is to show that the performance limit states, which are compared with the seismic response parameters in the calculation of fragility, should be properly modeled as randomly interdependent variables instead of deterministic quantities. The sensitivity of fragility curves is also investigated when the dependency between the limit states is different. The results indicate that the proposed method can be used to describe the vulnerable behavior of bridges which are sensitive to multiple response parameters and that the fragility information generated by this method will be more reliable and likely to be implemented into transportation network loss estimation.
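The idea of checking seismic demands against randomly interdependent, rather than deterministic, limit states can be sketched with a Monte Carlo fragility calculation: failure is declared when either demand parameter exceeds its correlated random capacity. The two-parameter demand model, capacity statistics and correlation value below are illustrative placeholders, not the bridge model from the paper.

```python
import math
import random

def fragility_2d(im_levels, demand_model, cap_medians, cap_betas, rho,
                 n_samples=5000, seed=1):
    """Monte Carlo sketch of a two-dimensional fragility curve. Capacities
    for the two limit-state parameters (e.g. column ductility and abutment
    deformation) are sampled as correlated lognormal variables."""
    rng = random.Random(seed)
    curve = []
    for im in im_levels:
        d1, d2 = demand_model(im)        # demands at this intensity level
        failures = 0
        for _ in range(n_samples):
            z1 = rng.gauss(0, 1)
            z2 = rho * z1 + math.sqrt(1 - rho ** 2) * rng.gauss(0, 1)
            c1 = cap_medians[0] * math.exp(cap_betas[0] * z1)
            c2 = cap_medians[1] * math.exp(cap_betas[1] * z2)
            if d1 > c1 or d2 > c2:       # failure in either dimension
                failures += 1
        curve.append((im, failures / n_samples))
    return curve
```

Setting `rho` to 0 or 1 shows how the assumed dependency between limit states shifts the fragility curve, which is the sensitivity the paper investigates.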

  13. Multidimensional analysis and probabilistic model of volcanic and seismic activities

    NASA Astrophysics Data System (ADS)

    Fedorov, V.

    2009-04-01

    .I. Gushchenko, 1979) and seismological (database of USGS/NEIC Significant Worldwide Earthquakes, 2150 B.C.-1994 A.D.) information which displays the dynamics of endogenic relief-forming processes over the period 1900 to 1994. In the course of the analysis, the calendar variable was substituted by a corresponding astronomical one and the epoch superposition method was applied. In essence, the method consists in differentiating the massifs of information on volcanic eruptions (1900 to 1977) and seismic events (1900-1994) with respect to the values of astronomical parameters that correspond to the calendar dates of the known eruptions and earthquakes, regardless of the calendar year. The obtained spectra of the distribution of volcanic eruptions and violent earthquakes in the fields of the Earth's orbital movement parameters were used as a basis for calculating frequency spectra and the diurnal probability of volcanic and seismic activity. The objective of the proposed investigations is the development of a probabilistic model of volcanic and seismic events, as well as the design of a GIS for monitoring and forecasting volcanic and seismic activities. In accordance with the stated objective, three probability parameters have been found in the course of preliminary studies; they form the basis for GIS monitoring and forecast development. 1. A multidimensional analysis of volcanic eruptions and earthquakes (of magnitude 7) has been performed in terms of the Earth's orbital movement. Probability characteristics of volcanism and seismicity have been defined for the Earth as a whole. Time intervals have been identified with a diurnal probability twice as great as the mean value. The diurnal probability of volcanic and seismic events has been calculated up to 2020. 2. A regularity in the duration of dormant (repose) periods has been established. 
A relationship has been found between the distribution of the repose period probability density and duration of the period. 3
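The epoch-superposition step described above can be sketched as a simple phase-binning exercise: events are binned by an astronomical phase variable instead of calendar date, and bins whose event probability exceeds twice the uniform mean are flagged. The phases, bin count, and synthetic catalogue below are illustrative assumptions, not the paper's data.

```python
# Epoch-superposition sketch: bin events by an astronomical phase in
# [0, 1) and flag bins with probability > 2x the uniform expectation.

def epoch_superposition(phases, n_bins=36):
    """Return per-bin event probability and bins exceeding twice the mean."""
    counts = [0] * n_bins
    for p in phases:
        counts[int(p % 1.0 * n_bins)] += 1
    total = len(phases)
    prob = [c / total for c in counts]
    mean = 1.0 / n_bins          # uniform expectation per bin
    active = [i for i, pr in enumerate(prob) if pr > 2.0 * mean]
    return prob, active

# Synthetic catalogue: 40 events clustered near phase 0.25 plus 40 uniform
phases = [0.25 + 0.01 * (i % 5) for i in range(40)] + [i / 40 for i in range(40)]
prob, active = epoch_superposition(phases)
```

The flagged bins correspond to the "time intervals with a diurnal probability twice as great as the mean value" mentioned in the abstract.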

  14. Identifying typical patterns of vulnerability: A 5-step approach based on cluster analysis

    NASA Astrophysics Data System (ADS)

    Sietz, Diana; Lüdeke, Matthias; Kok, Marcel; Lucas, Paul; Walther, Carsten; Janssen, Peter

    2013-04-01

    Specific processes that shape the vulnerability of socio-ecological systems to climate, market and other stresses derive from diverse background conditions. Within the multitude of vulnerability-creating mechanisms, distinct processes recur in various regions, inspiring research on typical patterns of vulnerability. The vulnerability patterns display typical combinations of the natural and socio-economic properties that shape a system's vulnerability to particular stresses. Based on the identification of a limited number of vulnerability patterns, pattern analysis provides an efficient approach to improving our understanding of vulnerability and decision-making for vulnerability reduction. However, current pattern analyses often miss explicit descriptions of their methods and pay insufficient attention to the validity of their groupings. Therefore, the question arises as to how we can identify typical vulnerability patterns in order to enhance our understanding of a system's vulnerability to stresses. A cluster-based pattern recognition applied at global and local levels is scrutinised with a focus on an applicable methodology and practicable insights. Taking the example of drylands, this presentation demonstrates the conditions necessary to identify typical vulnerability patterns. They are summarised in five methodological steps comprising the elicitation of relevant cause-effect hypotheses and the quantitative indication of mechanisms, as well as an evaluation of robustness, a validation and a ranking of the identified patterns. Reflecting scale-dependent opportunities, a global study is able to support decision-making with insights into the up-scaling of interventions when available funds are limited. In contrast, local investigations encourage an outcome-based validation. This constitutes a crucial step in establishing the credibility of the patterns and hence their suitability for informing extension services and individual decisions. 
In this respect, working at
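The clustering core of such a pattern search can be sketched with plain k-means (Lloyd's algorithm) on normalised indicator values; identified clusters play the role of candidate vulnerability patterns. The data points, number of clusters, and starting centroids below are illustrative, not the study's calibrated inputs.

```python
# Minimal k-means sketch for grouping regions into vulnerability patterns
# from two normalised indicators (e.g. sensitivity, exposure).

def kmeans(points, centroids, n_iter=20):
    """Lloyd's algorithm; returns final centroids and cluster labels."""
    for _ in range(n_iter):
        # assign each point to the nearest centroid (squared Euclidean)
        labels = [min(range(len(centroids)),
                      key=lambda c: sum((p - q) ** 2
                                        for p, q in zip(pt, centroids[c])))
                  for pt in points]
        # move each centroid to the mean of its members
        for c in range(len(centroids)):
            members = [pt for pt, l in zip(points, labels) if l == c]
            if members:
                centroids[c] = [sum(col) / len(members) for col in zip(*members)]
    return centroids, labels

# Two obvious groups: low vs. high (sensitivity, exposure) indicator values
points = [(0.1, 0.2), (0.2, 0.1), (0.15, 0.15),
          (0.9, 0.8), (0.8, 0.9), (0.85, 0.85)]
centroids, labels = kmeans(points, [[0.0, 0.0], [1.0, 1.0]])
```

The robustness and validation steps of the five-step approach would then test whether these groupings persist under resampling and against independent outcome data.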

  15. Mapping Upper Mantle Seismic Discontinuities Using Singular Spectrum Analysis

    NASA Astrophysics Data System (ADS)

    Gu, Y. J.; Dokht, R.; Sacchi, M. D.

    2015-12-01

    Seismic discontinuities are fundamental to the understanding of mantle composition and dynamics. Their depth and impedance are generally determined using secondary seismic phases, most commonly SS precursors and P-to-S converted waves. However, the analysis and interpretation using these approaches often suffer from incomplete data coverage, high noise levels and interfering seismic phases, especially near tectonically complex regions such as subduction zones and continental margins. To overcome these pitfalls, we apply Singular Spectrum Analysis (SSA) to remove random noise, reconstruct missing traces and enhance the robustness of SS precursors and P-to-S conversions from seismic discontinuities. Our method takes advantage of the predictability of time series in the frequency-space domain and performs a rank reduction using a singular value decomposition of the trajectory matrix. We apply SSA to synthetic record sections as well as observations of 1) SS precursors beneath the northwestern Pacific subduction zones, and 2) P-to-S converted waves from the Western Canada Sedimentary Basin (WCSB). In comparison with raw or interpolated data, the SSA-enhanced reflectivity maps show a greater resolution and a stronger negative correlation between the depths of the 410 and 660 km discontinuities. These effects can be attributed to the suppression of incoherent noise, which tends to reduce the signal amplitude during normal averaging procedures, through rank reduction and the emphasis of principal singular values. Our new results suggest a more laterally coherent 520 km reflection in the western Pacific regions. Similar improvements in data imaging are achieved in western Canada, where strong lateral variations in discontinuity topography are observed in the craton-Cordillera boundary zone. Improvements from SSA relative to conventional approaches are most notable in under-sampled regions.
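The rank-reduction step at the heart of SSA can be sketched in miniature: embed a trace in a Hankel (trajectory) matrix, keep the dominant singular component, and diagonally average back to a series. This is a rank-1, pure-Python sketch (power iteration in place of a full SVD); the window length and iteration count are illustrative assumptions.

```python
# SSA denoising sketch: Hankel embedding -> dominant singular component
# via power iteration -> rank-1 approximation -> anti-diagonal averaging.

def ssa_rank1(x, L):
    K = len(x) - L + 1
    X = [[x[i + j] for j in range(K)] for i in range(L)]   # trajectory matrix
    v = [1.0] * K                       # power iteration on X^T X
    for _ in range(100):
        u = [sum(X[i][j] * v[j] for j in range(K)) for i in range(L)]
        w = [sum(X[i][j] * u[i] for i in range(L)) for j in range(K)]
        norm = sum(t * t for t in w) ** 0.5
        v = [t / norm for t in w]
    u = [sum(X[i][j] * v[j] for j in range(K)) for i in range(L)]
    X1 = [[u[i] * v[j] for j in range(K)] for i in range(L)]  # rank-1 approx
    # anti-diagonal averaging back to a series of the original length
    out = []
    for s in range(len(x)):
        vals = [X1[i][s - i] for i in range(max(0, s - K + 1), min(L, s + 1))]
        out.append(sum(vals) / len(vals))
    return out

recon = ssa_rank1([2.0] * 20, L=5)   # a rank-1 series is recovered exactly
```

A production SSA keeps several leading components (not just one) and can also fill gaps by iterating reconstruction over missing samples.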

  16. Understanding North Texas Seismicity: A Joint Analysis of Seismic Data and 3D Pore Pressure Modeling

    NASA Astrophysics Data System (ADS)

    DeShon, H. R.; Hornbach, M. J.; Ellsworth, W. L.; Oldham, H. R.; Hayward, C.; Stump, B. W.; Frohlich, C.; Olson, J. E.; Luetgert, J. H.

    2014-12-01

    In November 2013, a series of earthquakes began along a mapped ancient fault system near Azle, Texas. The Azle events are the third felt earthquake sequence in the Fort Worth (Barnett Shale) Basin since 2008, and several production and injection wells in the area are drilled to depths near the recent seismic activity. Understanding if and/or how injection and removal of fluids in the crystalline crust reactivate faults has important implications for seismology, the energy industry, and society. We assessed whether the Azle earthquakes were induced using a joint analysis of the earthquake data, subsurface geology and fault structure, and 3D pore pressure modeling. Using a 12-station temporary seismic deployment, we have recorded and located >300 events large enough to be recorded on multiple stations and thousands of events during periods of swarm activity. High-resolution locations and focal mechanisms indicate that events occurred on NE-SW trending, steeply dipping normal faults associated with the southern end of the Newark East Fault Zone, with hypocenters between 2 and 8 km depth. We considered multiple causes that might have changed stress along this system. Earthquakes resulting from natural processes, though perhaps unlikely in this historically inactive region, can be neither ruled out nor confirmed due to lack of information on the natural stress state of these faults. Analysis of lake and groundwater variations near Azle showed that no significant stress changes occurred prior to or during the earthquake sequence. In contrast, analysis of pore-pressure models shows that the combination of formation water production and wastewater injection near the fault could have caused pressure increases that induced earthquakes on near-critically stressed faults.

  17. Using multi-criteria decision analysis to assess the vulnerability of drinking water utilities.

    PubMed

    Joerin, Florent; Cool, Geneviève; Rodriguez, Manuel J; Gignac, Marc; Bouchard, Christian

    2010-07-01

    Outbreaks of microbiological waterborne disease have increased governmental concern regarding the importance of drinking water safety. Considering the multi-barrier approach to safe drinking water may improve management decisions to reduce contamination risks. However, the application of this approach must consider numerous and diverse kinds of information simultaneously. This makes it difficult for authorities to apply the approach to decision making. For this reason, multi-criteria decision analysis can be helpful in applying the multi-barrier approach to vulnerability assessment. The goal of this study is to propose an approach based on a multi-criteria analysis method in order to rank drinking water utilities (DWUs) based on their vulnerability to microbiological contamination. This approach is illustrated with an application carried out on 28 DWUs supplied by groundwater in the Province of Québec, Canada. The multi-criteria analysis method chosen is the Measuring Attractiveness by a Categorical Based Evaluation Technique (MACBETH) methodology, which allows the assessment of a microbiological vulnerability indicator (MVI) for each DWU. Results are presented on a scale ranking DWUs from less vulnerable to most vulnerable to contamination. MVI results are tested using a sensitivity analysis on barrier weights and they are also compared with historical data on contamination at the utilities. The investigation demonstrates that MVI provides a good representation of the vulnerability of DWUs to microbiological contamination.
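MACBETH itself elicits value scores from qualitative pairwise judgements; as a much simpler stand-in, the ranking-plus-sensitivity idea can be sketched with a weighted sum of barrier scores whose rank order is re-checked under perturbed weights. The utility names, barriers, scores, and weights below are all illustrative assumptions.

```python
# Weighted-sum stand-in for an MCDA vulnerability ranking of utilities,
# with a crude sensitivity check on the barrier weights.

barriers = ["source_protection", "treatment", "monitoring"]
weights = [0.5, 0.3, 0.2]
utilities = {
    "DWU-A": [0.9, 0.8, 0.7],   # barrier scores in [0, 1]; higher = better protected
    "DWU-B": [0.3, 0.5, 0.6],
    "DWU-C": [0.6, 0.4, 0.9],
}

def vulnerability_index(scores, w):
    # stronger barriers -> lower vulnerability
    return 1.0 - sum(s * wi for s, wi in zip(scores, w))

def rank(w):
    # most vulnerable utility first
    return sorted(utilities,
                  key=lambda u: vulnerability_index(utilities[u], w),
                  reverse=True)

base = rank(weights)
perturbed = rank([0.4, 0.4, 0.2])   # sensitivity check: shift weight to treatment
```

A stable ranking under weight perturbation is the kind of evidence the abstract's sensitivity analysis looks for.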

  18. Assessing the Climate Change Vulnerability of Physical Infrastructures through a Spatial Analysis

    NASA Astrophysics Data System (ADS)

    Myeong, S.

    2012-12-01

    Natural hazards can destroy or damage physical infrastructures and thus incur socioeconomic losses and threaten the safety of people. Therefore, identifying the vulnerability of a given society's physical infrastructure to climate change and developing appropriate adaptation measures are necessary. A recent trend in climate change vulnerability assessment has shifted the focus from index-based assessment to spatial analysis of vulnerability, in order to see the distribution of vulnerable areas. Although some research has been conducted on the US and Southwestern Asia, no formal research has been conducted on Korea that assessed the vulnerable areas in terms of spatial distribution. The current study attempts to see what types of vulnerability exist in what areas of the country through an analysis of data gathered from different sectors of Korea. Three domains, i.e., sensitivity, exposure, and adaptive capacity, were investigated, with subordinate component data under each domain, to assess the vulnerability of the country. The results showed that the degree of vulnerability differs between coastal and inland areas. For most subordinate components, coastal areas were more vulnerable than inland areas. Within the inland areas, less urbanized areas were more sensitive to climate change than more urbanized areas, while large metropolitan areas were more exposed to climate change due to the density of physical infrastructures. Some southern areas of the country had greater adaptive capacity economically and institutionally; however, Seoul and its vicinity had greater adaptive capacity related to physical infrastructures. 
The study concludes that since damages from natural disasters such as floods and typhoons are becoming increasingly serious around the world as well as in Korea, it is necessary to develop appropriate measures for physical infrastructure to adapt to climate change, customized to the specific needs of different

  19. ASSESSMENT OF SEISMIC ANALYSIS METHODOLOGIES FOR DEEPLY EMBEDDED NPP STRUCTURES.

    SciTech Connect

    XU, J.; MILLER, C.; COSTANTINO, C.; HOFMAYER, C.; GRAVES, H.

    2005-07-01

    Several of the new generation nuclear power plant designs have structural configurations which are proposed to be deeply embedded. Since current seismic analysis methodologies have been applied to shallowly embedded structures (e.g., ASCE 4 suggests that simple formulations may be used to model embedment effects when the depth of embedment is less than 30% of the foundation radius), the US Nuclear Regulatory Commission is sponsoring a program at Brookhaven National Laboratory with the objective of investigating the extent to which procedures acceptable for shallow embedment depths are adequate for larger embedment depths. This paper presents the results of a study comparing the response spectra obtained from two of the more popular analysis methods for structural configurations varying from shallow embedment to complete embedment. A typical safety-related structure embedded in a soil profile representative of a typical nuclear power plant site was utilized in the study, and the depths of burial (DOB) considered ranged from 25% to 100% of the height of the structure. Included in the paper are: (1) a description of a simplified analysis and a detailed approach for the SSI analyses of a structure with various DOBs, (2) a comparison of the analysis results for the different DOBs between the two methods, and (3) a performance assessment of the analysis methodologies for SSI analyses of deeply embedded structures. The resulting assessment from this study indicates that simplified methods may be capable of capturing the seismic response for much more deeply embedded structures than standard practice would normally allow.

  20. Seismic vulnerability assessment of a steel-girder highway bridge equipped with different SMA wire-based smart elastomeric isolators

    NASA Astrophysics Data System (ADS)

    Hedayati Dezfuli, Farshad; Shahria Alam, M.

    2016-07-01

    Shape memory alloy wire-based rubber bearings (SMA-RBs) possess enhanced energy dissipation capacity and self-centering properties compared to conventional RBs. The performance of different types of SMA-RBs with different wire configurations has been studied in detail. However, their reliability in isolating structures has not been thoroughly investigated. The objective of this study is to analytically explore the effect of SMA-RBs on the seismic fragility of a highway bridge. Steel-reinforced elastomeric isolators are equipped with SMA wires and used to isolate the bridge. Results revealed that SMA wires with superelastic behavior and re-centering capability can increase the reliability of the bearing and the bridge structure. It was observed that at the collapse level of damage, the bridge isolated by the SMA-equipped high-damping rubber bearing (SMA-HDRB) has the lowest fragility. Findings also showed that equipping a natural rubber bearing (NRB) with SMA wires decreases the possibility of damage in the bridge, while replacing an HDRB with an SMA-HDRB, or a lead rubber bearing (LRB) with an SMA-LRB, increases the failure probability of the system at the slight, moderate, and extensive limit states.

  1. Seismic Fragility Analysis of a Degraded Condensate Storage Tank

    SciTech Connect

    Nie, J.; Braverman, J.; Hofmayer, C.; Choun, Y-S.; Kim, M.K.; Choi, I-K.

    2011-05-16

    The Korea Atomic Energy Research Institute (KAERI) and Brookhaven National Laboratory are conducting a collaborative research project to develop seismic capability evaluation technology for degraded structures and components in nuclear power plants (NPPs). One of the goals of this collaborative endeavor is to develop seismic fragility analysis methods that consider the potential effects of age-related degradation of structures, systems, and components (SSCs). The essential part of this collaboration is aimed at achieving a better understanding of the effects of aging on the performance of SSCs and ultimately on the safety of NPPs. A recent search of degradation occurrences of structures and passive components (SPCs) showed that the rate of aging-related degradation in NPPs was not significantly large but is increasing as the plants get older. The slow but increasing rate of degradation of SPCs can potentially affect the safety of the older plants and become an important factor in decision making in the current trend of extending the operating license period of the plants (e.g., in the U.S. from 40 years to 60 years, and even potentially to 80 years). The condition and performance of major aged NPP structures such as the containment contribute to the life span of a plant. A frequent misconception arising from the low degradation rate of SPCs is that such degradation poses no significant risk to plant safety. However, under low-probability, high-consequence initiating events, such as large earthquakes, SPCs that have slowly degraded over many years could potentially affect plant safety, and these effects need to be better understood. As part of the KAERI-BNL collaboration, a condensate storage tank (CST) was analyzed to estimate its seismic fragility capacities under various postulated degradation scenarios. CSTs were shown to have a significant impact on the seismic core damage frequency of a nuclear power plant. 
The seismic fragility capacity of the CST was developed
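Seismic fragility capacities of this kind are commonly expressed as a lognormal curve: the conditional failure probability at ground-motion level a is P(f|a) = Φ(ln(a/A_m)/β), with median capacity A_m and composite logarithmic standard deviation β; a postulated degradation can then be represented by lowering A_m. The numerical values below are illustrative only, not the CST study's results.

```python
# Lognormal fragility sketch: Phi(ln(a / a_m) / beta), with the standard
# normal CDF written via math.erf. Degradation is modeled as a reduced
# median capacity a_m (illustrative values).
import math

def fragility(a, a_m, beta):
    """Conditional failure probability at peak ground acceleration a."""
    return 0.5 * (1.0 + math.erf(math.log(a / a_m) / (beta * math.sqrt(2.0))))

p_intact   = fragility(0.3, a_m=0.9, beta=0.4)   # pristine tank at PGA 0.3 g
p_degraded = fragility(0.3, a_m=0.6, beta=0.4)   # postulated degradation lowers a_m
```

At a = A_m the curve passes through 0.5 by construction, which is a quick sanity check on any implementation.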

  2. Seismic margin review of the Maine Yankee Atomic Power Station: Fragility analysis

    SciTech Connect

    Ravindra, M. K.; Hardy, G. S.; Hashimoto, P. S.; Griffin, M. J.

    1987-03-01

    This Fragility Analysis is the third of three volumes for the Seismic Margin Review of the Maine Yankee Atomic Power Station. Volume 1 is the Summary Report of the first trial seismic margin review. Volume 2, Systems Analysis, documents the results of the systems screening for the review. The three volumes are part of the Seismic Margins Program initiated in 1984 by the Nuclear Regulatory Commission (NRC) to quantify seismic margins at nuclear power plants. The overall objectives of the trial review are to assess the seismic margins of a particular pressurized water reactor, and to test the adequacy of this review approach, quantification techniques, and guidelines for performing the review. Results from the trial review will be used to revise the seismic margin methodology and guidelines so that the NRC and industry can readily apply them to assess the inherent quantitative seismic capacity of nuclear power plants.

  3. Assessing the Performance of a Classification-Based Vulnerability Analysis Model.

    PubMed

    Wang, Tai-ran; Mousseau, Vincent; Pedroni, Nicola; Zio, Enrico

    2015-09-01

    In this article, a classification model based on the majority rule sorting (MR-Sort) method is employed to evaluate the vulnerability of safety-critical systems with respect to malevolent intentional acts. The model is built on the basis of a (limited-size) set of data representing (a priori known) vulnerability classification examples. The empirical construction of the classification model introduces a source of uncertainty into the vulnerability analysis process: a quantitative assessment of the performance of the classification model (in terms of accuracy and confidence in the assignments) is thus in order. Three different approaches are considered to this aim: (i) a model-retrieval-based approach, (ii) the bootstrap method, and (iii) the leave-one-out cross-validation technique. The analyses are presented with reference to an illustrative case study involving the vulnerability assessment of nuclear power plants.
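The MR-Sort assignment rule itself is compact: an alternative is placed in the highest category whose lower profile it meets or beats on a weighted majority (threshold λ) of criteria. The criteria, profiles, weights, and λ below are illustrative, not the article's calibrated parameters.

```python
# Minimal MR-Sort sketch with two category profiles (best category first).

def mr_sort(alt, profiles, weights, lam):
    """Return category index: 0 = best, len(profiles) = worst."""
    for cat, profile in enumerate(profiles):
        # weighted coalition of criteria on which alt meets the profile
        support = sum(w for a, b, w in zip(alt, profile, weights) if a >= b)
        if support >= lam:
            return cat
    return len(profiles)

weights  = [0.4, 0.3, 0.3]
profiles = [[0.8, 0.8, 0.8],    # lower profile of "low vulnerability"
            [0.5, 0.5, 0.5]]    # lower profile of "medium vulnerability"
cat_good = mr_sort([0.9, 0.85, 0.7], profiles, weights, lam=0.7)
cat_bad  = mr_sort([0.3, 0.4, 0.2], profiles, weights, lam=0.7)
```

The bootstrap and leave-one-out assessments discussed in the abstract would then retrain this rule on resampled example sets and measure how often assignments change.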

  4. Livelihood security, vulnerability and resilience: a historical analysis of Chibuene, southern Mozambique.

    PubMed

    Ekblom, Anneli

    2012-07-01

    A sustainable livelihood framework is used to analyse livelihood security, vulnerability and resilience in the village of Chibuene, Vilanculos, southern Mozambique, from a historical and contemporary perspective. Interviews, assessments, archaeology, palaeoecology and written sources are used to address tangible and intangible aspects of livelihood security. The analysis shows that livelihood strategies for building resilience, diversification of resource use, social networks and trade have long historical continuities. Vulnerability is contingent on historical processes such as long-term socio-environmental insecurity and the resultant biodiversity loss. These contingencies affect the social capacity to cope with vulnerability in the present. The study concludes that contingency and the extent and strength of social networks should be added as factors in livelihood assessments. Furthermore, policies for mitigating vulnerability must build on the reality of environmental insecurity, and strengthen local structures that diversify and spread risk.

  5. Exploring drought vulnerability in Africa: an indicator based analysis to inform early warning systems

    NASA Astrophysics Data System (ADS)

    Naumann, G.; Barbosa, P.; Garrote, L.; Iglesias, A.; Vogt, J.

    2013-10-01

    Drought vulnerability is a complex concept that includes both biophysical and socio-economic drivers of drought impact that determine the capacity to cope with drought. In order to develop an efficient drought early warning system and to be prepared to mitigate upcoming drought events, it is important to understand the drought vulnerability of the affected regions. We propose a composite Drought Vulnerability Indicator (DVI) that reflects different aspects of drought vulnerability evaluated at the Pan-African level in four components: the renewable natural capital, the economic capacity, the human and civic resources, and the infrastructure and technology. The selection of variables and weights reflects the assumption that a society with institutional capacity and coordination, as well as with mechanisms for public participation, is less vulnerable to drought; furthermore, we consider that agriculture is only one of the many sectors affected by drought. The quality and accuracy of a composite indicator depend on the theoretical framework, on the data collection and quality, and on how the different components are aggregated. This kind of approach can lead to some degree of scepticism; to overcome this problem, a sensitivity analysis was done in order to measure the degree of uncertainty associated with the construction of the composite indicator. Although the proposed drought vulnerability indicator relies on a number of theoretical assumptions and some degree of subjectivity, the sensitivity analysis showed that it is a robust indicator and hence capable of representing the complex processes that lead to drought vulnerability. According to the DVI computed at country level, the African countries classified with higher relative vulnerability are Somalia, Burundi, Niger, Ethiopia, Mali and Chad. The analysis of the renewable natural capital component at sub-basin level shows that the basins with high to moderate drought vulnerability can be subdivided in three main different
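The aggregation step behind a composite indicator like the DVI can be sketched as min-max normalisation of each component followed by a weighted sum, with capacity-type components inverted so that higher always means more vulnerable. The component values, weights, and the three "countries" below are illustrative assumptions, not the paper's data.

```python
# Composite-indicator sketch: min-max normalise, invert capacities,
# aggregate four components with weights (all values illustrative).

def minmax(values):
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) for v in values]

def composite(components, weights):
    """components: dict name -> list of country values (same country order)."""
    normed = {k: minmax(v) for k, v in components.items()}
    n = len(next(iter(components.values())))
    return [sum(w * normed[k][i] for k, w in weights.items()) for i in range(n)]

components = {  # three hypothetical countries per component
    "natural_capital":   [0.2, 0.6, 0.9],
    "economic_capacity": [0.8, 0.4, 0.1],
    "human_resources":   [0.7, 0.5, 0.2],
    "infrastructure":    [0.9, 0.3, 0.2],
}
# higher capacity means lower vulnerability, so invert those components
for k in ("economic_capacity", "human_resources", "infrastructure"):
    components[k] = [1.0 - v for v in components[k]]
weights = {k: 0.25 for k in components}
dvi = composite(components, weights)
```

The abstract's sensitivity analysis amounts to repeating this aggregation under perturbed weights and normalisation choices and checking that country rankings are stable.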

  6. Detailed Analysis of the Interoccurrence Time Statistics in Seismic Activity

    NASA Astrophysics Data System (ADS)

    Tanaka, Hiroki; Aizawa, Yoji

    2017-02-01

    The interoccurrence time statistics of seismicity is studied theoretically as well as numerically by taking into account the conditional probability and the correlations among many earthquakes in different magnitude levels. It is known so far that the interoccurrence time statistics is well approximated by the Weibull distribution, but more detailed information about the interoccurrence times can be obtained from the analysis of the conditional probability. Firstly, we propose the Embedding Equation Theory (EET), where the conditional probability is described by two kinds of correlation coefficients; one is the magnitude correlation and the other is the inter-event time correlation. Furthermore, the scaling law of each correlation coefficient is clearly determined from the numerical data analysis carried out with the Preliminary Determination of Epicenter (PDE) Catalog and the Japan Meteorological Agency (JMA) Catalog. Secondly, the EET is examined to derive the magnitude dependence of the interoccurrence time statistics, and the multi-fractal relation is successfully formulated. Theoretically we cannot prove the universality of the multi-fractal relation in seismic activity; nevertheless, the theoretical results reproduce well all numerical data in our analysis, where several common features or invariant aspects are clearly observed. Especially in the case of stationary ensembles the multi-fractal relation seems to obey an invariant curve, and in the case of non-stationary (moving time) ensembles for the aftershock regime the multi-fractal relation seems to satisfy a certain invariant curve at any moving time. It is emphasized that the multi-fractal relation plays an important role in unifying the statistical laws of seismicity: actually, the Gutenberg-Richter law and the Weibull distribution are unified in the multi-fractal relation, and some universality conjectures regarding seismicity are briefly discussed.
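The Weibull approximation mentioned above has CDF F(t) = 1 − exp(−(t/η)^k), so ln(−ln(1 − F)) is linear in ln t with slope k; a least-squares line through the empirical CDF therefore estimates the shape parameter from interevent times. The synthetic "catalogue" below is generated by inverse transform at the plotting positions, an illustrative stand-in for real interoccurrence data.

```python
# Weibull shape estimation by linear regression on the probability plot
# (Weibull plotting-position method).
import math

def weibull_shape(times):
    """Estimate Weibull shape k from a sample of interevent times."""
    n = len(times)
    xs, ys = [], []
    for i, t in enumerate(sorted(times)):
        f = (i + 0.5) / n                              # empirical CDF
        xs.append(math.log(t))
        ys.append(math.log(-math.log(1.0 - f)))
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

# Inverse-transform "samples" from a Weibull with k = 1.3, eta = 10
k_true, eta = 1.3, 10.0
times = [eta * (-math.log(1.0 - (i + 0.5) / 200)) ** (1.0 / k_true)
         for i in range(200)]
k_hat = weibull_shape(times)
```

On this noise-free construction the regression recovers k exactly; on catalogue data the fit would scatter around the line, and the EET analysis conditions it further on magnitude.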

  7. Kinematic Seismic Rupture Parameters from a Doppler Analysis

    NASA Astrophysics Data System (ADS)

    Caldeira, Bento; Bezzeghoud, Mourad; Borges, José F.

    2010-05-01

    The radiation emitted from extended seismic sources, mainly when the rupture spreads in preferred directions, presents spectral deviations as a function of the observation location. This aspect, not observed for point sources and known as directivity, is manifested by an increase in the frequency and amplitude of seismic waves when the rupture propagates toward the seismic station and a decrease in the frequency and amplitude when it propagates in the opposite direction. The directivity model that supports the method is a Doppler analysis based on a kinematic source model of rupture and wave propagation through a structural medium with spherical symmetry [1]. A unilateral rupture can be viewed as a sequence of shocks produced along certain paths on the fault. According to this model, the seismic record at any point on the Earth's surface contains a signature of the rupture process that originated the recorded waveform. Calculating the rupture direction and velocity by a general Doppler equation (the goal of this work), using a dataset of common time-delays read from waveforms recorded at different distances around the epicenter, requires the normalization of measures to a standard value of slowness. This normalization involves a non-linear inversion that we solve numerically using an iterative least-squares approach. The performance of this technique was evaluated through a set of synthetic and real applications. We present the application of the method to four real case studies, the following earthquakes: Arequipa, Peru (Mw = 8.4, June 23, 2001); Denali, AK, USA (Mw = 7.8, November 3, 2002); Zemmouri-Boumerdes, Algeria (Mw = 6.8, May 21, 2003); and Sumatra, Indonesia (Mw = 9.3, December 26, 2004). The results obtained from the dataset of the four earthquakes agreed, in general, with the values presented by other authors using different methods and data. 
[1] Caldeira B., Bezzeghoud M, Borges JF, 2009; DIRDOP: a directivity approach to determining

  8. Detection, Measurement, Visualization, and Analysis of Seismic Crustal Deformation

    NASA Technical Reports Server (NTRS)

    Crippen, R.; Blom, R.

    1995-01-01

    Remote sensing plays a key role in the analysis of seismic crustal deformation. Recently, radar interferometry has been used to measure one dimension of the strain fields of earthquakes at a resolution of centimeters. Optical imagery is useful in measuring strain fields in both geographic dimensions down to 1/20 of pixel size, and will soon be capable of higher resolution. Visual observation from space imagery and aerial photographs can also be used to detect fault motion.

  9. Analysis of Vulnerability Around The Colima Volcano, MEXICO

    NASA Astrophysics Data System (ADS)

    Carlos, S. P.

    2001-12-01

    The Colima volcano, located in the western part of the Trans-Mexican Volcanic Belt, in the central portion of the Colima Rift Zone, lies between the Mexican states of Jalisco and Colima. Since January 1998 the volcano has shown renewed activity, characterized by two stages: the first was an effusive phase that began on 20 November 1998 and finished by the middle of January 1999. On 10 February 1999 a great explosion at the summit marked the beginning of an explosive phase; these facts imply that the eruptive process changed from an effusive to an explosive regime. Suárez-Plascencia et al., 2000, present hazard maps for ballistic projectiles, ashfalls and lahars for this scenario. This work presents the evaluation of the vulnerability in the areas identified as hazardous in the maps for ballistics, ashfalls and lahars, based on the economic elements located in the middle and lower sections of the volcanic edifice, such as agriculture, forestry, agroindustries and communication lines (highways, power, telephone, railroad, etc.). The method is based on Geographic Information Systems, using digital cartography at scale 1:50,000, digital orthophotos from the Instituto Nacional de Estadística, Geografía e Informática, and SPOT and Landsat satellite images from 1997 and 2000 in bands 1, 2 and 3. The land use maps obtained for 1997 and 2000 were compared with the land use map reported by Suárez in 1992; from these maps a 5 percent increase of the sugar cane and corn cultivation areas was observed compared with those of 1990 (1225.7 km2), along with a decrease of the forest surface, moving the agricultural limits uphill, and also some agave cultivation on the northwest and north hillslopes of the Nevado de Colima. This increase of the agricultural surface results in greater economic activity in the area, which also increases the vulnerability to the different volcanic products emitted during this phase of activity. The degradation of the soil by the

  10. Spatio-temporal earthquake risk assessment for the Lisbon Metropolitan Area - A contribution to improving standard methods of population exposure and vulnerability analysis

    NASA Astrophysics Data System (ADS)

    Freire, Sérgio; Aubrecht, Christoph

    2010-05-01

    The recent M 7.0 earthquake that caused severe damage and destruction in parts of Haiti struck close to 5 PM (local time), at a moment when many people were not in their residences, instead being in their workplaces, schools, or churches. Community vulnerability assessment to seismic hazard relying solely on the location and density of resident-based census population, as is commonly the case, would grossly misrepresent the real situation. In particular in the context of global (climate) change, risk analysis is a research field increasingly gaining in importance, where risk is usually defined as a function of hazard probability and vulnerability. Assessment and mapping of human vulnerability have, however, generally lagged behind hazard analysis efforts. Central to the concept of vulnerability is the issue of human exposure. Analysis of exposure is often spatially tied to administrative units or reference objects such as buildings, spanning scales from the regional level to local studies for small areas. Due to human activities and mobility, the spatial distribution of population is time-dependent, especially in metropolitan areas. Accurately estimating population exposure is a key component of catastrophe loss modeling and one element of effective risk analysis and emergency management. Therefore, accounting for the spatio-temporal dynamics of human vulnerability correlates with recent recommendations to improve vulnerability analyses. Earthquakes are the prototype of a major disaster, being low-probability, rapid-onset, high-consequence events. Lisbon, Portugal, is subject to a high risk of earthquake, which can strike on any day and at any time, as confirmed by modern history (e.g. December 2009). The recently-approved Special Emergency and Civil Protection Plan (PEERS) is based on a Seismic Intensity map, and only contemplates resident population from the census as a proxy for human exposure. 
In the present work we map and analyze the spatio-temporal distribution of

  11. Interdependent networks: vulnerability analysis and strategies to limit cascading failure

    NASA Astrophysics Data System (ADS)

    Fu, Gaihua; Dawson, Richard; Khoury, Mehdi; Bullock, Seth

    2014-07-01

    Network theory is increasingly employed to study the structure and behaviour of social, physical and technological systems — including civil infrastructure. Many of these systems are interconnected and the interdependencies between them allow disruptive events to propagate across networks, enabling damage to spread far beyond the immediate footprint of disturbance. In this research we experiment with a model to characterise the configuration of interdependencies in terms of direction, redundancy, and extent, and we analyse the performance of interdependent systems with a wide range of possible coupling modes. We demonstrate that networks with directed dependencies are less robust than those with undirected dependencies, and that the degree of redundancy in inter-network dependencies can have a differential effect on robustness depending on the directionality of the dependencies. As interdependencies between many real-world systems exhibit these characteristics, it is likely that many such systems operate near their critical thresholds. The vulnerability of an interdependent network is shown to be reducible in a cost effective way, either by optimising inter-network connections, or by hardening high degree nodes. The results improve understanding of the influence of interdependencies on system performance and provide insight into how to mitigate associated risks.
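The directed-dependency case discussed above can be sketched as a breadth-first cascade: a node's failure propagates to every node that depends on it, possibly looping back across networks. The toy topology (power, telecom, water, SCADA nodes) is an illustrative assumption, not the paper's model.

```python
# Cascading-failure sketch on directed dependency edges between two
# interdependent infrastructure networks (toy topology).
from collections import deque

def cascade(initial_failures, dependents):
    """dependents[n] = nodes that fail if n fails; returns all failed nodes."""
    failed = set(initial_failures)
    queue = deque(initial_failures)
    while queue:
        n = queue.popleft()
        for m in dependents.get(n, ()):
            if m not in failed:
                failed.add(m)
                queue.append(m)
    return failed

dependents = {
    "power1": ["water1", "telecom1"],   # water and telecom depend on power1
    "telecom1": ["scada1"],             # control system depends on telecom
    "scada1": ["power2"],               # second power node depends on control
}
failed = cascade(["power1"], dependents)
```

Hardening a high-degree node such as "power1" (removing its outgoing dependency edges, or adding redundant suppliers) shrinks the reachable failure set, which is the mitigation effect the abstract describes.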

  12. Seismic fragility analysis of buried steel piping at P, L, and K reactors

    SciTech Connect

    Wingo, H.E.

    1989-10-01

    Analysis of the seismic strength of buried cooling water piping in reactor areas is necessary to evaluate the risk of reactor operation, because seismic events could damage these buried pipes and cause loss-of-coolant accidents. This report documents analysis of the ability of this piping to withstand the combined effects of the propagation of seismic waves, the possibility that the piping may not behave in a completely ductile fashion, and the distortions caused by relative displacements of structures connected to the piping.
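
    For a rough sense of the demand that wave passage places on buried piping, the classic Newmark-style upper bounds are often quoted: a pipe that conforms to the surrounding ground experiences at most an axial strain of V/c and a bending curvature of A/c². A quick sketch with illustrative numbers (not values from this report):

    ```python
    def wave_passage_strains(v_max, a_max, c):
        """Newmark-style upper bounds on ground deformation imposed on a
        buried pipe by a passing seismic wave: axial strain v_max/c and
        bending curvature a_max/c**2, assuming the pipe conforms to the
        ground.

        v_max: peak ground velocity, m/s
        a_max: peak ground acceleration, m/s^2
        c:     apparent wave propagation velocity along the pipe, m/s
        """
        return v_max / c, a_max / c ** 2

    # Illustrative values: PGV 0.5 m/s, PGA 3 m/s^2, c = 1000 m/s
    eps_axial, curvature = wave_passage_strains(0.5, 3.0, 1000.0)
    ```

    Here `eps_axial` is 5e-4 (0.05% strain) and `curvature` is 3e-6 1/m; an actual qualification would add soil-pipe interaction and anchor-point displacements.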

  13. Spectrum analysis of seismic surface waves and its applications in seismic landmine detection.

    PubMed

    Alam, Mubashir; McClellan, James H; Scott, Waymond R

    2007-03-01

    In geophysics, spectrum analysis of surface waves (SASW) refers to a noninvasive method for soil characterization. However, the term spectrum analysis can be used in a wider sense to mean a method for determining and identifying various modes of seismic surface waves and their properties such as velocity, polarization, etc. Surface waves travel along the free boundary of a medium and can be easily detected with a transducer placed on the free surface of the boundary. A new method based on vector processing of space-time data obtained from an array of triaxial sensors is proposed to produce high-resolution, multimodal spectra from surface waves. Then individual modes can be identified in the spectrum and reconstructed in the space-time domain; also, reflected waves can be separated easily from forward waves in the spectrum domain. This new SASW method can be used for detecting and locating landmines by analyzing the reflected waves for resonance. Processing examples are presented for numerically generated data, experimental data collected in a laboratory setting, and field data.
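
    The multimodal-spectrum idea can be illustrated, in much simplified scalar form, by a frequency-wavenumber (f-k) transform of array data; the paper's triaxial vector processing is beyond this sketch, and the geometry and velocity below are invented.

    ```python
    import numpy as np

    # Synthetic space-time record: one surface-wave mode travelling at
    # 200 m/s across a line of 64 receivers 1 m apart (values invented).
    nrec, nt, dx, dt, v = 64, 512, 1.0, 0.002, 200.0
    x = np.arange(nrec) * dx
    t = np.arange(nt) * dt
    f0 = 30.0                                        # dominant frequency, Hz
    data = np.sin(2 * np.pi * f0 * (t[None, :] - x[:, None] / v))

    # 2-D FFT takes the space-time data to the frequency-wavenumber domain
    spec = np.abs(np.fft.fftshift(np.fft.fft2(data)))
    k = np.fft.fftshift(np.fft.fftfreq(nrec, d=dx))  # wavenumber, cycles/m
    f = np.fft.fftshift(np.fft.fftfreq(nt, d=dt))    # frequency, Hz

    # A mode appears as energy along f = v*k; reflected (backward) waves
    # would fall in the opposite-sign wavenumber quadrant and separate out.
    i, j = np.unravel_index(np.argmax(spec), spec.shape)
    v_est = abs(f[j] / k[i])                         # phase velocity estimate
    ```

    The estimate recovers the input velocity to within the f-k grid resolution; finer velocity picking would interpolate around the spectral peak.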

  14. Seismic analysis of wind turbines in the time domain

    NASA Astrophysics Data System (ADS)

    Witcher, D.

    2005-01-01

    The analysis of wind turbine loading associated with earthquakes is clearly important when designing for and assessing the feasibility of wind farms in seismically active regions. The approach taken for such analysis is generally based on codified methods which have been developed for the assessment of seismic loads acting on buildings. These methods are not able to deal properly with the aeroelastic interaction of the dynamic motion of the wind turbine structure with either the wind loading acting on the rotor blades or the response of the turbine controller. This article presents an alternative approach, which is to undertake the calculation in the time domain. In this case a full aeroelastic model of the wind turbine subject to turbulent wind loading is further excited by ground motion corresponding to the earthquake. This capability has been introduced to the GH Bladed wind turbine simulation package. The software can be used to compute the combined wind and earthquake loading of a wind turbine given a definition of the external conditions for an appropriate series of load cases. This article discusses the method and presents example results.
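
    As a much-reduced illustration of time-domain seismic analysis (not the aeroelastic model in GH Bladed), the tower's first fore-aft mode can be idealized as a single-degree-of-freedom oscillator excited by base acceleration; the frequency and damping values below are invented.

    ```python
    import numpy as np

    def sdof_response(ag, dt, f_n=0.3, zeta=0.05):
        """Relative displacement of a damped single-degree-of-freedom
        oscillator under base acceleration ag (m/s^2), via simple explicit
        time stepping of u'' + 2*zeta*wn*u' + wn^2*u = -ag."""
        wn = 2.0 * np.pi * f_n
        u = np.zeros(len(ag))
        vel = 0.0
        for i in range(len(ag) - 1):
            acc = -ag[i] - 2.0 * zeta * wn * vel - wn ** 2 * u[i]
            vel += acc * dt
            u[i + 1] = u[i] + vel * dt
        return u

    # Harmonic base motion at the tower frequency as a stand-in for a
    # recorded accelerogram (f_n and zeta are illustrative tower values)
    dt = 0.01
    t = np.arange(0.0, 20.0, dt)
    ag = 0.5 * np.sin(2.0 * np.pi * 0.3 * t)     # resonant forcing
    u = sdof_response(ag, dt)
    ```

    In a real combined analysis the base motion would be superposed on turbulent wind loading and the structural model would be the full aeroelastic one.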

  15. Social vulnerability assessment using spatial multi-criteria analysis (SEVI model) and the Social Vulnerability Index (SoVI model) - a case study for Bucharest, Romania

    NASA Astrophysics Data System (ADS)

    Armaş, I.; Gavriş, A.

    2013-06-01

    In recent decades, the development of vulnerability frameworks has broadened research in the natural hazards field. Despite progress in vulnerability studies, the quantitative approach and the conceptual explanation of the social component still require investigation. At the same time, some disaster-prone areas receive limited attention. Among these, Romania's capital city, Bucharest, is the most earthquake-prone capital in Europe and the tenth most earthquake-prone in the world. The location is used to assess two multi-criteria methods for aggregating complex indicators: the social vulnerability index (SoVI model) and the spatial multi-criteria social vulnerability index (SEVI model). Using data from the 2002 census, we reduce the indicators through a factor-analytical approach to create the indices, and examine, through an exploratory spatial data analysis (ESDA), whether they bear any resemblance to the known vulnerability of Bucharest. This is a critical issue that may provide a better understanding of social vulnerability in the city and appropriate information for authorities and stakeholders to consider in their decision making. The study emphasizes that social vulnerability is an urban process that has intensified in post-communist Bucharest, raising the concern that the population at risk lacks the capacity to cope with disasters. The assessment of the indices indicates a significant and similar clustering pattern of the census administrative units, with an overlap between the areas of high social vulnerability identified by each. The proposed SEVI model exhibits sensitivity to expert adjustments, which is useful for gauging the accuracy of expert opinion.

  16. DARHT Multi-intelligence Seismic and Acoustic Data Analysis

    SciTech Connect

    Stevens, Garrison Nicole; Van Buren, Kendra Lu; Hemez, Francois M.

    2016-07-21

    The purpose of this report is to document the analysis of seismic and acoustic data collected at the Dual-Axis Radiographic Hydrodynamic Test (DARHT) facility at Los Alamos National Laboratory for robust, multi-intelligence decision making. The data utilized herein are obtained from two tri-axial seismic sensors and three acoustic sensors, for a total of nine data channels. The goal of this analysis is to develop a generalized, automated framework that determines internal operations at DARHT using informative features extracted from measurements collected outside the facility. Our framework involves four components: (1) feature extraction, (2) data fusion, (3) classification, and (4) robustness analysis. Two approaches are taken to extracting features from the data. The first, generic feature extraction, computes statistical features from the nine data channels. The second, event detection, identifies specific events relevant to traffic entering and leaving the facility as well as explosive activities at DARHT and nearby explosive testing sites. Event detection uses a two-stage method: first, signatures in the frequency domain identify outliers; second, short-duration events of interest are extracted from these outliers by evaluating the residuals of an autoregressive exogenous time-series model. Features extracted from each data set are then fused for analysis within a multi-intelligence paradigm, in which information from multiple data sets is combined to generate more information than is available from analyzing each independently. The fused feature set is used to train a statistical classifier and predict the state of operations to inform a decision maker. We demonstrate this classification using both the generic statistical features and event detection, and provide a comparison of the two methods. 
Finally, the concept of decision robustness is presented through a preliminary analysis where
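
    The generic-feature branch of such a framework can be sketched as follows; the feature set, the synthetic signals, and the random-forest classifier are illustrative stand-ins for the report's actual pipeline (scikit-learn assumed available).

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)

    def features(window):
        """Generic statistical features per channel: mean, std, peak, RMS."""
        return np.concatenate([
            window.mean(axis=1),
            window.std(axis=1),
            np.abs(window).max(axis=1),
            np.sqrt((window ** 2).mean(axis=1)),
        ])

    # Synthetic stand-in for 9 channels (two tri-axial seismic sensors plus
    # three acoustic sensors): class-1 windows carry a transient burst.
    X, y = [], []
    for label in (0, 1):
        for _ in range(100):
            w = rng.normal(size=(9, 256))
            if label:
                w[:, 100:120] += 5.0 * rng.normal(size=(9, 20))
            X.append(features(w))       # fusing all channels in one vector
            y.append(label)
    X, y = np.array(X), np.array(y)

    clf = RandomForestClassifier(n_estimators=50, random_state=0)
    clf.fit(X[::2], y[::2])             # train on alternate windows
    acc = clf.score(X[1::2], y[1::2])   # evaluate on the held-out ones
    ```

    Concatenating per-channel features into one vector before classification is the simplest form of feature-level fusion; the report's event-detection features would be appended to the same vector.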

  17. A new approach for computing a flood vulnerability index using cluster analysis

    NASA Astrophysics Data System (ADS)

    Fernandez, Paulo; Mourato, Sandra; Moreira, Madalena; Pereira, Luísa

    2016-08-01

    A Flood Vulnerability Index (FloodVI) was developed using Principal Component Analysis (PCA) and a new aggregation method based on Cluster Analysis (CA). PCA reduces a large number of variables to a few uncorrelated factors representing the social, economic, physical and environmental dimensions of vulnerability. CA groups areas with the same vulnerability characteristics into vulnerability classes, so that the grouping of the areas determines their classification; in other aggregation methods, by contrast, the areas' classification determines their grouping. Whereas other aggregation methods distribute the areas into classes artificially, imposing a certain probability for an area to belong to a certain class under the assumption that the aggregation measure is normally distributed, CA does not constrain the distribution of the areas across the classes. FloodVI was designed at the neighbourhood level and was applied to the Portuguese municipality of Vila Nova de Gaia, where several flood events have taken place in the recent past. The sensitivity of FloodVI was assessed against three other aggregation methods: the sum of component scores, the first component score, and the weighted sum of component scores. The results highlight the sensitivity of FloodVI to the aggregation method. The sum of component scores and the weighted sum of component scores give similar results; the first-component-score method classifies almost all areas as having medium vulnerability; and the results obtained with CA show a distinct differentiation of vulnerability in which hot spots can be clearly identified. Records of previous flood events corroborate the CA results: the inundated areas with the greatest damage are those identified by CA as areas of high and very high vulnerability. This supports the conclusion that CA provides a reliable FloodVI.
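
    A minimal sketch of the PCA-plus-CA aggregation on synthetic data (the indicators, area counts and cluster number are invented; scikit-learn is assumed available):

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(42)

    # Synthetic neighbourhood indicators (rows = areas, columns = social,
    # economic, physical and environmental variables); all values invented.
    n_areas, n_vars = 200, 10
    data = rng.normal(size=(n_areas, n_vars))
    data[:50] += 2.0        # a block of genuinely more vulnerable areas

    # 1) PCA: many correlated variables -> a few uncorrelated factors
    scores = PCA(n_components=3).fit_transform(
        StandardScaler().fit_transform(data))

    # 2) CA: areas are grouped by similarity of their factor scores; the
    #    clusters themselves, not preset cut-offs, define the classes
    classes = KMeans(n_clusters=4, n_init=10,
                     random_state=0).fit_predict(scores)

    # Order the clusters by mean first-factor score so that class labels
    # can be read as increasing vulnerability
    order = np.argsort([scores[classes == c, 0].mean() for c in range(4)])
    ```

    The key design point is visible in step 2: class boundaries emerge from the data's own structure rather than from an assumed distribution of an aggregation score.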

  18. Vulnerability of Thai rice production to simultaneous climate and socioeconomic changes: a double exposure analysis

    NASA Astrophysics Data System (ADS)

    Sangpenchan, R.

    2011-12-01

    This research explores the vulnerability of Thai rice production to simultaneous exposure to climate and socioeconomic change, so-called "double exposure." Both processes influence Thailand's rice production system, but the vulnerabilities associated with their interactions are unknown. To understand this double exposure, I adopt a mixed-method, qualitative-quantitative analytical approach consisting of three phases of analysis, involving a Vulnerability Scoping Diagram, a Principal Component Analysis, and the EPIC crop model, using proxy datasets collected from secondary sources at the provincial scale. The first and second phases identify key variables representing each of the three dimensions of vulnerability (exposure, sensitivity, and adaptive capacity), indicating that the greatest vulnerability in the rice production system occurs in households and areas with high exposure to climate change, high sensitivity to climate and socioeconomic stress, and low adaptive capacity. In the third phase, the EPIC crop model simulates rice yields under future climate change as projected by the CSIRO and MIROC climate models. Climate-change-only scenarios project yields to decrease by 10% from current productivity during 2016-2025 and by 30% during 2045-2054. Scenarios applying both climate change and improved technology and management practices show that a 50% increase in rice production is possible, but this requires strong collaboration between sectors to advance agricultural research and technology, and strong adaptive capacity in the rice production system, characterized by well-developed social capital, social networks, financial capacity, infrastructure, and household mobility at the local scale. The vulnerability assessment and the climate and crop adaptation simulations used here provide useful information to decision makers developing vulnerability reduction plans in the face of concurrent climate and socioeconomic change.

  19. Uncertainty analysis for seismic hazard in Northern and Central Italy

    USGS Publications Warehouse

    Lombardi, A.M.; Akinci, A.; Malagnini, L.; Mueller, C.S.

    2005-01-01

    In this study we examine the uncertainty and parametric sensitivity of Peak Ground Acceleration (PGA) and 1-Hz Spectral Acceleration (1-Hz SA) in probabilistic seismic hazard maps (10% probability of exceedance in 50 years) of Northern and Central Italy. The uncertainty in hazard is estimated using a Monte Carlo approach to randomly sample a logic tree with three branch points representing alternative values for the b-value, the maximum magnitude (Mmax) and the attenuation relationships. Uncertainty is expressed in terms of the 95% confidence band and the Coefficient Of Variation (COV). The overall variability of ground motions and their sensitivity to each parameter of the logic tree are investigated. The largest values of the overall 95% confidence band are around 0.15 g for PGA in the Friuli and Northern Apennines regions and around 0.35 g for 1-Hz SA in the Central Apennines. The sensitivity analysis shows that the largest contributor to seismic hazard variability is uncertainty in the choice of ground-motion attenuation relationships, especially in the Friuli region (≈0.10 g) for PGA and in the Friuli and Central Apennines regions (≈0.15 g) for 1-Hz SA. This is followed by the variability of the b-value: its main contribution is evident in the Friuli and Central Apennines regions for both 1-Hz SA (≈0.15 g) and PGA (≈0.10 g). We observe that the contribution of Mmax to seismic hazard variability is negligible, at least for the 10%-in-50-years hazard. The overall COV map for PGA shows that the uncertainty in the hazard is larger in the Friuli and Northern Apennines regions, around 20-30%, than in the Central Apennines and Northwestern Italy, around 10-20%. The overall uncertainty is larger for the 1-Hz SA map and reaches 50-60% in the Central Apennines and Western Alps.
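
    The Monte Carlo logic-tree sampling can be sketched with toy branches; the branch values and the "GMPEs" below are invented for illustration and bear no relation to the study's inputs.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Toy GMPEs: ln PGA (g) as a function of magnitude M at a fixed
    # distance; coefficients invented, not real attenuation relationships.
    gmpes = [lambda M: -4.0 + 0.9 * M,
             lambda M: -4.5 + 1.0 * M,
             lambda M: -3.6 + 0.8 * M]

    def sample_hazard(n=5000):
        """Randomly sample the three logic-tree branch points and return
        the distribution of a (highly simplified) hazard measure."""
        pga = np.empty(n)
        for i in range(n):
            b = rng.choice([0.9, 1.0, 1.1])          # b-value branch
            mmax = rng.choice([6.5, 7.0, 7.5])       # Mmax branch
            gmpe = gmpes[rng.integers(len(gmpes))]   # attenuation branch
            # magnitude of the event drawn from a truncated G-R law (Mmin=5)
            u = rng.random()
            M = 5.0 - np.log10(1 - u * (1 - 10 ** (-b * (mmax - 5.0)))) / b
            pga[i] = np.exp(gmpe(M))
        return pga

    pga = sample_hazard()
    cov = pga.std() / pga.mean()              # coefficient of variation
    lo, hi = np.percentile(pga, [2.5, 97.5])  # 95% confidence band
    ```

    Fixing one branch while sampling the others gives the per-parameter sensitivity the abstract describes.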

  20. MSNoise: A framework for Continuous Seismic Noise Analysis

    NASA Astrophysics Data System (ADS)

    Lecocq, Thomas; Caudron, Corentin; De Plaen, Raphaël; Mordret, Aurélien

    2016-04-01

    MSNoise is an open and free Python package, to our knowledge the only complete integrated workflow designed to analyse ambient seismic noise and study relative velocity changes (dv/v) in the crust. It is based on state-of-the-art, well-maintained Python modules, among which ObsPy plays an important role. It is officially used for continuous monitoring in at least three notable places: the Observatory of the Piton de la Fournaise volcano (OVPF, France), the Auckland Volcanic Field (New Zealand) and on the South Napa earthquake (Berkeley, USA). It is also used by many researchers to process archive data, focusing e.g. on fault zones, intraplate Europe, geothermal exploitation or Antarctica. We first present the general working of MSNoise, originally written in 2010 to automatically scan data archives and process seismic data in order to produce dv/v time series. We demonstrate that its modularity makes it easy to test new algorithms for each processing step: for example, one could experiment with new methods of cross-correlation (done by default in the frequency domain), stacking (the default is linear stacking, i.e. averaging), or dv/v estimation (the default is the moving-window cross-spectrum "MWCS", or "doublet", method). We present the latest major evolution of MSNoise from a single "data archive to dv/v" workflow to a framework that allows plugins and modules to be developed and integrated into the MSNoise ecosystem. Small-scale plugins will be shown as examples, such as "continuous PPSD" (à la McNamara & Buland) or "Seismic Amplitude Ratio Analysis" (Taisne, Caudron). We will also present the new MSNoise-TOMO package, which uses MSNoise as a cross-correlation toolbox and demystifies surface-wave tomography. Finally, the poster will be a meeting point for all those using, or willing to use, MSNoise, to meet the developers, exchange ideas and share wishes.
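
    The default processing chain (frequency-domain cross-correlation followed by linear stacking) can be sketched without MSNoise itself; this is a plain NumPy illustration on synthetic traces, not the MSNoise API.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def xcorr_freq(a, b):
        """Cross-correlation computed in the frequency domain (the default
        in this style of processing), with crude spectral whitening."""
        n = len(a) + len(b) - 1
        A, B = np.fft.rfft(a, n), np.fft.rfft(b, n)
        spec = A * np.conj(B)
        spec /= np.abs(spec) + 1e-10          # whitening
        return np.fft.fftshift(np.fft.irfft(spec, n))

    # Two "stations" see the same noise wavefield with a 20-sample offset
    nwin, nsamp, lag_true = 20, 1024, 20
    ccfs = []
    for _ in range(nwin):
        src = rng.normal(size=nsamp + lag_true)
        sta1, sta2 = src[:nsamp], src[lag_true:]
        sta2 = sta2 + 0.5 * rng.normal(size=nsamp)   # incoherent noise
        ccfs.append(xcorr_freq(sta1, sta2))

    stack = np.mean(ccfs, axis=0)             # linear stacking, the default
    lag = int(np.argmax(stack)) - len(stack) // 2
    ```

    The stacked correlation peaks at the inter-station offset; dv/v estimation (e.g. MWCS) then measures small time shifts between such stacks from different periods.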

  1. Probabilistic Seismic Hazard Analysis for Southern California Coastal Facilities

    SciTech Connect

    Savy, J; Foxall, B

    2004-04-16

    The overall objective of this study was to develop probabilistic seismic hazard estimates for the coastal and offshore areas of Ventura, Los Angeles and Orange counties, for use by the University of Southern California (USC) as a basis for developing physical models of tsunami for the coastal regions, and by the California State Lands Commission (SLC) for developing regulatory standards for the seismic loading and liquefaction evaluation of marine oil terminals. The probabilistic seismic hazard analysis (PSHA) was carried out by Lawrence Livermore National Laboratory (LLNL) in several phases over a period of two years, following the method developed by LLNL for the estimation of seismic hazards at Department of Energy (DOE) facilities and for 69 nuclear power plant sites in the Eastern United States, for the Nuclear Regulatory Commission (NRC). This method consists of making maximum use of all physical data (qualitative and quantitative) and of characterizing the uncertainties by using a set of alternative spatio-temporal models of the occurrence of future earthquakes, as described in the SSHAC PSHA Guidance Document (Budnitz et al., 1997) and implemented for the NRC (Savy et al., 2002). In general, the estimation of seismic hazard is based not only on our understanding of the regional tectonics and a detailed characterization of the faults in the area, but also on the analysis methods employed and the types of physical and empirical models deemed appropriate for the analysis. To develop this understanding, the body of knowledge in the scientific community was sampled in a series of workshops with a group of experts representative of the entire scientific community, including geologists and seismologists from the United States Geological Survey (USGS), members of the Southern California Earthquake Center (SCEC), members of academic institutions (University of California Santa Cruz, Stanford, UC Santa Barbara, and the University of Southern California), and members of

  2. Exploring drought vulnerability in Africa: an indicator based analysis to be used in early warning systems

    NASA Astrophysics Data System (ADS)

    Naumann, G.; Barbosa, P.; Garrote, L.; Iglesias, A.; Vogt, J.

    2014-05-01

    We propose a composite drought vulnerability indicator (DVI) that reflects different aspects of drought vulnerability, evaluated at the Pan-African level for four components: renewable natural capital, economic capacity, human and civic resources, and infrastructure and technology. The selection of variables and weights reflects the assumption that a society with institutional capacity and coordination, as well as mechanisms for public participation, is less vulnerable to drought; furthermore, we consider that agriculture is only one of the many sectors affected by drought. The quality and accuracy of a composite indicator depend on the theoretical framework, on data collection and quality, and on how the different components are aggregated. Such an approach can invite some degree of scepticism; to address this, a sensitivity analysis was carried out to measure the degree of uncertainty associated with the construction of the composite indicator. Although the proposed drought vulnerability indicator relies on a number of theoretical assumptions and some degree of subjectivity, the sensitivity analysis showed that it is robust and hence capable of representing the complex processes that lead to drought vulnerability. According to the DVI computed at the country level, the African countries with the highest relative vulnerability are Somalia, Burundi, Niger, Ethiopia, Mali and Chad. The analysis of the renewable natural capital component at the sub-basin level shows that the basins with high to moderate drought vulnerability fall into the following geographical regions: the Mediterranean coast of Africa; the Sahel region and the Horn of Africa; the Serengeti and the Eastern Miombo woodlands in eastern Africa; the western part of the Zambezi Basin; the southeastern border of the Congo Basin; and the belt of Fynbos in the Western Cape province of South Africa. The results of the DVI at the country level were
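
    The aggregation-plus-sensitivity idea can be sketched in a few lines; the component scores, weights and countries below are entirely invented.

    ```python
    import numpy as np

    # Invented component scores in [0, 1] for three hypothetical countries:
    # renewable natural capital, economic capacity, human/civic resources,
    # infrastructure and technology (higher score = more vulnerable here).
    components = {
        "A": np.array([0.8, 0.7, 0.6, 0.5]),
        "B": np.array([0.3, 0.2, 0.4, 0.3]),
        "C": np.array([0.6, 0.9, 0.5, 0.7]),
    }
    weights = np.array([0.25, 0.25, 0.25, 0.25])   # assumed equal weighting

    def dvi(scores, w):
        """Weighted aggregation of the four component scores."""
        return float(np.dot(scores, w) / w.sum())

    ranking = sorted(components, key=lambda c: dvi(components[c], weights),
                     reverse=True)                  # most vulnerable first

    # Sensitivity analysis: perturb the weights and count how often the
    # country ranking survives, as a crude robustness measure.
    rng = np.random.default_rng(3)
    trials = 1000
    stable = sum(
        sorted(components, key=lambda c: dvi(components[c], w),
               reverse=True) == ranking
        for w in (weights * rng.uniform(0.5, 1.5, size=4)
                  for _ in range(trials)))
    share_stable = stable / trials
    ```

    A ranking that survives most weight perturbations supports the robustness claim; countries whose ranks flip easily are the ones whose classification is driven by the weighting choices.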

  3. Latest development in seismic texture analysis for subsurface structure, facies, and reservoir characterization: A review

    SciTech Connect

    Gao, Dengliang

    2011-03-01

    In exploration geology and geophysics, seismic texture is still a developing concept that is not yet widely known, although a number of different algorithms have been published in the literature. This paper reviews seismic texture concepts and methodologies, focusing on the latest developments in seismic amplitude texture analysis, with particular reference to the gray-level co-occurrence matrix (GLCM) and texture model regression (TMR) methods. The GLCM method evaluates spatial arrangements of amplitude samples within an analysis window using a matrix (a two-dimensional histogram) of amplitude co-occurrence. The matrix is then transformed into a suite of texture attributes, such as homogeneity, contrast, and randomness, which provide the basis for seismic facies classification. The TMR method uses a texture model as a reference to discriminate among seismic features, based on a linear least-squares regression between the model and the data within an analysis window. By implementing customized texture model schemes, the TMR algorithm has the flexibility to characterize subsurface geology for different purposes: a texture model with a constant phase is effective at enhancing the visibility of seismic structural fabrics, a texture model with a variable phase is helpful for visualizing seismic facies, and a texture model with variable amplitude, frequency, and size is instrumental in calibrating seismic data to reservoir properties. Preliminary case studies indicate that these latest developments in seismic texture analysis add to existing amplitude interpretation theories and methodologies. These and future developments in seismic texture theory and methodology will hopefully lead to a better understanding of the geologic implications of the seismic texture concept and to an improved geologic interpretation of reflection seismic amplitude
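
    A hand-rolled GLCM for one pixel offset shows how the attribute suite is derived (a sketch only; production work would use an optimized implementation, and "randomness" is computed here as entropy).

    ```python
    import numpy as np

    def glcm(image, levels=4, offset=(0, 1)):
        """Gray-level co-occurrence matrix for one pixel offset (a 2-D
        histogram of amplitude pairs), normalized to probabilities."""
        P = np.zeros((levels, levels))
        dr, dc = offset
        rows, cols = image.shape
        for r in range(rows - dr):
            for c in range(cols - dc):
                P[image[r, c], image[r + dr, c + dc]] += 1
        return P / P.sum()

    def texture_attributes(P):
        """Homogeneity, contrast and randomness (entropy) from a GLCM."""
        i, j = np.indices(P.shape)
        homogeneity = float((P / (1.0 + np.abs(i - j))).sum())
        contrast = float((P * (i - j) ** 2).sum())
        nz = P[P > 0]
        entropy = float(-(nz * np.log2(nz)).sum())
        return homogeneity, contrast, entropy

    # A smooth patch scores higher homogeneity and lower contrast than a
    # noisy one, which is the basis for facies classification.
    smooth = np.tile([[0, 0, 1, 1]], (4, 1))
    noisy = np.array([[0, 3, 0, 3], [3, 0, 3, 0],
                      [0, 3, 0, 3], [3, 0, 3, 0]])
    h_s, c_s, _ = texture_attributes(glcm(smooth))
    h_n, c_n, _ = texture_attributes(glcm(noisy))
    ```

    In practice the seismic amplitudes are first quantized to a small number of gray levels, and the attributes are computed in a sliding analysis window over the volume.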

  4. Requalification analysis of a circular composite slab for seismic load

    SciTech Connect

    Srinivasan, M.G.; Kot, C.A.

    1992-11-01

    The circular roof slab of an existing facility was analyzed to requalify the structure to support a significant seismic load for which it was not originally designed. The slab has a clear span of 66 ft and consists of a 48-in.-thick reinforced concrete member and a steel liner plate. Besides a number of smaller penetrations, the slab contains two significant cutouts: a 9-ft-square opening and a 3-ft-diameter hole. The issues that complicated the analysis of this non-typical structure, i.e., composite action and the nonlinear stiffness of reinforced concrete (R.C.) sections, are discussed. It was possible to circumvent the difficulties by making conservative, simplifying assumptions. If codes incorporated guidelines on practical methods for the dynamic analysis of R.C. structures, some of this unneeded conservatism could be eliminated in future designs.

  5. Probabilistic Seismic Hazard Analysis: Adaptation for CO2 Sequestration Sites

    NASA Astrophysics Data System (ADS)

    Vasudevan, K.; Eaton, D. W.

    2011-12-01

    Large-scale sequestration of CO2 in depleted oil and gas fields in sedimentary basins such as the Western Canada Sedimentary Basin (WCSB) and in particular, central Alberta, should consider, among other safety and risk issues, a seismic hazard analysis that would include potential ground motions induced by earthquakes. The region is juxtaposed to major tectonically active seismogenic zones such as the Cascadia Subduction Zone, the Queen Charlotte Fault Zone, and the northern Cordillera region. Hazards associated with large-scale storage from strong ground motions caused by large-magnitude earthquakes along the west coast of Canada, and/or medium-to-large magnitude earthquakes triggered by such earthquakes in the neighbourhood of the storage site, must be clearly understood. To this end, stochastic modeling of the accelerograms recorded during large magnitude earthquakes in western Canada has been undertaken. A lack of recorded accelerograms and the absence of a catalogue of ground-motion prediction equations similar to the Next Generation Attenuation (NGA) database, however, hamper such analysis for the WCSB. In order to generate our own database of ground-motions for probabilistic seismic hazard analysis, we employ a site-based stochastic simulation approach. We use it to simulate three-component ground-motion accelerograms recorded during the November 3, 2002 Denali earthquake to mimic the Queen Charlotte Fault earthquakes. To represent a Cascadia megathrust earthquake, we consider three-component strong-motion accelerograms recorded during the March 11, 2011 Tohoku earthquake in Japan. Finally, to simulate an event comparable to the thrust-style Kinbasket Lake earthquake of 1908, we use three-component ground-motion accelerograms recorded during the 1985 Nahanni earthquake and the 2004 Chuetsu earthquake. 
Here, we develop predictive equations for the stochastic model parameters that describe ground motions in terms of earthquake and site characteristics such as

  6. Multi-hole seismic modeling in 3-D space and cross-hole seismic tomography analysis for boulder detection

    NASA Astrophysics Data System (ADS)

    Cheng, Fei; Liu, Jiangping; Wang, Jing; Zong, Yuquan; Yu, Mingyu

    2016-11-01

    A boulder stone, a common geological feature in south China, is the remnant of a granite body that has been unevenly weathered. Undetected boulders can adversely impact the schedule and safety of subway construction when the tunnel boring machine (TBM) method is used. Boulder detection has therefore always been a key issue to be resolved before construction. Cross-hole seismic tomography is a high-resolution technique capable of boulder detection; however, the method solves only for velocity in a 2-D slice between two wells, and the size and central position of the boulder are generally difficult to obtain accurately. In this paper, the authors conduct a multi-hole wave-field simulation and characteristic analysis of a boulder model based on 3-D elastic-wave staggered-grid finite-difference theory, as well as a 2-D imaging analysis based on first-arrival travel times. The results indicate that (1) full wave-field records can be obtained from multi-hole seismic wave simulations; the simulations describe the seismic wave propagation pattern around cross-hole high-velocity spherical geological bodies in detail and serve as a basis for wave-field analysis; and (2) when a cross-hole seismic section cuts through the boulder, the proposed method provides satisfactory tomography results, but when the section lies close to the boulder, the high-velocity object in 3-D space affects the surrounding wave field: the received diffracted wave interferes with the primary wave, so the picked first-arrival travel time is not derived from the profile, producing a false appearance of high-velocity geological features. Finally, the results of the 2-D analysis in the 3-D modeling space are compared with a physical model test with respect to the effect of the high-velocity body on seismic tomographic measurements.
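
    The first-arrival forward problem behind such tomography can be approximated with straight rays through a gridded velocity model; the geometry and velocities below are invented, and real tomography would use bent rays or an eikonal solver.

    ```python
    import numpy as np

    # 2-D velocity model between two boreholes 10 m apart and 20 m deep:
    # 1500 m/s background with a 2 m-radius 4000 m/s "boulder" at (5, 10).
    nz, nxg, h = 200, 100, 0.1                    # 0.1 m grid cells
    v = np.full((nz, nxg), 1500.0)
    zz, xx = np.meshgrid(np.arange(nz) * h, np.arange(nxg) * h,
                         indexing="ij")
    v[(xx - 5.0) ** 2 + (zz - 10.0) ** 2 < 2.0 ** 2] = 4000.0

    def straight_ray_time(src_z, rec_z, nsamp=500):
        """Travel time along the straight ray from a source at depth src_z
        in the left hole (x = 0) to a receiver at depth rec_z in the right
        hole (x = 10), integrating slowness along the path."""
        xs = np.linspace(0.0, 10.0 - 1e-6, nsamp)
        zs = np.linspace(src_z, rec_z, nsamp)
        iz = np.minimum((zs / h).astype(int), nz - 1)
        ix = np.minimum((xs / h).astype(int), nxg - 1)
        seg = np.hypot(10.0, rec_z - src_z) / nsamp
        return float((seg / v[iz, ix]).sum())

    t_through = straight_ray_time(10.0, 10.0)  # ray cuts through the boulder
    t_miss = straight_ray_time(2.0, 2.0)       # ray passes above the boulder
    ```

    Rays that cut the boulder arrive early relative to the background; inverting many such source-receiver times for the slowness grid is the tomography step, and the diffraction effects discussed above are exactly what this straight-ray picture misses.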

  7. Why is Probabilistic Seismic Hazard Analysis (PSHA) still used?

    NASA Astrophysics Data System (ADS)

    Mulargia, Francesco; Stark, Philip B.; Geller, Robert J.

    2017-03-01

    Even though it has never been validated by objective testing, Probabilistic Seismic Hazard Analysis (PSHA) has been widely used for almost 50 years by governments and industry in applications with lives and property hanging in the balance, such as deciding safety criteria for nuclear power plants, making official national hazard maps, developing building code requirements, and determining earthquake insurance rates. PSHA rests on assumptions now known to conflict with earthquake physics; many damaging earthquakes, including the 1988 Spitak, Armenia, event and the 2011 Tohoku, Japan, event, have occurred in regions rated relatively low-risk by PSHA hazard maps. No extant method, including PSHA, produces reliable estimates of seismic hazard. Earthquake hazard mitigation should be recognized as inherently political, involving a tradeoff between uncertain costs and uncertain risks. Earthquake scientists, engineers, and risk managers can make important contributions to the hard problem of allocating limited resources wisely, but government officials and stakeholders must take responsibility for the risks of accidents due to natural events that exceed the adopted safety criteria. "Without an analysis of the physical causes of recorded floods, and of the whole geophysical, biophysical and anthropogenic context which circumscribes the potential for flood formation, results of flood frequency analysis as [now practiced], rather than providing information useful for coping with the flood hazard, themselves represent an additional hazard that can contribute to damages caused by floods. This danger is very real since decisions made on the basis of wrong numbers presented as good estimates of flood probabilities will generally be worse than decisions made with an awareness of an impossibility to make a good estimate and with the aid of merely qualitative information on the general flooding potential."

  8. Earthquake prediction in Japan and natural time analysis of seismicity

    NASA Astrophysics Data System (ADS)

    Uyeda, S.; Varotsos, P.

    2011-12-01

    When Seismic Electric Signals (SES) data are available, as in Greece, natural time analysis of the seismicity after the initiation of the SES allows determination of the time window of the impending mainshock through the evolution of the value of κ1 itself. This was found to work also for the 1989 M7.1 Loma Prieta earthquake. If SES data are not available, we rely solely on the evolution of the fluctuations of κ1, obtained by computing κ1 values in a natural time window of a certain length sliding through the earthquake catalog. The fluctuations of the order parameter, expressed as the variability (i.e., the standard deviation divided by the average), were found to increase dramatically when approaching the 11 March M9 super-giant earthquake. Such an increase was also found before the M7.1 Kobe earthquake in 1995, the M8.0 Tokachi-oki earthquake in 2003, and the Landers and Hector Mine earthquakes in Southern California. It is worth noting that this increase is obtained straightforwardly from ordinary earthquake catalogs without any adjustable parameters.
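
    The quantities involved are simple to compute: κ1 is the variance of "natural time" χk = k/N weighted by the normalized event energies, and the variability β is the ratio of standard deviation to mean of κ1 in a sliding window. A sketch on a synthetic catalog (the magnitude model is invented):

    ```python
    import numpy as np

    def kappa1(energies):
        """kappa_1 = <chi^2> - <chi>^2 with chi_k = k/N and weights p_k
        given by the normalized event energies Q_k."""
        Q = np.asarray(energies, dtype=float)
        N = len(Q)
        chi = np.arange(1, N + 1) / N
        p = Q / Q.sum()
        return float((p * chi ** 2).sum() - (p * chi).sum() ** 2)

    def variability(energies, window=50):
        """beta = std/mean of kappa_1 over a window sliding through the
        catalog; a rising beta is the reported precursory signature."""
        k1 = np.array([kappa1(energies[i:i + window])
                       for i in range(len(energies) - window + 1)])
        return float(k1.std() / k1.mean())

    # Synthetic Gutenberg-Richter-like catalog (b ~ 1), with energies
    # from log10 E proportional to 1.5 M; purely illustrative numbers.
    rng = np.random.default_rng(0)
    mags = 4.0 + rng.exponential(scale=1.0 / 2.3, size=500)
    beta = variability(10.0 ** (1.5 * mags))
    ```

    For equal event energies κ1 reduces to the variance of a uniform grid on (0, 1], about 1/12; departures from that value, and the growth of β, are what the method tracks.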

  9. Analysis of Treasure Island earthquake data using seismic interferometry

    NASA Astrophysics Data System (ADS)

    Mehta, K.; Snieder, R.; Graizer, V.

    2005-12-01

    Seismic interferometry is a powerful tool for extracting the response of ground motion. We show the use of seismic interferometry for the analysis of an earthquake recorded by the Treasure Island Geotechnical Array near San Francisco, California, on 06/26/94: a magnitude 4.0 earthquake at a depth of 6.6 km and a distance of 12.6 km from the borehole sensors. There were six 3-component sensors located at different depths. This problem is similar to the analysis by Snieder and Safak of the Robert A. Millikan Library in Pasadena, California, where they deconvolved the wavefield recorded at each of the library floors with that at the top floor to reveal the upgoing and downgoing waves and, from these, estimated a shear velocity and a quality factor. They also showed that, for such applications of seismic interferometry, deconvolution of waveforms is superior to correlation. For the Treasure Island data, deconvolving the vertical component of the wavefield at each sensor with that at the surface gives a similar superposition of an upgoing and a downgoing wave, whose velocity agrees well with the compressional-wave velocity. We compute the radial and transverse components. When we window the shear-wave arrivals in the transverse component at each depth and deconvolve with the one at the surface, the resulting up- and downgoing waves travel with the shear-wave velocity. Similar windowing and deconvolution of the radial component also agree with the shear-wave velocity. However, when the radial component is windowed around the compressional waves and deconvolved, the up- and downgoing waves travel with the shear-wave velocity; in the absence of any P-to-S conversion, the deconvolved waves should travel with the compressional-wave velocity. This suggests that there is a conversion at a depth below the deepest sensor. Receiver functions, defined as the spectral ratio of the radial component to the vertical component, can be used to characterize
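
    The deconvolution step can be sketched as a water-level-stabilised spectral division; the synthetic "borehole pair" below is just a delayed copy of a noise trace, standing in for records at two depths.

    ```python
    import numpy as np

    def deconvolve(u, ref, water=0.01):
        """Deconvolve record u by reference record ref via spectral
        division, stabilised with a water level on the denominator."""
        U, R = np.fft.rfft(u), np.fft.rfft(ref)
        denom = np.maximum(np.abs(R) ** 2, water * np.max(np.abs(R) ** 2))
        return np.fft.irfft(U * np.conj(R) / denom, len(u))

    # Synthetic pair: the deep record is the surface record delayed by
    # 25 samples (a stand-in for one-way travel time through the column).
    rng = np.random.default_rng(1)
    surface = rng.normal(size=1024)
    deep = np.roll(surface, 25)

    d = deconvolve(deep, surface)
    delay = int(np.argmax(d))        # peak lag = inter-sensor travel time
    ```

    With real data the deconvolved traces show both an upgoing and a downgoing pulse, and the move-out of those pulses with sensor depth yields the interval velocity, as in the analysis above.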

  10. A unified methodology for seismic waveform analysis and inversion

    NASA Astrophysics Data System (ADS)

    Chen, Po

    A central problem of seismology is the inversion of regional waveform data for models of earthquake sources and earth structure. In regions such as Southern California, preliminary 3D earth models are already available, and efficient numerical methods have been developed for solving the point-source forward problem. We describe a unified inversion procedure that utilizes these capabilities to improve 3D earth models and derive centroid moment tensor (CMT) or finite moment tensor (FMT) representations of earthquake ruptures. Our data are time- and frequency-localized measurements of the phase and amplitude anomalies relative to synthetic seismograms computed from reference seismic source and structure models. Our analysis of these phase and amplitude measurements shows that the preliminary 3D models provide a substantially better fit to observed data than either the laterally homogeneous or the path-averaged 1D structure models commonly used in previous seismic studies of Southern California. We also found a small but statistically significant polarization anisotropy in the upper crust that might be associated with basin layering. Using the same type of phase and amplitude measurements, we resolved finite source properties for about 40 earthquakes in the Los Angeles basin area. Our results on a cluster of events in the Yorba Linda area show left-lateral faulting conjugate to the nearby right-lateral Whittier fault and are consistent with the "escaping-block" hypothesis about regional tectonics around the Los Angeles basin. Our analysis of 16 events in a seismicity trend that extends southwest from Fontana to Puente Hills shows right-lateral mechanisms conjugate to the trend of the hypocenter distribution, suggesting a developing weak zone that might be related to such "escaping" deformation. 
To set up the structural inverse problem, we computed 3D sensitivity kernels for our phase and amplitude measurements using the 3D SCEC CVM as the reference model and
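
    A frequency-localized phase-delay and amplitude measurement of the kind described can be sketched with a cross-spectrum (this is a generic illustration of the measurement concept, not the author's actual GSDF-style implementation):

```python
import numpy as np

def phase_amplitude_anomaly(obs, syn, fs, f_lo, f_hi):
    """Band-limited phase-delay and amplitude-ratio anomalies of an
    observed trace relative to a reference synthetic (cross-spectral
    sketch; function name and band are illustrative)."""
    F = np.fft.rfftfreq(len(obs), 1.0 / fs)
    O, S = np.fft.rfft(obs), np.fft.rfft(syn)
    band = (F >= f_lo) & (F <= f_hi)
    dphi = np.angle(O[band] * np.conj(S[band]))       # phase difference per bin
    delay = -np.mean(dphi / (2.0 * np.pi * F[band]))  # + means obs arrives late
    amp = np.mean(np.abs(O[band]) / (np.abs(S[band]) + 1e-12))
    return delay, amp

# A trace delayed by 0.05 s should give a +0.05 s phase-delay anomaly
# and an amplitude ratio of 1 (band kept below the phase-wrapping limit).
fs = 100.0
rng = np.random.default_rng(6)
syn = rng.standard_normal(1024)
obs = np.roll(syn, 5)                                  # 5 samples = 0.05 s
delay, amp = phase_amplitude_anomaly(obs, syn, fs, 1.0, 8.0)
print(round(delay, 3), round(amp, 3))  # -> 0.05 1.0
```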

  11. Surface-Source Downhole Seismic Analysis in R

    USGS Publications Warehouse

    Thompson, Eric M.

    2007-01-01

    This report discusses a method for interpreting a layered slowness or velocity model from surface-source downhole seismic data originally presented by Boore (2003). I have implemented this method in the statistical computing language R (R Development Core Team, 2007), so that it is freely and easily available to researchers and practitioners who may find it useful. I originally applied an early version of these routines to seismic cone penetration test (SCPT) data to analyze the horizontal variability of shear-wave velocity within the sediments in the San Francisco Bay area (Thompson et al., 2006). A more recent version of these codes was used to analyze the influence of interface selection and model assumptions on velocity/slowness estimates and the resulting differences in site amplification (Boore and Thompson, 2007). The R environment has many benefits for scientific and statistical computation; I have chosen R to disseminate these routines because it is versatile enough to program specialized routines, is highly interactive, which aids in the analysis of data, and is freely and conveniently available on a wide variety of computer platforms. These scripts are useful for the interpretation of layered velocity models from surface-source downhole seismic data such as deep boreholes and SCPT data. The inputs are the travel-time data and the offset of the source at the surface. The travel-time arrivals for the P- and S-waves must already be picked from the original data. An option in the inversion is to include estimates of the standard deviation of the travel-time picks for a weighted inversion of the velocity profile. The standard deviation of each travel-time pick is defined relative to the standard deviation of the best pick in a profile and is based on the accuracy with which the travel-time measurement could be determined from the seismogram. 
The analysis of the travel-time data consists of two parts: the identification of layer-interfaces, and the
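
    The weighted inversion of one layer's slowness from travel-time picks amounts to a weighted least-squares line fit; a Python stand-in for that single step (illustrative only, not a port of the R routines):

```python
import numpy as np

def fit_layer_slowness(depths, times, sigmas=None):
    """Weighted least-squares fit of [slowness, intercept] to vertical
    travel-time picks within one layer; rows are weighted by 1/sigma."""
    depths = np.asarray(depths, dtype=float)
    times = np.asarray(times, dtype=float)
    sw = np.ones_like(times) if sigmas is None else 1.0 / np.asarray(sigmas)
    A = np.vstack([depths, np.ones_like(depths)]).T
    coef, *_ = np.linalg.lstsq(A * sw[:, None], times * sw, rcond=None)
    return coef[0]                       # slowness in s/m

depths = np.array([5.0, 10.0, 15.0, 20.0])
times = depths / 1500.0                  # exact picks for a 1500 m/s layer
v = 1.0 / fit_layer_slowness(depths, times)
print(round(v))  # -> 1500
```

    The full method additionally locates the layer interfaces; within each interpreted layer the fit above recovers the interval slowness.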

  12. Energetic analysis of the white light emission associated to seismically active flares in solar cycle 24

    NASA Astrophysics Data System (ADS)

    Buitrago-Casas, Juan Camilo; Martinez Oliveros, Juan Carlos; Glesener, Lindsay; Krucker, Sam

    2014-06-01

    Solar flares are explosive phenomena, thought to be driven by magnetic free energy accumulated in the solar corona. Some flares release seismic transients, "sunquakes", into the Sun's interior. Different mechanisms are being considered to explain how sunquakes are generated. We are conducting an analysis of the white-light emission associated with the seismically active solar flares that have been reported by different authors within the current solar cycle. Seismic diagnostics are based upon standard time-distance techniques, including seismic holography, applied to Dopplergrams obtained by SDO/HMI and GONG. The relation between white-light emissions and seismic activity may provide important information on impulsive chromospheric heating during flares, a prospective contributor to seismic transient emission, at least in some instances. We develop a method to estimate the energy associated with the white-light emission and compare those results with the energy needed to generate a sunquake according to holographic helioseismology techniques.

  13. One-dimensional Seismic Analysis of a Solid-Waste Landfill

    NASA Astrophysics Data System (ADS)

    Castelli, Francesco; Lentini, Valentina; Maugeri, Michele

    2008-07-01

    Analysis of the seismic performance of solid waste landfills generally follows the same procedures as the design of embankment dams, even if the methods and safety requirements should be different. The characterization of waste properties for seismic design is difficult due to the heterogeneity of the material, requiring the procurement of large samples. The dynamic characteristics of solid waste materials play an important role in the seismic response of a landfill, and it is also important to assess the dynamic shear strengths of liner materials due to the effect of inertial forces in the refuse mass. In the paper the numerical results of a dynamic analysis are reported and analysed to determine the reliability of the common practice of using 1D analysis to evaluate the seismic response of a municipal solid-waste landfill. Numerical results indicate that the seismic response of a landfill can vary significantly due to reasonable variations of waste properties, fill heights, site conditions, and design rock motions.

  14. One-dimensional Seismic Analysis of a Solid-Waste Landfill

    SciTech Connect

    Castelli, Francesco; Lentini, Valentina; Maugeri, Michele

    2008-07-08

    Analysis of the seismic performance of solid waste landfills generally follows the same procedures as the design of embankment dams, even if the methods and safety requirements should be different. The characterization of waste properties for seismic design is difficult due to the heterogeneity of the material, requiring the procurement of large samples. The dynamic characteristics of solid waste materials play an important role in the seismic response of a landfill, and it is also important to assess the dynamic shear strengths of liner materials due to the effect of inertial forces in the refuse mass. In the paper the numerical results of a dynamic analysis are reported and analysed to determine the reliability of the common practice of using 1D analysis to evaluate the seismic response of a municipal solid-waste landfill. Numerical results indicate that the seismic response of a landfill can vary significantly due to reasonable variations of waste properties, fill heights, site conditions, and design rock motions.

  15. Storey building early monitoring based on rapid seismic response analysis

    NASA Astrophysics Data System (ADS)

    Julius, Musa, Admiral; Sunardi, Bambang; Rudyanto, Ariska

    2016-05-01

    Within the last decade, advances in the acquisition, processing and transmission of data from seismic monitoring have contributed to the growth in the number of structures instrumented with such systems. An equally important factor for such growth can be attributed to the demands by stakeholders to find rapid answers to important questions related to the functionality or state of "health" of structures during and immediately after a seismic event. Consequently, this study aims to monitor a storey building based on seismic response, i.e. earthquake and tremor analysis at short time lapse, using accelerograph data. This study used a storey building (X) in Jakarta that suffered the effects of the Kebumen earthquake of January 25th, 2014, the Pandeglang earthquake of July 9th, 2014, and the Lebak earthquake of November 8th, 2014. The tremors used in this study are tremors following these three earthquakes. Data processing was used to determine peak ground acceleration (PGA), peak ground velocity (PGV), peak ground displacement (PGD), spectral acceleration (SA), spectral velocity (SV), spectral displacement (SD), A/V ratio, acceleration amplification and effective duration (te), and then to determine the natural frequency (f0) and the peak of the H/V ratio using the H/V ratio method. The earthquake data processing results show that the values of peak ground motion, response spectra, A/V ratio and acceleration amplification increase with height, while the effective duration gives a different viewpoint of building dynamics: the Kebumen earthquake shows the highest energy on the highest floor, but the Pandeglang and Lebak earthquakes on the lowest floor. The tremor data processing results one month after each earthquake show the natural frequency of the building at a constant value. 
The increase of peak ground motion, response spectra, A/V ratio and acceleration amplification, and the decrease of effective duration, with increasing building floors show that the building construction supports the
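
    The natural-frequency step, picking the peak of the horizontal-to-vertical spectral ratio, can be sketched as follows (a generic H/V illustration with synthetic tremor data; the smoothing width and low-frequency cutoff are arbitrary choices, not the study's parameters):

```python
import numpy as np

def hv_peak_frequency(h1, h2, v, fs, smooth=9):
    """Estimate a natural frequency f0 from the peak of the smoothed
    horizontal-to-vertical (H/V) spectral ratio."""
    f = np.fft.rfftfreq(len(v), d=1.0 / fs)
    H = np.sqrt(0.5 * (np.abs(np.fft.rfft(h1)) ** 2 +
                       np.abs(np.fft.rfft(h2)) ** 2))
    V = np.abs(np.fft.rfft(v))
    k = np.ones(smooth) / smooth                      # boxcar smoothing
    ratio = np.convolve(H, k, "same") / (np.convolve(V, k, "same") + 1e-12)
    ratio[f < 0.5] = 0.0                              # ignore the lowest band
    return f[np.argmax(ratio)]

# Synthetic tremor: a 2 Hz resonance on the horizontals over broadband noise.
fs = 100.0
t = np.arange(0, 60.0, 1.0 / fs)
rng = np.random.default_rng(1)
h = np.sin(2 * np.pi * 2.0 * t) + 0.1 * rng.standard_normal(t.size)
v = rng.standard_normal(t.size)
f0 = hv_peak_frequency(h, h, v, fs)
print(f0)  # close to 2 Hz
```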

  16. Department of Energy seismic siting and design decisions: Consistent use of probabilistic seismic hazard analysis

    SciTech Connect

    Kimball, J.K.; Chander, H.

    1997-02-01

    The Department of Energy (DOE) requires that all nuclear and non-nuclear facilities shall be designed, constructed and operated so that the public, the workers, and the environment are protected from the adverse impacts of Natural Phenomena Hazards, including earthquakes. The design and evaluation of DOE facilities to accommodate earthquakes shall be based on an assessment of the likelihood of future earthquake occurrences, commensurate with a graded approach that depends on the potential risk posed by the DOE facility. DOE has developed Standards for site characterization and hazards assessments to ensure that a consistent use of probabilistic seismic hazard is implemented at each DOE site. The criteria included in the DOE Standards are described and compared to those criteria being promoted by the staff of the Nuclear Regulatory Commission (NRC) for commercial nuclear reactors. In addition to a general description of the DOE requirements and criteria, the most recent probabilistic seismic hazard results for a number of DOE sites are presented. Based on the work completed to develop the probabilistic seismic hazard results, a summary of important application issues is described, with recommendations for future improvements in the development and use of probabilistic seismic hazard criteria for design of DOE facilities.

  17. Seismic Canvas: Evolution as a Data Exploration and Analysis Tool

    NASA Astrophysics Data System (ADS)

    Kroeger, G. C.

    2015-12-01

    SeismicCanvas, originally developed as a prototype interactive waveform display and printing application for educational use, has evolved to include significant data exploration and analysis functionality. The most recent version supports data import from a variety of standard file formats including SAC and mini-SEED, as well as search and download capabilities via IRIS/FDSN Web Services. Data processing tools now include removal of means and trends, interactive windowing, filtering, smoothing, tapering and resampling. Waveforms can be displayed in a free-form canvas or as a record section based on angular or great-circle distance, azimuth or back azimuth. Integrated tau-p code allows the calculation and display of theoretical phase arrivals from a variety of radial Earth models. Waveforms can be aligned by absolute time, event time, picked or theoretical arrival times, and can be stacked after alignment. Interactive measurements include means, amplitudes, time delays, ray parameters and apparent velocities. Interactive picking of an arbitrary list of seismic phases is supported. Bode plots of amplitude and phase spectra and spectrograms can be created from multiple seismograms or selected windows of seismograms. Direct printing is implemented on all supported platforms along with output of high-resolution PDF files. With these added capabilities, the application is now being used as a data exploration tool for research. Coded in C++ and using the cross-platform Qt framework, the most recent version is available as a 64-bit application for Windows 7-10, Mac OS X 10.6-10.11, and most distributions of Linux, and a 32-bit version for Windows XP and 7. With the latest improvements and refactoring of trace display classes, the 64-bit versions have been tested with over 250 million samples and remain responsive in interactive operations. The source code is available under an LGPLv3 license and both source and executables are available through the IRIS SeisCode repository.

  18. Letter report seismic shutdown system failure mode and effect analysis

    SciTech Connect

    KECK, R.D.

    1999-09-01

    The Supply Ventilation System Seismic Shutdown ensures that the 234-52 building supply fans, the dry air process fans and the vertical development calciner are shut down following a seismic event. This letter report evaluates the failure modes of the shutdown system and determines their effects.

  19. Analysis of seismic noise recorded by temporary seismic array near the Pyhäsalmi underground mine in Finland

    NASA Astrophysics Data System (ADS)

    Afonin, Nikita; Kozlovskaya, Elena; Narkilahti, Janne; Nevalainen, Jouni

    2016-04-01

    The Pyhäsalmi mine is an underground copper and zinc mine located in central Finland. It is one of the oldest and deepest underground mines in Europe, in which ore is excavated from a depth of about 1450 m. Due to the large amount of heavy machinery, the mine itself is a source of strong seismic and acoustic noise. This continuous noise creates a problem for high-resolution active-source seismic experiments. That is why in our study we investigated the opportunity to use this seismic noise for studying the structure of the uppermost crust. For this we installed 24 three-component DSU-SA MEMS seismic sensors with autonomous RAUD eX data acquisition units produced by Sercel Ltd. along a 10 km long line crossing the mine area. The array recorded continuous seismic data from 29.10.2013 to 1.11.2013 with a sampling rate of 500 sps. The continuous data for the period of 5 days were processed in several steps including single-station data analysis, pre-filtering and time-domain stacking. The processed data set was used to estimate empirical Green's functions (EGF) between pairs of stations in the frequency band of 1-100 Hz. We developed our own procedure of stacking EGF in the time domain and, as a result, we were able to extract not only Rayleigh waves but also refracted P-waves. Finally, we calculated surface-wave dispersion curves and solved inversion problems for surface waves and refracted waves. In our paper we concentrate mainly on the details of our data processing routine and its influence on the quality of the EGF extraction results. The study is a part of the SEISLAB project funded by the European Regional Development Fund (ERDF), the Council of Oulu region (Finland) and Pyhäsalmi Mine Oy.
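
    The time-domain stacking of windowed cross-correlations that underlies EGF extraction can be sketched as follows (window length, normalization and the synthetic delay are illustrative choices, not the project's actual processing parameters):

```python
import numpy as np

def egf_from_noise(u1, u2, win, fs):
    """Stack windowed, normalized cross-correlations of two continuous
    noise records to approximate the inter-station Green's function."""
    n = (len(u1) // win) * win
    stack = np.zeros(2 * win - 1)
    for a, b in zip(u1[:n].reshape(-1, win), u2[:n].reshape(-1, win)):
        a = (a - a.mean()) / (a.std() + 1e-12)   # per-window normalization
        b = (b - b.mean()) / (b.std() + 1e-12)
        stack += np.correlate(b, a, mode="full")
    lags = np.arange(-(win - 1), win) / fs
    return lags, stack

# Synthetic noise propagating from station 1 to station 2 with 0.05 s delay;
# the stacked correlation peaks at the (anticausal-side) travel-time lag.
fs, delay = 200, 10                       # 10 samples = 0.05 s
rng = np.random.default_rng(2)
noise = rng.standard_normal(200 * 60)
u1, u2 = noise[:-delay], noise[delay:]
lags, egf = egf_from_noise(u1, u2, 400, fs)
print(lags[np.argmax(egf)])  # -> -0.05
```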

  20. Pembina Cardium CO2-EOR monitoring project: Integrated surface seismic and VSP time-lapse seismic analysis

    NASA Astrophysics Data System (ADS)

    Alshuhail, A. A.

    2009-12-01

    In the Pembina field in west-central Alberta, Canada, approximately 40,000 tons of supercritical CO2 was injected into the 1650 m deep, 20 m thick upper-Cretaceous Cardium Fm. between March 2005 and 2007. A time-lapse seismic program was designed and incorporated into the overall measurement, monitoring and verification program. The objectives were to track the CO2 plume within the reservoir, and to evaluate the integrity of storage. Fluid replacement modeling predicts a decrease in the P-wave velocity and bulk density in the reservoir by about 4% and 1%, respectively. Synthetic seismograms show subtle reflectivity changes at the Cardium Fm. and a traveltime delay at the later high-amplitude Viking event of less than 1 ms. The time-lapse datasets, however, show no significant anomalies in the P-wave seismic data that can be attributed to supercritical CO2 injected into the Cardium Fm. (Figure 1). The converted-wave (P-S) data, on the other hand, showed small traveltime anomalies. The most coherent results were those obtained by the fixed-array VSP dataset (Figure 2), due to its higher frequency bandwidth and high signal-to-noise ratio. The amplitude and traveltime changes observed in the VSP dataset are small but are consistent in magnitude with those predicted from rock physics modeling. The analysis suggests that the inability to clearly detect the CO2 plume in surface seismic data is likely due to the CO2 being contained in thin permeable sandstone members of the Cardium Formation. The seismic signature of the Cardium Fm. in this area may also be degraded by multiples and strong attenuation involving the shallow Ardley coals. However, the lack of 4D seismic changes above the reservoir indicates that the injected CO2 is not migrating through the caprock into shallower formations.
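
    The fluid replacement modeling referred to is typically done with the Gassmann equation; a generic sketch with illustrative, not Pembina-calibrated, rock and fluid properties (full brine-to-CO2 substitution in a fairly soft frame, so the predicted change is larger than the ~4% quoted above):

```python
import numpy as np

def gassmann_ksat(k_dry, k_min, k_fl, phi):
    """Gassmann fluid substitution: saturated bulk modulus from the dry
    frame modulus, mineral modulus, fluid modulus and porosity."""
    b = (1.0 - k_dry / k_min) ** 2
    denom = phi / k_fl + (1.0 - phi) / k_min - k_dry / k_min ** 2
    return k_dry + b / denom

# Illustrative sandstone properties, SI units (Pa, kg/m3):
k_dry, k_min, mu, phi = 9e9, 37e9, 7e9, 0.15
rho_brine, rho_co2 = 2250.0, 2230.0     # saturated bulk densities
k_brine, k_co2 = 2.8e9, 0.08e9          # fluid bulk moduli

vp = lambda k, rho: np.sqrt((k + 4.0 * mu / 3.0) / rho)
vp_brine = vp(gassmann_ksat(k_dry, k_min, k_brine, phi), rho_brine)
vp_co2 = vp(gassmann_ksat(k_dry, k_min, k_co2, phi), rho_co2)
print(round(100.0 * (vp_co2 - vp_brine) / vp_brine, 1))  # percent Vp change
```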

  1. Reservoir lithofacies analysis using 3D seismic data in dissimilarity space

    NASA Astrophysics Data System (ADS)

    Bagheri, M.; Riahi, M. A.; Hashemi, H.

    2013-06-01

    Seismic data interpretation is one of the most important steps in exploration seismology. Seismic facies analysis (SFA) with an emphasis on lithofacies can be used to extract more information about structures and geology, enhancing seismic interpretation. Facies analysis is based on unsupervised and supervised classification using seismic attributes. In this paper, supervised classification by a support vector machine using well logs and seismic attributes is applied. Dissimilarity, as a new measuring space, is employed, after which classification is carried out. Often, SFA is carried out in a feature space in which each dimension corresponds to a seismic attribute. Different facies show considerable class overlap in the feature space; hence, high classification error values are reported. Therefore, decreasing class overlap before classification is a necessary step. To achieve this goal, a dissimilarity space is initially created. As a result of the definition of the new space, the class overlap between objects (seismic samples) is reduced and hence the classification can be done reliably. This strategy increases the accuracy of classification, and a more trustworthy lithofacies analysis is attained. To apply this method, 3D seismic data from an oil field in Iran were selected and the results obtained by a support vector classifier (SVC) in dissimilarity space are presented, discussed and compared with the SVC applied in the conventional feature space.
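
    The dissimilarity-space construction can be illustrated in a few lines: each sample is re-represented by its distances to a set of prototype objects, and a classifier then operates in that space. The sketch below uses a simple nearest-prototype rule as a stand-in for the paper's SVC, on toy two-attribute "facies" clusters:

```python
import numpy as np

def to_dissimilarity_space(X, prototypes):
    """Represent each sample by its Euclidean distances to prototype
    objects; classification is then done in this new space."""
    return np.linalg.norm(X[:, None, :] - prototypes[None, :, :], axis=2)

# Two overlapping "facies" clusters in a 2-attribute feature space.
rng = np.random.default_rng(3)
A = rng.normal([0, 0], 1.0, size=(100, 2))
B = rng.normal([2, 2], 1.0, size=(100, 2))
X = np.vstack([A, B])
y = np.array([0] * 100 + [1] * 100)

protos = np.array([[0.0, 0.0], [2.0, 2.0]])   # one prototype per facies
D = to_dissimilarity_space(X, protos)
pred = (D[:, 1] < D[:, 0]).astype(int)        # nearest-prototype rule
acc = (pred == y).mean()
print(acc)
```

    In practice the prototypes would be chosen from labeled well-log-calibrated samples, and the SVC would replace the nearest-prototype rule.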

  2. Seismic signature analysis for discrimination of people from animals

    NASA Astrophysics Data System (ADS)

    Damarla, Thyagaraju; Mehmood, Asif; Sabatier, James M.

    2013-05-01

    Cadence analysis has been the main focus for discriminating between the seismic signatures of people and animals. However, cadence analysis fails when multiple targets are generating the signatures. We analyze the mechanism of human walking and the signature generated by a human walker, and compare it with the signature generated by a quadruped. We develop Fourier-based analysis to differentiate the human signatures from the animal signatures. We extract a set of basis vectors to represent the human and animal signatures using non-negative matrix factorization, and use them to separate and classify both types of target. Grazing animals such as deer, cows, etc., often produce sporadic signals as they move from patch to patch of grass, and one must characterize them so as to differentiate their signatures from those generated by a horse steadily walking along a path. These differences in the signatures are used in developing a robust algorithm to distinguish the signatures of animals from humans. The algorithm is tested on real data collected in a remote area.
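
    The basis-vector extraction step can be illustrated with generic non-negative matrix factorization via Lee-Seung multiplicative updates (a standard NMF sketch standing in for the authors' method; their feature construction and data are not reproduced here):

```python
import numpy as np

def nmf(V, r, iters=500, seed=0):
    """Factor a non-negative matrix V ~ W @ H (r basis vectors) using
    multiplicative updates; columns of W act as signature bases."""
    rng = np.random.default_rng(seed)
    m, n = V.shape
    W = rng.random((m, r)) + 0.1
    H = rng.random((r, n)) + 0.1
    for _ in range(iters):
        H *= (W.T @ V) / (W.T @ W @ H + 1e-9)   # update activations
        W *= (V @ H.T) / (W @ H @ H.T + 1e-9)   # update basis vectors
    return W, H

# Two non-negative "signature" bases mixed into observed spectra.
rng = np.random.default_rng(4)
true_W = rng.random((32, 2))
true_H = rng.random((2, 50))
V = true_W @ true_H
W, H = nmf(V, 2)
err = np.linalg.norm(V - W @ H) / np.linalg.norm(V)
print(err)  # small reconstruction error on exact rank-2 data
```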

  3. 75 FR 13610 - Office of New Reactors; Interim Staff Guidance on Implementation of a Seismic Margin Analysis for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-22

    ... COMMISSION Office of New Reactors; Interim Staff Guidance on Implementation of a Seismic Margin Analysis for.../ COL-ISG-020 titled ``Implementation of a Seismic Margin Analysis for New Reactors Based on..., the NRC staff issued the proposed ISG, DC/COL-ISG-020 ``Implementation of a Seismic Margin...

  4. Array analysis methods for detection, classification and location of seismic sources: a first evaluation for aftershock analysis using dense temporary post-seismic array network

    NASA Astrophysics Data System (ADS)

    Poiata, N.; Satriano, C.; Vilotte, J.; Bernard, P.

    2012-12-01

    Detection, separation, classification and location of distributed non-stationary seismic sources in broadband noisy environments is an important problem in seismology, in particular for monitoring the high-level post-seismic activity following large subduction earthquakes, such as the offshore Maule (Mw 8.8, 2010) earthquake in Central Chile. Multiple seismic arrays and local antennas distributed over a region allow exploiting the frequency-selective coherence of the signals that arrive at widely separated array stations, leading to improved detection, convolutive blind source separation, and location of distributed non-stationary sources. We present here first results of the investigation of time-frequency adaptive array analysis techniques for detection and location of broadband distributed seismic events recorded by the dense temporary seismic network (International Maule Aftershock Deployment, IMAD) installed to monitor the high-level seismic activity following the 27 February 2010 Maule earthquake (Mw 8.8). This seismic network is characterized by a large aperture, with variable inter-station distances, combined with a high level of distributed near- and far-field seismic source activity and noise. For this study, we first extract from the post-seismic network a number of seismic arrays distributed over the region covered by this network. A first aspect is devoted to the detection, classification and separation of passive distributed seismic sources. We investigate a number of narrowband and wideband signal analysis methods, both in the time and time-frequency domains, for energy arrival detection and tracking, including time-adaptive higher-order statistics (e.g. kurtosis) and multiband band-pass filtering, together with adaptive time-frequency transformation and extraction techniques. 
We demonstrate that these techniques provide superior resolution and robustness compared to classical STA/LTA techniques, in particular in the case of distributed sources with potential signal
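
    For reference, the classical STA/LTA baseline against which such adaptive detectors are compared can be implemented in a few lines (window lengths and threshold are illustrative):

```python
import numpy as np

def sta_lta(x, nsta, nlta):
    """Trailing short-term/long-term average energy ratio, the
    classical onset detector used as a baseline."""
    c = np.concatenate([[0.0], np.cumsum(x ** 2)])
    sta = (c[nsta:] - c[:-nsta]) / nsta      # windows ending at nsta-1..N-1
    lta = (c[nlta:] - c[:-nlta]) / nlta      # windows ending at nlta-1..N-1
    m = len(lta)
    return sta[-m:] / (lta + 1e-12)          # aligned on common window ends

# Low-level noise followed by an arrival at sample 1000.
rng = np.random.default_rng(5)
x = np.concatenate([0.1 * rng.standard_normal(1000),
                    rng.standard_normal(500)])
ratio = sta_lta(x, nsta=20, nlta=200)
onset = int(np.argmax(ratio > 4.0)) + 200 - 1   # map back to sample index
print(onset)  # detected onset near sample 1000
```

    Kurtosis-based characteristic functions replace the energy ratio with a running fourth-moment statistic, which reacts more sharply to emergent impulsive arrivals.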

  5. Calibrating Nonlinear Soil Material Properties for Seismic Analysis Using Soil Material Properties Intended for Linear Analysis

    SciTech Connect

    Spears, Robert Edward; Coleman, Justin Leigh

    2015-08-01

    Seismic analysis of nuclear structures is routinely performed using guidance provided in “Seismic Analysis of Safety-Related Nuclear Structures and Commentary (ASCE 4, 1998).” This document, which is currently under revision, provides detailed guidance on linear seismic soil-structure-interaction (SSI) analysis of nuclear structures. To accommodate the linear analysis, soil material properties are typically developed as shear modulus and damping ratio versus cyclic shear strain amplitude. A new Appendix in ASCE 4-2014 (draft) is being added to provide guidance for nonlinear time domain SSI analysis. To accommodate the nonlinear analysis, a more appropriate form of the soil material properties includes shear stress and energy absorbed per cycle versus shear strain. Ideally, nonlinear soil model material properties would be established with soil testing appropriate for the nonlinear constitutive model being used. However, much of the soil testing done for SSI analysis is performed for use with linear analysis techniques. Consequently, a method is described in this paper that uses soil test data intended for linear analysis to develop nonlinear soil material properties. To produce nonlinear material properties that are equivalent to the linear material properties, the linear and nonlinear model hysteresis loops are considered. For equivalent material properties, the shear stress at peak shear strain and energy absorbed per cycle should match when comparing the linear and nonlinear model hysteresis loops. Consequently, nonlinear material properties are selected based on these criteria.
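
    The equivalence criteria in the last sentences can be written down directly: for a linear model with secant modulus G and damping ratio D at cyclic strain amplitude gamma, the nonlinear-model targets are the peak stress tau = G*gamma and the energy absorbed per cycle delta_W = 2*pi*D*G*gamma**2 (from D = delta_W / (4*pi*W_s) with strain energy W_s = G*gamma**2 / 2). A hypothetical numeric sketch (curve values and Gmax are made up for illustration):

```python
import numpy as np

def equivalent_nonlinear_targets(gamma, g_over_gmax, damping, gmax):
    """Convert linear soil-test curves (G/Gmax and damping ratio versus
    cyclic shear strain) into calibration targets for a nonlinear model:
    peak shear stress and energy absorbed per cycle."""
    g = gmax * np.asarray(g_over_gmax)
    tau = g * np.asarray(gamma)                              # peak stress, Pa
    energy = 2.0 * np.pi * np.asarray(damping) * g * np.asarray(gamma) ** 2
    return tau, energy

gamma = np.array([1e-4, 1e-3, 1e-2])      # cyclic shear strain amplitudes
gg = np.array([0.95, 0.60, 0.20])         # modulus reduction curve
damp = np.array([0.02, 0.08, 0.18])       # damping ratio curve
tau, dW = equivalent_nonlinear_targets(gamma, gg, damp, gmax=60e6)
print(tau[0])  # -> 5700.0
```

    A nonlinear constitutive model is then fit so that its hysteresis loop at each strain amplitude reproduces these tau and delta_W values.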

  6. Sampling and Analysis Plan Waste Treatment Plant Seismic Boreholes Project.

    SciTech Connect

    Brouns, Thomas M.

    2007-07-15

    This sampling and analysis plan (SAP) describes planned data collection activities for four entry boreholes through the sediment overlying the Saddle Mountains Basalt, up to three new deep rotary boreholes through the Saddle Mountains Basalt and sedimentary interbeds, and one corehole through the Saddle Mountains Basalt and sedimentary interbeds at the Waste Treatment Plant (WTP) site. The SAP will be used in concert with the quality assurance plan for the project to guide the procedure development and data collection activities needed to support borehole drilling, geophysical measurements, and sampling. This SAP identifies the American Society for Testing and Materials (ASTM) standards, Hanford Site procedures, and other guidance to be followed for data collection activities. Revision 3 incorporates all interim change notices (ICN) that were issued to Revision 2 prior to completion of sampling and analysis activities for the WTP Seismic Boreholes Project. This revision also incorporates changes to the exact number of samples submitted for dynamic testing as directed by the U.S. Army Corps of Engineers. Revision 3 represents the final version of the SAP.

  7. An Analysis of the Vulnerability of Global Drinking Water Access to Climate-related Hazards

    NASA Astrophysics Data System (ADS)

    Elliott, M.; Banerjee, O.; Christenson, E.; Holcomb, D.; Hamrick, L.; Bartram, J.

    2014-12-01

    Global drinking water access targets are formulated around "sustainable access." Global climate change (GCC) and associated hazards threaten the sustainability of drinking water supply. Extensive literature exists on the impacts of GCC on precipitation and water resources. However, the literature lacks a credible analysis of the vulnerability of global drinking water access. This research reports on an analysis of the current vulnerability of drinking water access due to three climate-related hazardous events: cyclone, drought and flood. An ArcGIS database was built incorporating the following: population density, hazardous event frequency, drinking water technologies in use and adaptive capacity. Two global grids were incorporated first: (1) LandScanTM global population distribution; and (2) frequency of cyclone, drought and flood from ~1980-2000 from the Columbia University Center for Hazards and Risk Research (CHRR). Population density was used to characterize cells as urban or rural, and country-level urban/rural drinking water technologies in use were added based on the WHO/UNICEF Joint Monitoring Programme data. Expert assessments of the resilience of each technology to each hazardous event, based on WHO/DFID Vision 2030, were quantified and added to the database. Finally, country-level adaptive capacity was drawn from the "readiness" parameter of the Global Adaptation Index (GaIn). ArcGIS Model Builder and Python were used to automate the addition of datasets. This presentation will report on the results of this analysis, the first credible attempt to assess the vulnerability of global drinking water access to climate-related hazardous events. This analysis has yielded country-level scores and maps displaying the ranking of exposure score (for flood, drought, cyclone, and all three in aggregate) and the corresponding country-level vulnerability scores and rankings incorporating the impact of drinking water technologies and adaptive capacity (Figure 1).
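
    The scoring logic combines hazard exposure, technology resilience and adaptive capacity. A deliberately simplified toy formula (purely illustrative; the study's actual weighting scheme is not given in the abstract):

```python
def vulnerability(freq, resilience, adaptive_capacity):
    """Toy score: hazard frequency (events/decade) discounted by the
    drinking-water technology's resilience and by national adaptive
    capacity, both expressed in [0, 1]. Illustrative only."""
    return freq * (1.0 - resilience) * (1.0 - adaptive_capacity)

# Piped supply (resilient) vs. unprotected well in the same flood zone:
print(round(vulnerability(5.0, 0.8, 0.5), 2))  # -> 0.5
print(round(vulnerability(5.0, 0.2, 0.5), 2))  # -> 2.0
```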

  8. Governing Geoengineering Research: A Political and Technical Vulnerability Analysis of Potential Near-Term Options

    DTIC Science & Technology

    2011-01-01

    weather events, or the spread of tropical diseases into North America. The net A Vulnerability-and-Response-Option Analysis Framework for a Risk... Avoidable Surprises, Cambridge: Cambridge University Press, 2002. Doney, Scott C., Victoria J. Fabry, Richard A. Feely, and Joan A. Kleypas, "Ocean... Falkenmark, Louise Karlberg, Robert W. Corell, Victoria J. Fabry, James Hansen, Brian Walker, Diana Liverman, Katherine Richardson, Paul Crutzen, and

  9. Social class variation in risk: a comparative analysis of the dynamics of economic vulnerability.

    PubMed

    Whelan, Christopher T; Maître, Bertrand

    2008-12-01

    A joint concern with multidimensionality and dynamics is a defining feature of the pervasive use of the terminology of social exclusion in the European Union. The notion of social exclusion focuses attention on economic vulnerability in the sense of exposure to risk and uncertainty. Sociological concern with these issues has been associated with the thesis that risk and uncertainty have become more pervasive and extend substantially beyond the working class. This paper combines features of recent approaches to statistical modelling of poverty dynamics and multidimensional deprivation in order to develop our understanding of the dynamics of economic vulnerability. An analysis involving nine countries and covering the first five waves of the European Community Household Panel shows that, across nations and time, it is possible to identify an economically vulnerable class. This class is characterized by heightened risk of falling below a critical resource level, exposure to material deprivation and experience of subjective economic stress. Cross-national differentials in persistence of vulnerability are wider than in the case of income poverty and less affected by measurement error. Economic vulnerability profiles vary across welfare regimes in a manner broadly consistent with our expectations. Variation in the impact of social class within and across countries provides no support for the argument that its role in structuring such risk has become much less important. Our findings suggest that it is possible to accept the importance of the emergence of new forms of social risk and acknowledge the significance of efforts to develop welfare state policies involving a shift of opportunities and decision-making onto individuals without accepting the 'death of social class' thesis.

  10. Seismic data interpretation using the Hough transform and principal component analysis

    NASA Astrophysics Data System (ADS)

    Orozco-del-Castillo, M. G.; Ortiz-Alemán, C.; Martin, R.; Ávila-Carrera, R.; Rodríguez-Castellanos, A.

    2011-03-01

    In this work two novel image processing techniques are applied to detect and delineate complex salt bodies from seismic exploration profiles: the Hough transform and principal component analysis (PCA). It is well recognized by the geophysical community that the lack of resolution and poor structural identification in seismic data recorded at sub-salt plays represent severe technical and economic problems. Under such circumstances, seismic interpretation based only on the human eye is inaccurate. Additionally, petroleum field development decisions and production planning depend on good-quality seismic images that generally are not feasible in salt tectonics areas. To address this, morphological erosion, region growing and, especially, a generalization of the Hough transform (closely related to the Radon transform) are applied to build parabolic shapes that are useful in the idealization and recognition of salt domes from 2D seismic profiles. In a similar way, PCA is also used to identify shapes associated with complex salt bodies in seismic profiles extracted from 3D seismic data. To show the validity of the new set of seismic results, comparisons between both image processing techniques are exhibited. The main contribution of this work is to provide seismic interpreters with new semi-automatic computational tools. The novel image processing approaches presented here may be helpful in the identification of diapirs and other complex geological features from seismic images. Conceivably, in the near future, a new branch of seismic attributes could be recognized by geoscientists and engineers based on the encouraging results reported here.

  11. The shallow elastic structure of the lunar crust: New insights from seismic wavefield gradient analysis

    NASA Astrophysics Data System (ADS)

    Sollberger, David; Schmelzbach, Cedric; Robertsson, Johan O. A.; Greenhalgh, Stewart A.; Nakamura, Yosio; Khan, Amir

    2016-10-01

    Enigmatic lunar seismograms recorded during the Apollo 17 mission in 1972 have so far precluded the identification of shear-wave arrivals and hence the construction of a comprehensive elastic model of the shallow lunar subsurface. Here, for the first time, we extract shear-wave information from the Apollo active seismic data using a novel waveform analysis technique based on spatial seismic wavefield gradients. The star-like recording geometry of the active seismic experiment lends itself surprisingly well to computing spatial wavefield gradients and rotational ground motion as a function of time. These observables, which are new to seismic exploration in general, allowed us to identify shear waves in the complex lunar seismograms and to derive a new model of seismic compressional and shear-wave velocities in the shallow lunar crust, critical to understanding its lithology and constitution, and its impact on other geophysical investigations of the Moon's deep interior.

  12. Analysis of the induced seismicity of the Lacq gas field (Southwestern France) and model of deformation

    NASA Astrophysics Data System (ADS)

    Bardainne, T.; Dubos-Sallée, N.; Sénéchal, G.; Gaillot, P.; Perroud, H.

    2008-03-01

    The goal of this paper is to propose a model of the deformation pattern for the Lacq gas field (southwest France), considering the temporal and spatial evolution of the observed induced seismicity. This model of deformation has been determined from an updating of the earthquake locations and considering theoretical and analogue models usually accepted for hydrocarbon field deformation. The Lacq seismicity is clearly not linked to the natural seismicity of the Pyrenean range recorded 30 km farther to the south, since the first event was felt in 1969, after the beginning of hydrocarbon recovery. From 1974 to 1997, more than 2000 local events (ML < 4.2) were recorded by two permanent local seismic networks. Unlike previously published results focusing on limited time-lapse studies, our analysis relies on the data from 1974 to 1997. Greater accuracy of the absolute locations has been obtained using a well-adapted 3-D location algorithm, after improvement of the 3-D P-wave velocity model and determination of specific station corrections for different clusters of events. This updated catalogue of seismicity has been interpreted taking into account the structural context of the gas field. The Lacq gas field is an anticlinal reservoir where 3-D seismic and borehole data reveal a pattern of high-density fracturing, mainly oriented WNW-ESE. The seismicity map and vertical cross-sections show that the majority of the seismic events (70 per cent) occurred above the gas reservoir. A correlation is also observed between the orientation of the pre-existing faults and the location of the seismic activity. Strong and organized seismicity occurred where fault orientation is consistent with the poroelastic stress perturbation due to gas recovery. By contrast, the seismicity is quiescent where the isobaths of the reservoir roof are close to perpendicular to the faults. These quiescent areas, as well as the central seismic part, are characterized by a surface subsidence

  13. Discrimination of porosity and fluid saturation using seismic velocity analysis

    DOEpatents

    Berryman, James G.

    2001-01-01

    The method of the invention is employed for determining the state of saturation in a subterranean formation using only seismic velocity measurements (e.g., shear and compressional wave velocity data). Seismic velocity data collected from a region of the formation of like solid material properties can provide relatively accurate partial saturation data derived from a well-defined triangle plotted in a (ρ/μ, λ/μ)-plane. When the seismic velocity data are collected over a large region of a formation having both like and unlike materials, the method first distinguishes the like materials by initially plotting the seismic velocity data in a (ρ/λ, μ/λ)-plane to determine regions of the formation having like solid material properties and porosity.
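
    The coordinate planes described above can be computed from velocities alone, because density cancels in every Lamé-parameter ratio (μ = ρVs², λ = ρVp² − 2ρVs²). A minimal sketch of that mapping (illustrative only, not the patented procedure itself):

```python
import numpy as np

def lame_ratio_coordinates(vp, vs):
    """Map P- and S-wave velocities to the two Lame-ratio planes.

    Uses mu = rho*vs**2 and lam = rho*vp**2 - 2*rho*vs**2, so the
    density rho cancels in every ratio and only velocities are needed.
    """
    vp = np.asarray(vp, dtype=float)
    vs = np.asarray(vs, dtype=float)
    rho_over_mu = 1.0 / vs**2
    lam_over_mu = (vp / vs) ** 2 - 2.0
    rho_over_lam = 1.0 / (vp**2 - 2.0 * vs**2)
    mu_over_lam = vs**2 / (vp**2 - 2.0 * vs**2)
    # first pair groups like materials, second pair resolves saturation
    return (rho_over_lam, mu_over_lam), (rho_over_mu, lam_over_mu)
```

    Points sharing solid material properties would cluster in the (ρ/λ, μ/λ)-plane, after which their spread in the (ρ/μ, λ/μ)-plane reflects partial saturation.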

  14. Seismicity monitoring by cluster analysis of moment tensors

    NASA Astrophysics Data System (ADS)

    Cesca, Simone; Şen, Ali Tolga; Dahm, Torsten

    2014-03-01

    We suggest a new clustering approach to classify focal mechanisms from large moment tensor catalogues, with the purpose of automatically identifying families of earthquakes with similar source geometry, recognizing the orientation of the most active faults, and detecting temporal variations of the rupture processes. The approach differs from waveform similarity methods in that clusters are detected even if the events are separated by large spatial distances. This approach is particularly helpful for analysing large moment tensor catalogues, as in microseismicity applications, where manual analysis and classification are not feasible. A flexible algorithm is proposed here: it can handle different metrics, norms, and focal mechanism representations. In particular, the method can handle full moment tensor or constrained source model catalogues, for which different metrics are suggested. The method can account for variable uncertainties of different moment tensor components. We verify the method with synthetic catalogues. An application to real data from mining-induced seismicity illustrates possible applications of the method and demonstrates the cluster detection and event classification performance with different moment tensor catalogues. Results prove that the main earthquake source types occur on spatially separated faults, and that temporal changes in the number and characterization of focal mechanism clusters are detected. We suggest that moment tensor clustering can help assess time-dependent hazard in mines.
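
    One way such a clustering could be sketched is with a cosine-type distance between size-normalized moment tensors followed by hierarchical clustering; this is a generic illustration, and the paper's own metrics and algorithm may differ:

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import pdist

def cluster_moment_tensors(tensors, n_clusters):
    """Cluster normalized moment tensors by a cosine-type distance.

    tensors: (n, 6) array of independent moment tensor components
             (Mxx, Myy, Mzz, Mxy, Mxz, Myz).
    """
    m = np.asarray(tensors, dtype=float)
    # weight off-diagonal components by sqrt(2) so the 6-vector dot
    # product equals the full tensor inner product M:M'
    w = m * np.array([1, 1, 1, np.sqrt(2), np.sqrt(2), np.sqrt(2)])
    w /= np.linalg.norm(w, axis=1, keepdims=True)  # unit norm -> size independent
    d = pdist(w, metric="cosine")                  # 1 - cos(angle between MTs)
    z = linkage(d, method="average")
    return fcluster(z, t=n_clusters, criterion="maxclust")
```

    Because the metric acts on the tensors themselves, events with similar source geometry are grouped together regardless of their spatial separation, which is the key difference from waveform-similarity clustering.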

  15. Seismic Analysis Issues in Design Certification Applications for New Reactors

    SciTech Connect

    Miranda, M.; Morante, R.; Xu, J.

    2011-07-17

    The licensing framework established by the U.S. Nuclear Regulatory Commission under Title 10 of the Code of Federal Regulations (10 CFR) Part 52, “Licenses, Certifications, and Approvals for Nuclear Power Plants,” provides requirements for standard design certifications (DCs) and combined license (COL) applications. The intent of this process is the early resolution of safety issues at the DC application stage. Subsequent COL applications may incorporate a DC by reference. Thus, the COL review will not reconsider safety issues resolved during the DC process. However, a COL application that incorporates a DC by reference must demonstrate that relevant site-specific design parameters are within the bounds postulated by the DC, and any departures from the DC need to be justified. This paper provides an overview of several seismic analysis issues encountered during a review of recent DC applications under the 10 CFR Part 52 process, in which the authors have participated as part of the safety review effort.

  16. A relative vulnerability estimation of flood disaster using data envelopment analysis in the Dongting Lake region of Hunan

    NASA Astrophysics Data System (ADS)

    Li, C.-H.; Li, N.; Wu, L.-C.; Hu, A.-J.

    2013-07-01

    The vulnerability to flood disaster is addressed by a number of studies. It is of great importance to analyze the vulnerability of different regions and various periods to enable the government to make policies for distributing relief funds and help the regions to improve their capabilities against disasters, yet a recognized paradigm for such studies is still missing. Vulnerability is defined and evaluated through either physical or economic-ecological perspectives depending on the field of the researcher concerned. The vulnerability, however, is the core of both systems as it entails systematic descriptions of flood severities or disaster management units. The research mentioned often has a development perspective, and in this article we decompose the overall flood system into several factors: disaster driver, disaster environment, disaster bearer, and disaster intensity, and take the interaction mechanism among all factors as an indispensable function. The conditions of the flood disaster components are characterized by disaster driver risk level, disaster environment stability level and disaster bearer sensitivity, respectively. The flood system vulnerability is expressed as vulnerability = f(risk, stability, sensitivity). Based on this theory, the data envelopment analysis (DEA) method is used to detail the spatiotemporal variation of the relative vulnerability of a flood disaster system and its components in the Dongting Lake region. The study finds that although a flood disaster system's relative vulnerability is closely associated with its components' conditions, the flood system and its components have different vulnerability levels. The overall vulnerability is not the aggregation of its components' vulnerabilities. On a spatial scale, zones central and adjacent to Dongting Lake and/or river zones are characterized by very high vulnerability. Zones with low and very low vulnerability are mainly distributed in the periphery of the Dongting Lake region. On a temporal
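
    The relative-efficiency scores behind a DEA-based ranking can be illustrated with the classic input-oriented CCR model, solved as one linear program per decision-making unit. This is a generic sketch of the technique, not the authors' exact formulation or data:

```python
import numpy as np
from scipy.optimize import linprog

def dea_ccr_efficiency(X, Y):
    """Input-oriented CCR DEA efficiency scores via linear programming.

    X: (n, m) inputs and Y: (n, s) outputs for n decision-making units.
    For each unit, maximize its weighted output subject to its weighted
    input equalling 1 and no unit exceeding efficiency 1 under the same
    weights.
    """
    X, Y = np.asarray(X, float), np.asarray(Y, float)
    n, m = X.shape
    s = Y.shape[1]
    A_ub = np.hstack([Y, -X])          # u.y_j - v.x_j <= 0 for every unit j
    b_ub = np.zeros(n)
    scores = []
    for i in range(n):
        c = np.concatenate([-Y[i], np.zeros(m)])             # maximize u.y_i
        A_eq = np.concatenate([np.zeros(s), X[i]])[None, :]  # v.x_i = 1
        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                      bounds=(0, None), method="highs")
        scores.append(-res.fun)
    return np.array(scores)
```

    Units scoring 1.0 lie on the efficiency frontier; lower scores give the relative standing that a DEA vulnerability study ranks regions by.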

  17. GIS analysis of changes in ecological vulnerability using a SPCA model in the Loess plateau of Northern Shaanxi, China.

    PubMed

    Hou, Kang; Li, Xuxiang; Zhang, Jing

    2015-04-17

    Changes in ecological vulnerability were analyzed for Northern Shaanxi, China using a geographic information system (GIS). An evaluation model was developed using a spatial principal component analysis (SPCA) model containing land use, soil erosion, topography, climate, vegetation and social economy variables. Using this model, an ecological vulnerability index was computed for the research region. Using natural breaks classification (NBC), the evaluation results were divided into five types: potential, slight, light, medium and heavy. The results indicate that there is greater than average optimism about the conditions of the study region, and the ecological vulnerability index (EVI) of the southern eight counties is lower than that of the northern twelve counties. From 1997 to 2011, the ecological vulnerability index gradually decreased, which means that environmental security was gradually enhanced, although there are still some places that have gradually deteriorated over the past 15 years. In the study area, government and economic factors and precipitation are the main reasons for the changes in ecological vulnerability.
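
    An SPCA-style index of this general kind is commonly built by standardizing the indicator variables, extracting principal components, and weighting each component score by its explained-variance share. A hedged sketch of that aggregation (the study's exact variable set, spatial handling and weighting may differ):

```python
import numpy as np

def pca_vulnerability_index(data):
    """Composite vulnerability index from principal components.

    data: (n_sites, n_indicators). Indicators are standardized,
    principal components extracted from their covariance, and each
    site's index is the variance-weighted sum of its component scores
    (a common SPCA-style aggregation).
    """
    z = (data - data.mean(axis=0)) / data.std(axis=0)
    cov = np.cov(z, rowvar=False)
    eigval, eigvec = np.linalg.eigh(cov)
    order = np.argsort(eigval)[::-1]       # largest variance first
    eigval, eigvec = eigval[order], eigvec[:, order]
    scores = z @ eigvec                    # component scores per site
    weights = eigval / eigval.sum()        # explained-variance shares
    return scores @ weights
```

    The resulting index is relative: sites are compared against each other, and class boundaries (potential to heavy) would then be drawn with a classifier such as natural breaks.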

  18. Seismic analysis of the large 70-meter antenna. Part 2: General dynamic response and a seismic safety check

    NASA Technical Reports Server (NTRS)

    Kiedron, K.; Chian, C. T.

    1985-01-01

    An extensive dynamic analysis for the new JPL 70-meter antenna structure is presented. Analytical procedures are based on normal mode decomposition, which includes damping and special forcing functions. The dynamic response can be obtained for any arbitrarily selected point on the structure. A new computer program for computing the time-dependent resultant structural displacement, summing the effects of all participating modes, was also developed. Program compatibility with natural frequency analysis output was verified. The program was applied to the JPL 70-meter antenna structure and the dynamic response for several specially selected points was computed. Seismic analysis of structures, a special application of the general dynamic analysis, is also based on normal mode decomposition. Strength specification of the antenna with respect to earthquake excitation is done by using the common response spectra. The results indicate a basically safe design under an assumed 5% or greater damping coefficient. However, for the antenna located at Goldstone, which has a more active seismic environment, this study strongly recommends an experimental program to determine the true damping coefficient for a more reliable safety check.
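
    The response-spectrum check described above rests on integrating a damped single-degree-of-freedom oscillator for each period of interest and recording its peak response. A minimal sketch using the Newmark average-acceleration scheme (illustrative only; the JPL study used its own modal-superposition code):

```python
import numpy as np

def response_spectrum(accel, dt, periods, damping=0.05):
    """Pseudo-acceleration response spectrum of a ground-motion record.

    Integrates a damped unit-mass SDOF oscillator for each period with
    the (unconditionally stable) Newmark average-acceleration scheme and
    returns omega**2 times the peak relative displacement.
    """
    accel = np.asarray(accel, float)
    sa = []
    for T in periods:
        w = 2.0 * np.pi / T
        k, c = w * w, 2.0 * damping * w          # unit mass
        u = v = 0.0
        a = -accel[0]                            # equilibrium at rest
        umax = 0.0
        keff = k + 2.0 * c / dt + 4.0 / dt**2
        for ag in accel[1:]:
            peff = -ag + (4.0 / dt**2 + 2.0 * c / dt) * u + (4.0 / dt + c) * v + a
            u_new = peff / keff
            v_new = 2.0 / dt * (u_new - u) - v
            a = 4.0 / dt**2 * (u_new - u) - 4.0 / dt * v - a
            u, v = u_new, v_new
            umax = max(umax, abs(u))
        sa.append(w * w * umax)
    return np.array(sa)
```

    For a very stiff oscillator (short period) the pseudo-acceleration approaches the peak ground acceleration, a useful sanity check on any implementation.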

  19. New Methodology for Rapid Seismic Risk Assessment

    NASA Astrophysics Data System (ADS)

    Melikyan, A. E.; Balassanian, S. Y.

    2002-05-01

    Seismic risk is growing worldwide and is, increasingly, a problem of developing countries. Along with growing urbanization, future earthquakes will have more disastrous social and economic consequences. Seismic risk assessment and reduction are important goals for every country located in a seismically active zone. For Armenia these goals are of primary importance because studies carried out by the Armenian NSSP to assess the losses caused by various types of disasters in Armenia have shown that earthquakes are the most disastrous hazard for the country. In 1999 the strategy for seismic risk reduction was adopted by the Government of Armenia as a high-priority state program. World experience demonstrates that rapid assessment of seismic losses is necessary for efficient response. There are several state-of-the-art approaches for seismic risk assessment (Radius, Hazus, etc.). All of them require large amounts of input data, which are impossible to collect in many developing countries, in particular in Armenia. Taking into account this serious problem for developing countries, as well as the need for rapid seismic risk assessment immediately after a strong earthquake, the author attempted to contribute to a new approach for rapid seismic risk assessment under the supervision of Prof. S. Balassanian. The analysis of numerous factors influencing seismic risk in Armenia shows that the following elements contribute most significantly to the possible losses: seismic hazard; density of population; vulnerability of structures. The proposed approach for rapid seismic risk assessment based on these three factors has been tested for several seismic events. These tests have shown that such an approach can represent 80 to 90 percent of real losses.
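
    The three-factor structure suggests a simple relative index: scale each factor across districts and combine them multiplicatively. The sketch below is hypothetical, since the abstract does not specify the actual combination rule or weights:

```python
import numpy as np

def rapid_risk_index(hazard, density, vulnerability):
    """Relative seismic risk per district from the three dominant factors.

    Each factor is scaled by its maximum across districts and the index
    is their product, renormalized so the scores sum to 1 (a share of
    total expected loss, not an absolute loss estimate).
    """
    def scale(x):
        x = np.asarray(x, float)
        return x / x.max()

    raw = scale(hazard) * scale(density) * scale(vulnerability)
    return raw / raw.sum()
```

    A multiplicative form captures the intuition that risk vanishes if any factor is zero (no hazard, no people, or no fragile structures means no loss).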

  20. Seismic Fragility Analysis of a Condensate Storage Tank with Age-Related Degradations

    SciTech Connect

    Nie, J.; Braverman, J.; Hofmayer, C; Choun, Y-S; Kim, MK; Choi, I-K

    2011-04-01

    The Korea Atomic Energy Research Institute (KAERI) is conducting a five-year research project to develop a realistic seismic risk evaluation system which includes the consideration of aging of structures and components in nuclear power plants (NPPs). The KAERI research project includes three specific areas that are essential to seismic probabilistic risk assessment (PRA): (1) probabilistic seismic hazard analysis, (2) seismic fragility analysis including the effects of aging, and (3) a plant seismic risk analysis. Since 2007, Brookhaven National Laboratory (BNL) has entered into a collaboration agreement with KAERI to support its development of seismic capability evaluation technology for degraded structures and components. The collaborative research effort is intended to continue over a five-year period. The goal of this collaboration is to assist KAERI in developing seismic fragility analysis methods that consider the potential effects of age-related degradation of structures, systems, and components (SSCs). The research results of this multi-year collaboration will be utilized as input to seismic PRAs. This report describes the research effort performed by BNL for the Year 4 scope of work. This report was developed as an update to the Year 3 report by incorporating a major supplement to the Year 3 fragility analysis. In the Year 4 research scope, an additional study was carried out to consider an additional degradation scenario, in which the three basic degradation scenarios, i.e., degraded tank shell, degraded anchor bolts, and cracked anchorage concrete, are combined with imperfect correlation. A representative operational water level is used for this effort. Building on the same CDFM procedure implemented for the Year 3 tasks, a simulation method was applied using optimum Latin Hypercube samples to characterize the deterioration behavior of the fragility capacity as a function of age-related degradations. The results are summarized in Section 5.
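
    Latin hypercube sampling, as used above to characterize the fragility capacity, stratifies each input dimension so that every stratum is sampled exactly once. A generic sketch on the unit hypercube (not BNL's specific implementation):

```python
import numpy as np

def latin_hypercube(n_samples, n_dims, rng=None):
    """Latin hypercube sample on the unit hypercube.

    Each dimension is split into n_samples equal strata; one point is
    drawn inside each stratum and the strata are shuffled independently
    per dimension, giving better space coverage than plain Monte Carlo.
    """
    rng = np.random.default_rng(rng)
    # one random point inside each of the n_samples strata, per dimension
    u = (np.arange(n_samples)[:, None] + rng.random((n_samples, n_dims))) / n_samples
    for j in range(n_dims):
        u[:, j] = rng.permutation(u[:, j])  # decouple the dimensions
    return u
```

    Each sample can then be mapped through the inverse CDF of the distribution assigned to each degradation variable before being fed to the fragility simulation.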

  1. Comparison between seismic and domestic risk in moderate seismic hazard prone region: the Grenoble City (France) test site

    NASA Astrophysics Data System (ADS)

    Dunand, F.; Gueguen, P.

    2012-02-01

    France has a moderate level of seismic activity, characterized by diffuse seismicity, sometimes experiencing earthquakes of a magnitude of more than 5 in the most active zones. In this seismicity context, Grenoble is a city of major economic and social importance. However, because earthquakes are rare, public authorities and decision makers are only vaguely committed to reducing seismic risk: return periods are long and local policy makers do not have much information available. Over the past 25 years, a large number of studies have been conducted to improve our knowledge of seismic hazard in this region. One of the decision-making concerns of Grenoble's public authorities, as managers of a large number of public buildings, is to know not only the seismic-prone regions, the variability of seismic hazard due to site effects and the city's overall vulnerability, but also the level of seismic risk and exposure for the entire city, compared to other natural and/or domestic hazards. Our seismic risk analysis uses a probabilistic approach for regional and local hazards and the vulnerability assessment of buildings. Its applicability to Grenoble offers the advantage of being based on knowledge acquired by previous projects conducted over the years. This paper aims to compare the level of seismic risk with that of other risks and to introduce the notion of risk acceptability in order to offer guidance in the management of seismic risk. This notion of acceptability, which is now part of seismic risk consideration for existing buildings in Switzerland, is relevant in moderately seismic-prone countries like France.

  2. Anisotropic P-wave velocity analysis and seismic imaging in onshore Kutch sedimentary basin of India

    NASA Astrophysics Data System (ADS)

    Behera, Laxmidhar; Khare, Prakash; Sarkar, Dipankar

    2011-08-01

    Long-offset P-wave seismic reflection data have observable non-hyperbolic moveout, which depends on two parameters: the normal moveout velocity (Vnmo) and the anisotropy parameter (η). Anisotropy (i.e., the directional dependence of velocity at a fixed spatial location in a medium) plays an important role in seismic imaging. It is difficult to detect the presence of anisotropy in subsurface geological formations from P-wave seismic data alone, and special analysis is required. The presence of anisotropy causes two major distortions of moveout in P-wave seismic reflection data. First, in contrast to isotropic media, the normal-moveout (NMO) velocity differs from the vertical velocity; second, deviations from hyperbolic moveout increase substantially in an anisotropic layer. Hence, conventional velocity analysis based on short-spread moveout (stacking) velocities does not provide enough information to determine the true vertical velocity in a transversely isotropic medium with a vertical symmetry axis (VTI medium). Therefore, it is essential to estimate the anisotropy parameter (η) from the long-offset P-wave seismic data. It is demonstrated here, as a case study with long-offset P-wave seismic data acquired in the onshore Kutch sedimentary basin of western India, that suitable velocity analysis using Vnmo and η can improve the stacked image relative to that obtained from conventional velocity analysis.
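
    The two-parameter moveout referred to here is commonly written in the Alkhalifah-Tsvankin (1995) nonhyperbolic form; a small sketch shows the equation and how it reduces to hyperbolic moveout when η = 0:

```python
import numpy as np

def nonhyperbolic_moveout(x, t0, vnmo, eta):
    """Long-offset P-wave reflection traveltime in a VTI medium.

    Alkhalifah-Tsvankin (1995) nonhyperbolic moveout:
      t^2 = t0^2 + x^2/v^2
            - 2*eta*x^4 / (v^2 * (t0^2*v^2 + (1 + 2*eta)*x^2))
    With eta = 0 this reduces to the usual hyperbolic moveout.
    """
    x = np.asarray(x, float)
    v2 = vnmo * vnmo
    t2 = (t0**2 + x**2 / v2
          - (2.0 * eta * x**4) / (v2 * (t0**2 * v2 + (1.0 + 2.0 * eta) * x**2)))
    return np.sqrt(t2)
```

    The quartic term is subtracted, so for η > 0 the traveltime at far offsets is smaller than the hyperbolic prediction, which is exactly the residual moveout a two-parameter (Vnmo, η) scan exploits.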

  3. Seismic detection and analysis of icequakes at Columbia Glacier, Alaska

    USGS Publications Warehouse

    O'Neel, Shad; Marshall, Hans P.; McNamara, Daniel E.; Pfeffer, William Tad

    2007-01-01

    Contributions to sea level rise from rapidly retreating marine-terminating glaciers are large and increasing. Strong increases in iceberg calving occur during retreat, which allows mass transfer to the ocean at a much higher rate than possible through surface melt alone. To study this process, we deployed an 11-sensor passive seismic network at Columbia Glacier, Alaska, during 2004–2005. We show that calving events generate narrow-band seismic signals, allowing frequency domain detections. Detection parameters were determined using direct observations of calving and validated using three statistical methods and hypocenter locations. The 1–3 Hz detections provide a good measure of the temporal distribution and size of calving events. Possible source mechanisms for the unique waveforms are discussed, and we analyze potential forcings for the observed seismicity.
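
    A frequency-domain detector of the kind described can be sketched as a 1-3 Hz bandpass followed by an envelope threshold against a robust noise estimate. The threshold and window values below are illustrative placeholders, not the study's calibrated detection parameters:

```python
import numpy as np
from scipy.signal import butter, filtfilt

def detect_band_events(trace, fs, fmin=1.0, fmax=3.0, thresh=5.0, win=1.0):
    """Flag samples whose band-limited envelope exceeds the noise floor.

    Bandpass the trace to the detection band, form a moving-average
    envelope over `win` seconds, and mark samples where the envelope
    exceeds `thresh` times its median (a robust noise estimate).
    """
    b, a = butter(4, [fmin / (fs / 2), fmax / (fs / 2)], btype="band")
    band = filtfilt(b, a, trace)           # zero-phase 1-3 Hz bandpass
    env = np.abs(band)
    n = max(1, int(win * fs))
    smooth = np.convolve(env, np.ones(n) / n, mode="same")
    return smooth > thresh * np.median(smooth)
```

    Contiguous runs of flagged samples would then be merged into discrete detections and screened against the direct calving observations used for validation.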

  4. China's water resources vulnerability: A spatio-temporal analysis during 2003-2013

    NASA Astrophysics Data System (ADS)

    Cai, J.; Varis, O.; Yin, H.

    2015-12-01

    The present highly serious situation of China's water environment and aquatic ecosystems has developed in the context of its stunning socioeconomic development over the past several decades. Therefore, a vulnerability assessment of water resources (VAWR) in China with a high spatio-temporal resolution is urgently needed. However, to our knowledge, the temporal dimension of VAWR has not yet been addressed. Consequently, we performed, for the first time, a comprehensive spatio-temporal analysis of China's water resources vulnerability (WRV), using a composite index approach with an array of aspects highlighting key challenges that China's water resources system is facing today. During our study period of 2003-2013, the political weight of China's integrated water resources management increased continuously. Hence, it is essential, based on the historical socioeconomic changes influenced by water-environment policy making and implementation, to reveal China's WRV in order to pinpoint key challenges to the healthy functioning of its water resources system. The water resources system in the North and Central Coast appeared more vulnerable than that in Western China. China's water use efficiency grew substantially over the study period, as did water supply and sanitation coverage. In contrast, water pollution worsened remarkably in most parts of China, as did water scarcity and shortage in the most stressed parts of the country. This spatio-temporal analysis implies that the key challenges to China's water resources system are rooted not only in the geographical mismatch between socioeconomic development (e.g. water demand) and water resources endowments (e.g. water resources availability), but also in the intertwinement between socioeconomic development and national strategic policy making.
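
    A composite index of this general kind is typically built by min-max rescaling each indicator, aligning indicator polarity, and taking a weighted average. An illustrative sketch (the study's actual indicators and weights are not given in the abstract):

```python
import numpy as np

def composite_index(indicators, weights, higher_is_worse=None):
    """Composite vulnerability index from heterogeneous indicators.

    indicators: (n_regions, n_vars). Each variable is min-max rescaled
    to [0, 1]; variables where high values mean LOW vulnerability are
    inverted; the index is the weighted average per region.
    """
    x = np.asarray(indicators, float)
    w = np.asarray(weights, float)
    w = w / w.sum()                              # normalize the weights
    z = (x - x.min(axis=0)) / (x.max(axis=0) - x.min(axis=0))
    if higher_is_worse is not None:
        inv = ~np.asarray(higher_is_worse)       # columns to flip
        z[:, inv] = 1.0 - z[:, inv]
    return z @ w
```

    Computing the index per province and per year yields exactly the kind of spatio-temporal vulnerability surface the analysis above compares across regions and over 2003-2013.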

  5. Application and Validation of a GIS Model for Local Tsunami Vulnerability and Mortality Risk Analysis

    NASA Astrophysics Data System (ADS)

    Harbitz, C. B.; Frauenfelder, R.; Kaiser, G.; Glimsdal, S.; Sverdrup-thygeson, K.; Løvholt, F.; Gruenburg, L.; Mc Adoo, B. G.

    2015-12-01

    The 2011 Tōhoku tsunami caused a high number of fatalities and massive destruction. Data collected after the event allow for retrospective analyses. Since 2009, NGI has developed a generic GIS model for local analyses of tsunami vulnerability and mortality risk. The mortality risk convolves the hazard, exposure, and vulnerability. The hazard is represented by the maximum tsunami flow depth (with a corresponding likelihood), the exposure is described by the population density in time and space, while the vulnerability is expressed by the probability of being killed as a function of flow depth and building class. The analysis is further based on high-resolution DEMs. Normally, a certain tsunami scenario with a corresponding return period is applied for vulnerability and mortality risk analysis. Hence, the model was first employed for a tsunami forecast scenario affecting Bridgetown, Barbados, and further developed in a forecast study for the city of Batangas in the Philippines. Subsequently, the model was tested by hindcasting the 2009 South Pacific tsunami in American Samoa. This hindcast was based on post-tsunami information. The GIS model was adapted for optimal use of the available data and successfully estimated the degree of mortality. For further validation and development, the model was recently applied in the RAPSODI project for hindcasting the 2011 Tōhoku tsunami in Sendai and Ishinomaki. With reasonable choices of building vulnerability, the estimated expected number of fatalities agrees well with the reported death toll. The results of the mortality hindcast for the 2011 Tōhoku tsunami substantiate that the GIS model can help to identify high tsunami mortality risk areas, as well as identify the main risk drivers. The research leading to these results has received funding from CONCERT-Japan Joint Call on Efficient Energy Storage and Distribution/Resilience against Disasters (http://www.concertjapan.eu; project RAPSODI - Risk Assessment and design of
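
    The convolution of hazard, exposure, and vulnerability described above amounts, per grid cell, to the population present times the probability of death at the local flow depth. A minimal sketch with a placeholder fragility function (the model's real depth-mortality curves are building-class specific and empirically calibrated):

```python
import numpy as np

def expected_fatalities(flow_depth, population, fragility):
    """Expected tsunami fatalities over a grid (illustrative model).

    flow_depth: (n,) maximum flow depth per cell (m);
    population: (n,) people present per cell at the scenario time;
    fragility:  callable depth -> P(death | depth), standing in for the
                depth- and building-class-dependent mortality curve.
    """
    d = np.asarray(flow_depth, float)
    p = np.asarray(population, float)
    return float(np.sum(p * fragility(d)))
```

    A placeholder such as `lambda d: 1.0 / (1.0 + np.exp(-(d - 2.0)))` could stand in for a logistic mortality curve; in practice the curve, the time-varying population field, and the scenario likelihood all come from the data sources listed above.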

  6. A Comparative Analysis of Disaster Risk, Vulnerability and Resilience Composite Indicators

    PubMed Central

    Beccari, Benjamin

    2016-01-01

    related to the social environment, 25% to the disaster environment, 20% to the economic environment, 13% to the built environment, 6% to the natural environment and 3% were other indices. However, variables specifically measuring action to mitigate or prepare for disasters comprised only 12%, on average, of the total number of variables in each index. Only 19% of methodologies employed any sensitivity or uncertainty analysis, and in only a single case was this comprehensive. Discussion: A number of potential limitations of the present state of practice and how these might impact on decision makers are discussed. In particular, the limited deployment of sensitivity and uncertainty analysis and the low use of direct measures of disaster risk, vulnerability and resilience could significantly limit the quality and reliability of existing methodologies. Recommendations for improvements to indicator development and use are made, as well as suggested future research directions to enhance the theoretical and empirical knowledge base for composite indicator development. PMID:27066298

  7. Seismic analysis of a reinforced concrete containment vessel model

    SciTech Connect

    RANDY,JAMES J.; CHERRY,JEFFERY L.; RASHID,YUSEF R.; CHOKSHI,NILESH

    2000-02-03

    Pre- and post-test analytical predictions of the dynamic behavior of a 1:10 scale model Reinforced Concrete Containment Vessel are presented. This model, designed and constructed by the Nuclear Power Engineering Corp., was subjected to seismic simulation tests using the high-performance shaking table at the Tadotsu Engineering Laboratory in Japan. A group of tests representing design-level and beyond-design-level ground motions was first conducted to verify design safety margins. These were followed by a series of tests in which progressively larger base motions were applied until structural failure was induced. The analysis was performed by ANATECH Corp. and Sandia National Laboratories for the US Nuclear Regulatory Commission, employing state-of-the-art finite-element software specifically developed for concrete structures. Three-dimensional time-history analyses were performed, first as pre-test blind predictions to evaluate the general capabilities of the analytical methods, and second as post-test validation of the methods and interpretation of the test results. The input data consisted of acceleration time histories for the horizontal, vertical and rotational (rocking) components, as measured by accelerometers mounted on the structure's basemat. The response data consisted of acceleration and displacement records for various points on the structure, as well as time-history records of strain gages mounted on the reinforcement. This paper reports on work in progress and presents pre-test predictions and post-test comparisons to measured data for tests simulating maximum design basis and extreme design basis earthquakes. The pre-test analyses predict the failure earthquake of the test structure to have an energy level in the range of four to five times the energy level of the safe shutdown earthquake. The post-test calculations completed so far show good agreement with measured data.

  8. Geo-ethical dimension of community's safety: rural and urban population vulnerability analysis methodology

    NASA Astrophysics Data System (ADS)

    Kostyuchenko, Yuriy; Movchan, Dmytro; Kopachevsky, Ivan; Yuschenko, Maxim

    2016-04-01

    The modern world is based on relations more than on causalities, so communicative, socio-economic, and socio-cultural issues are important for understanding the nature of risks and for making correct, ethical decisions. Today, most risk analysts recognize the new nature of modern risks. We face coherent, or systemic, risks, whose realization leads to domino effects and unexpected growth of losses and fatalities. This type of risk originates in the complicated nature of the heterogeneous environment, the close interconnection of engineering networks, and the changing structure of society. A heterogeneous multi-agent environment generates systemic risks, whose analysis requires sophisticated tools applied to multi-source data. A formal basis for the analysis of this type of risk has been developed during the last 5-7 years, but issues of social fairness, ethics, and education require further development. One aspect of the analysis of social issues in risk management is studied in this paper. A formal algorithm for quantitative analysis of multi-source data is proposed. As demonstrated, using the proposed methodological basis and algorithm, it is possible to obtain a regularized spatial-temporal distribution of the investigated parameters over the whole observation period with rectified reliability and controlled uncertainty. The results of the disaster data analysis demonstrate that about half of direct disaster damage may be caused by social factors: education, experience and social behaviour. Using the data presented, it is also possible to estimate quantitative parameters of the loss distributions: the relation between education, age, experience, and losses, as well as vulnerability (in terms of probable damage) with respect to financial status at the current social density. It is demonstrated that on a wide scale, education determines risk perception and thus the vulnerability of societies. But at the local level there are important heterogeneities: land-use and urbanization structure essentially influence vulnerability. The way to

  9. Preliminary Analysis of Saudi National Seismic Network Recording of the November 1999 Dead Sea Explosions

    SciTech Connect

    Rodgers, A.

    1999-12-01

Two large chemical explosions were detonated in the Dead Sea on November 10 and 11, 1999 for the purpose of calibrating seismic travel times to improve regional network locations. These explosions were large enough to be observed with good signal-to-noise ratios by seismic stations in northwestern Saudi Arabia (distances of about 500 km). In this report, we present a preliminary analysis of the recordings from these shots.

  10. Fast seismic velocity analysis using parsimonious Kirchhoff depth migration

    NASA Astrophysics Data System (ADS)

    Fei, Weihong

Migration-based velocity analysis is the most efficient and accurate velocity inversion technique. It generally involves time-consuming prestack depth migration and picking of the depth residuals in common-image gathers (CIGs) in each iteration. Two modifications are proposed to minimize the prestack depth migration time and the picking work in velocity analysis: one approach inverts the velocity model in layer-stripping style; the other is based on a grid parametrization of the velocity model. Both approaches are based on the idea of parsimonious depth migration, the fastest depth migration currently available, and both have four basic steps: (1) pick the primary, most consistent reflection events from one reference seismic section or volume; (2) depending on whether the reference data are 2-D poststack, 2-D common-offset, 3-D poststack, or 3-D common-offset, use the corresponding parsimonious depth migration to migrate all the picked time samples to their spatial locations and to give their orientations; (3) ray-trace to define the CRP gathers for each reflection point; (4) update the velocity. For the layer-stripping approach, a small number (2-3) of iterations converges to a 2-D model of layer shape and interval velocity. The computation time of this layer-stripping approach is of the same order as that of the standard (1-D) rms velocity scan method, and is much faster than current iterative prestack depth migration velocity analysis methods for typical field data. For the grid-based approach, it is not necessary to define continuous reflectors, and the time at any offset (not only zero offset) can be used as the reference time for a reflection. Truncations and multi-valued layers, which require much effort in the layer-stripping approach, are handled naturally and implicitly in the grid-based approach. Two important features of the proposed algorithms are: the traveltime picking is limited to only a stacked or common
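
The layer-stripping idea of solving for one layer's interval velocity at a time, from shallow to deep, can be illustrated with the classical Dix inversion; this is a simple stand-in for the paper's migration-driven update, and the two-layer model below is hypothetical:

```python
import math

def dix_interval_velocities(t0, vrms):
    """Layer stripping with the classical Dix equation: peel interval
    velocities from rms velocities one layer at a time, top down. This is
    a simple stand-in for the paper's migration-driven velocity update."""
    vint = []
    for i in range(len(t0)):
        if i == 0:
            vint.append(vrms[0])
        else:
            num = vrms[i] ** 2 * t0[i] - vrms[i - 1] ** 2 * t0[i - 1]
            vint.append(math.sqrt(num / (t0[i] - t0[i - 1])))
    return vint

# Two flat layers: 1 km at 2000 m/s over 1 km at 3000 m/s.
t1 = 2 * 1000 / 2000.0               # two-way time to first reflector (s)
t2 = t1 + 2 * 1000 / 3000.0          # two-way time to second reflector (s)
vr2 = math.sqrt((2000.0**2 * t1 + 3000.0**2 * (t2 - t1)) / t2)  # rms velocity
v = dix_interval_velocities([t1, t2], [2000.0, vr2])
print([round(x) for x in v])   # → [2000, 3000]
```

The deeper interval velocity is recovered exactly because each layer is stripped using only quantities already solved for above it, mirroring the top-down order of the layer-stripping scheme.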

  11. What meta-analysis can tell us about vulnerability of marine biodiversity to ocean acidification?

    NASA Astrophysics Data System (ADS)

    Dupont, S.; Dorey, N.; Thorndyke, M.

    2010-09-01

Ocean acidification has been proposed as a major threat to marine biodiversity. Hendriks et al. [Hendriks, I.E., Duarte, C.M., Alvarez, M., 2010. Vulnerability of marine biodiversity to ocean acidification: a meta-analysis. Estuarine, Coastal and Shelf Science, doi:10.1016/j.ecss.2009.11.022.] proposed an alternative view and suggested, based on a meta-analysis, that marine biota may be far more resistant to ocean acidification than hitherto believed. However, such a meta-analytical approach can mask more subtle features, for example differing sensitivities during the life-cycle of an organism. Using a similar metric on an echinoderm database, we show that key bottlenecks in the life-cycle (e.g. larvae being more vulnerable than adults), which are responsible for driving the whole-species response, may be hidden in a global meta-analysis. Our data illustrate that any ecological meta-analysis should be hypothesis driven, taking into account the complexity of biological systems, including all life-cycle stages and key biological processes. The available data allow us to conclude that near-future ocean acidification can, and likely will, have a dramatic negative impact on some marine species, including echinoderms, with likely consequences at the ecosystem level.

  12. Seismic soil structure interaction analysis for asymmetrical buildings supported on piled raft for the 2015 Nepal earthquake

    NASA Astrophysics Data System (ADS)

    Badry, Pallavi; Satyam, Neelima

    2017-01-01

Seismic damage surveys and analyses of the modes of failure of structures during past earthquakes have shown that asymmetrical buildings are the most vulnerable throughout the course of failure (Wegner et al., 2009). Since asymmetrical buildings fail disproportionately during shaking events, their analysis demands particular accuracy. Apart from superstructure geometry, soil behavior during earthquake shaking plays a pivotal role in building collapse (Chopra, 2012). A fixed-base analysis, in which the soil is considered infinitely rigid, cannot simulate the actual wave propagation during earthquakes and the wave-transfer mechanism in the superstructure (Wolf, 1985). This is well captured by soil-structure interaction (SSI) analysis, where ground movement and structural movement are treated with equal rigor. In the present study, an object-oriented program has been developed in C++ to model the SSI system using the finite element method. Seismic soil-structure interaction analyses have been carried out for T-, L- and C-shaped buildings supported on piled rafts subjected to the 25 April 2015 Nepal earthquake (M = 7.8). The soil properties have been taken from appropriate soil data for the Kathmandu valley region. The effect of building asymmetry on the superstructure response is compared with the authors' earlier work. It is observed that the shape or geometry of the superstructure governs its response under the same earthquake load.

  13. E-ELT seismic devices analysis and prototype testing

    NASA Astrophysics Data System (ADS)

    Gómez, Celia; Avilés, Alexander; Bilbao, Armando; Siepe, Daniel; Nawrotzki, Peter

    2012-09-01

During the E-ELT Dome and Foundations FEED Study, IDOM developed a Base Control System for the protection of the E-ELT Main Structure against high-level earthquakes. The proposed design aimed to provide effective isolation during heavy seismic events, while in normal observation conditions presenting a high stiffness to avoid interference with the pointing accuracy of the telescope. In a subsequent phase, a representative prototype was envisaged by IDOM, in close collaboration with GERB, to evaluate the performance of this system, correlate the results from prototype testing with the behaviour predicted by a calculation model, and finally validate the design conceived during the FEED Study. The assessment of the prototype test results focused on checking the level of compliance with the demanded requirements: 1) the Base Control System isolates the upper structure from the ground in case of high-magnitude seismic events; 2) in operational conditions, the system -by means of Preloaded Devices (PLDs)- provides a stiff interface with the ground; 3) regarding the performance of the PLDs, the finite element model simulates accurately the non-linear behaviour, particularly the zero crossing when the direction of the excitation changes; 4) there is no degradation of the stiffness properties of the seismic devices after being subjected to a heavy seismic event. The prototype was manufactured by GERB, and pseudo-integrated tests were performed on a shaking table at the premises of the Institute of Earthquake Engineering (IZIIS) in Skopje, Macedonia.

  14. Optimization Strategies for the Vulnerability Analysis of the Electric Power Grid

    SciTech Connect

    Pinar, A.; Meza, J.; Donde, V.; Lesieutre, B.

    2007-11-13

Identifying small groups of lines, whose removal would cause a severe blackout, is critical for the secure operation of the electric power grid. We show how power grid vulnerability analysis can be studied as a mixed integer nonlinear programming (MINLP) problem. Our analysis reveals a special structure in the formulation that can be exploited to avoid nonlinearity and approximate the original problem as a pure combinatorial problem. The key new observation behind our analysis is the correspondence between the Jacobian matrix (a representation of the feasibility boundary of the equations that describe the flow of power in the network) and the Laplacian matrix in spectral graph theory (a representation of the graph of the power grid). The reduced combinatorial problem is known as the network inhibition problem, for which we present a mixed integer linear programming formulation. Our experiments on benchmark power grids show that the reduced combinatorial model provides an accurate approximation, enabling vulnerability analyses of real-sized problems with more than 10,000 power lines.
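
The Laplacian correspondence at the heart of the reduction can be made concrete on a toy grid. The sketch below is a brute-force surrogate, not the paper's MILP formulation: it enumerates line subsets and ranks them by the algebraic connectivity of the remaining Laplacian, which drops to zero exactly when the removal splits the grid:

```python
from itertools import combinations
import numpy as np

def laplacian(n, edges):
    """Graph Laplacian of an n-bus grid with unit-weight lines."""
    L = np.zeros((n, n))
    for i, j in edges:
        L[i, i] += 1.0; L[j, j] += 1.0
        L[i, j] -= 1.0; L[j, i] -= 1.0
    return L

def weakest_lines(n, edges, k):
    """Brute-force surrogate for the network inhibition idea: find the k
    lines whose removal minimizes the algebraic connectivity (the second-
    smallest Laplacian eigenvalue; zero means the grid is split). The
    paper's MILP solves this at scale; enumeration only works for toys."""
    best = None
    for cut in combinations(edges, k):
        rest = [e for e in edges if e not in cut]
        lam2 = np.linalg.eigvalsh(laplacian(n, rest))[1]
        if best is None or lam2 < best[1]:
            best = (cut, lam2)
    return best

# Toy 6-bus grid: two triangles joined by a single tie line (2, 3).
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
cut, lam2 = weakest_lines(6, edges, 1)
print(cut)   # the tie line is the critical cut: removing it splits the grid
```

On this toy network the search correctly singles out the tie line between the two triangles, the one-line cut that causes an island.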

  15. Optimization strategies for the vulnerability analysis of the electric power grid.

    SciTech Connect

    Meza, Juan C.; Pinar, Ali; Lesieutre, Bernard; Donde, Vaibhav

    2009-03-01

Identifying small groups of lines, whose removal would cause a severe blackout, is critical for the secure operation of the electric power grid. We show how power grid vulnerability analysis can be studied as a mixed integer nonlinear programming (MINLP) problem. Our analysis reveals a special structure in the formulation that can be exploited to avoid nonlinearity and approximate the original problem as a pure combinatorial problem. The key new observation behind our analysis is the correspondence between the Jacobian matrix (a representation of the feasibility boundary of the equations that describe the flow of power in the network) and the Laplacian matrix in spectral graph theory (a representation of the graph of the power grid). The reduced combinatorial problem is known as the network inhibition problem, for which we present a mixed integer linear programming formulation. Our experiments on benchmark power grids show that the reduced combinatorial model provides an accurate approximation, enabling vulnerability analyses of real-sized problems with more than 10,000 power lines.

  16. HANFORD DOUBLE SHELL TANK THERMAL AND SEISMIC PROJECT SUMMARY OF COMBINED THERMAL AND OPERATING LOADS WITH SEISMIC ANALYSIS

    SciTech Connect

    MACKEY TC; DEIBLER JE; RINKER MW; JOHNSON KI; ABATT FG; KARRI NK; PILLI SP; STOOPS KL

    2009-01-15

This report summarizes the results of the Double-Shell Tank Thermal and Operating Loads Analysis (TaLA) combined with the Seismic Analysis. This combined analysis provides a thorough, defensible, and documented analysis that will become a part of the overall analysis of record for the Hanford double-shell tanks (DSTs). The bases of the analytical work presented herein are two ANSYS® finite element models that were developed to represent a bounding-case tank. The TaLA model includes the effects of temperature on material properties, creep, concrete cracking, and various waste and annulus pressure-loading conditions. The seismic model considers the interaction of the tanks with the surrounding soil including a range of soil properties, and the effects of the waste contents during a seismic event. The structural evaluations completed with the representative tank models do not reveal any structural deficiencies with the integrity of the DSTs. The analyses represent 60 years of use, which extends well beyond the current date. In addition, the temperature loads imposed on the model are significantly more severe than any service to date or proposed for the future. Bounding material properties were also selected to provide the most severe combinations. While the focus of the analyses was a bounding-case tank, it was necessary during various evaluations to conduct tank-specific analyses. The primary tank buckling evaluation was carried out on a tank-specific basis because of the sensitivity to waste height, specific gravity, tank wall thickness, and primary tank vapor space vacuum limit. For this analysis, the occurrence of maximum tank vacuum was classified as a service level C, emergency load condition. The only area of potential concern in the analysis was with the buckling evaluation of the AP tank, which showed the current limit on demand of 12-inch water gauge vacuum to exceed the allowable of 10.4 inches. This determination was based on analysis at the

  17. Singular spectral analysis based filtering of seismic signal using new Weighted Eigen Spectrogram

    NASA Astrophysics Data System (ADS)

    Rekapalli, Rajesh; Tiwari, R. K.

    2016-09-01

Filtering of non-stationary noisy seismic signals using fixed basis functions (sines and cosines) generates artifacts in the final output and thereby leads to wrong interpretation. To circumvent this problem, we propose a new Weighted Eigen Spectrogram (WES) based robust time-domain Singular Spectrum Analysis (SSA) frequency filtering algorithm. The new WES is used to simplify the eigentriplet grouping procedure in SSA. We tested the robustness of the algorithm on synthetic seismic data mixed with field-simulated noise, and then applied the method to filter high-resolution seismic reflection field data. The band-pass filtering of noisy seismic records suggests that the underlying algorithm is efficient at improving the signal-to-noise ratio (S/N) and is also user-friendly.
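
The SSA machinery the abstract builds on (trajectory-matrix embedding, eigendecomposition, grouping, diagonal averaging) can be sketched in a few lines. The grouping below is by singular-value energy, a simple stand-in for the paper's Weighted Eigen Spectrogram criterion, and the signal is synthetic:

```python
import numpy as np

def ssa_filter(x, window, rank):
    """Basic SSA: embed the signal into a Hankel (trajectory) matrix, keep
    the leading eigentriplets, and reconstruct by anti-diagonal averaging.
    Grouping by singular-value energy is a simple stand-in for the paper's
    Weighted Eigen Spectrogram criterion."""
    n = len(x)
    k = n - window + 1
    X = np.column_stack([x[i:i + window] for i in range(k)])  # window x k
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    Xr = (U[:, :rank] * s[:rank]) @ Vt[:rank]                 # low-rank part
    out = np.zeros(n); cnt = np.zeros(n)
    for i in range(window):                                   # anti-diagonal
        for j in range(k):                                    # averaging
            out[i + j] += Xr[i, j]; cnt[i + j] += 1
    return out / cnt

rng = np.random.default_rng(0)
t = np.arange(400) / 100.0
clean = np.sin(2 * np.pi * 2.0 * t)               # 2 Hz "reflection" signal
noisy = clean + 0.5 * rng.standard_normal(t.size)
rec = ssa_filter(noisy, window=60, rank=2)
print(np.std(noisy - clean) > np.std(rec - clean))  # noise is reduced
```

A pure sinusoid occupies exactly two eigentriplets, so keeping rank 2 separates it from broadband noise without any fixed sine/cosine basis, which is the property the abstract exploits.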

  18. Genetic analysis reveals demographic fragmentation of grizzly bears yielding vulnerably small populations.

    PubMed

    Proctor, Michael F; McLellan, Bruce N; Strobeck, Curtis; Barclay, Robert M R

    2005-11-22

    Ecosystem conservation requires the presence of native carnivores, yet in North America, the distributions of many larger carnivores have contracted. Large carnivores live at low densities and require large areas to thrive at the population level. Therefore, if human-dominated landscapes fragment remaining carnivore populations, small and demographically vulnerable populations may result. Grizzly bear range contraction in the conterminous USA has left four fragmented populations, three of which remain along the Canada-USA border. A tenet of grizzly bear conservation is that the viability of these populations requires demographic linkage (i.e. inter-population movement of both sexes) to Canadian bears. Using individual-based genetic analysis, our results suggest this demographic connection has been severed across their entire range in southern Canada by a highway and associated settlements, limiting female and reducing male movement. Two resulting populations are vulnerably small (< or =100 animals) and one of these is completely isolated. Our results suggest that these trans-border bear populations may be more threatened than previously thought and that conservation efforts must expand to include international connectivity management. They also demonstrate the ability of genetic analysis to detect gender-specific demographic population fragmentation in recently disturbed systems, a traditionally intractable yet increasingly important ecological measurement worldwide.

  19. Seismic Hazard characterization study using an earthquake source with Probabilistic Seismic Hazard Analysis (PSHA) method in the Northern of Sumatra

    NASA Astrophysics Data System (ADS)

    Yahya, A.; Palupi, M. I. R.; Suharsono

    2016-11-01

The Sumatra region is one of the earthquake-prone areas of Indonesia because it lies on an active tectonic zone. In 2004, an earthquake with a moment magnitude of 9.2 occurred off the coast, about 160 km west of Nanggroe Aceh Darussalam, triggering a tsunami. This event caused many casualties and heavy material losses, especially in the provinces of Nanggroe Aceh Darussalam and North Sumatra. To minimize the impact of future earthquake disasters, a fundamental assessment of the earthquake hazard in the region is needed. The research stages include a literature study, collection and processing of seismic data, seismic source characterization, and analysis of the earthquake hazard by the probabilistic method (PSHA) using an earthquake catalog from 1907 through 2014. The earthquake hazard is represented by the Peak Ground Acceleration (PGA) and the Spectral Acceleration (SA) at periods of 0.2 and 1 second on bedrock, presented as maps with a return period of 2475 years and as earthquake hazard curves for the cities of Medan and Banda Aceh.
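
The core PSHA computation is an integral over sources of the rate of exceeding a ground-motion level. A toy version with two point sources and a made-up lognormal ground-motion model (the coefficients are illustrative, not a published GMPE) looks like this:

```python
import math

def exceedance_rate(a_g, sources):
    """Toy PSHA hazard integral: annual rate that PGA exceeds a_g, summed
    over point sources (activity rate nu per year, magnitude m, distance r
    in km) using a made-up lognormal ground-motion model. The coefficients
    below are illustrative only, not a published GMPE."""
    total = 0.0
    for nu, m, r in sources:
        ln_median = -3.5 + 0.9 * m - 1.2 * math.log(r)  # hypothetical GMPE
        sigma = 0.6                                     # aleatory scatter
        z = (math.log(a_g) - ln_median) / sigma
        p_exceed = 0.5 * math.erfc(z / math.sqrt(2))    # P(PGA > a_g | m, r)
        total += nu * p_exceed
    return total

# Two hypothetical sources: a distant large fault and a nearer moderate one.
sources = [(0.05, 7.5, 60.0), (0.20, 6.0, 25.0)]
lam = exceedance_rate(0.2, sources)        # annual rate of PGA > 0.2 g
print(round(1.0 / lam, 1))                 # mean return period in years
```

Sweeping `a_g` and plotting 1/λ traces out the hazard curve; the PGA whose annual exceedance rate equals 1/2475 is the 2475-year map value the abstract refers to.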

  20. Modal seismic analysis of a nuclear power plant control panel and comparison with SAP 4

    NASA Technical Reports Server (NTRS)

    Pamidi, M. R.; Pamidi, P. R.

    1976-01-01

The application of NASTRAN to seismic analysis is demonstrated by the example of a nuclear power plant control panel. A modal analysis of a three-dimensional model of the panel, consisting of beam and quadrilateral membrane elements, is performed. Using the results of this analysis and a typical earthquake response spectrum, the seismic response of the structure is obtained. The ALTERs required in the program to compute the maximum modal responses as well as the resultant response are given. The results are compared with those obtained using the SAP IV computer program.
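
The "resultant response" in such a response-spectrum analysis is conventionally obtained by combining per-mode peaks; the square-root-of-sum-of-squares (SRSS) rule is the standard choice, shown here with hypothetical modal values (the abstract does not give the panel's numbers):

```python
import math

def srss_response(modal_peaks):
    """Square-root-of-sum-of-squares (SRSS) modal combination: a standard
    way to estimate a peak resultant from per-mode peaks read off a design
    response spectrum. The values below are hypothetical, not the panel's."""
    return math.sqrt(sum(p * p for p in modal_peaks))

# Peak modal displacements (mm) of three modes at their spectral ordinates.
peaks = [12.0, 5.0, 3.0]
print(round(srss_response(peaks), 2))   # → 13.34
```

SRSS assumes well-separated modal frequencies; closely spaced modes would call for a complete-quadratic-combination rule instead.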

  1. An interdisciplinary perspective on social and physical determinants of seismic risk

    NASA Astrophysics Data System (ADS)

    Lin, K.-H. E.; Chang, Y.-C.; Liu, G.-Y.; Chan, C.-H.; Lin, T.-H.; Yeh, C.-H.

    2015-10-01

While disaster studies researchers usually view risk as a function of hazard, exposure, and vulnerability, few studies have systematically examined the relationships among the various physical and socioeconomic determinants underlying disasters, and fewer have done so through seismic risk analysis. In the context of the 1999 Chi-Chi earthquake in Taiwan, this study constructs three statistical models to test different determinants that affect disaster fatality at the village level, including seismic hazard, exposure of population and fragile buildings, and demographic and socioeconomic vulnerability. A Poisson regression model is used to estimate the impact of these factors on fatalities. The results indicate that although all of the determinants have an impact on seismic fatality, certain vulnerability indicators, such as gender ratio, the percentages of young and aged population, and income and its standard deviation, are the key determinants that aggravate seismic risk. These findings have strong social implications for policy interventions to mitigate such disasters.
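
Poisson regression of count outcomes on covariates, the model class used here for village-level fatalities, can be fitted with a few lines of Newton's method; the data below are synthetic, not the study's:

```python
import math

def poisson_fit(x, y, iters=25):
    """Minimal Poisson regression, log E[y] = b0 + b1*x, fitted by Newton's
    method -- the kind of model used to relate a village-level covariate to
    fatality counts. The data below are synthetic, not the study's."""
    b0, b1 = math.log(sum(y) / len(y)), 0.0   # start at the mean rate
    for _ in range(iters):
        mu = [math.exp(b0 + b1 * xi) for xi in x]
        g0 = sum(yi - mi for yi, mi in zip(y, mu))                # score
        g1 = sum((yi - mi) * xi for yi, mi, xi in zip(y, mu, x))
        h00 = sum(mu)                                  # Fisher information
        h01 = sum(mi * xi for mi, xi in zip(mu, x))
        h11 = sum(mi * xi * xi for mi, xi in zip(mu, x))
        det = h00 * h11 - h01 * h01
        b0 += (h11 * g0 - h01 * g1) / det              # Newton step
        b1 += (h00 * g1 - h01 * g0) / det
    return b0, b1

# Synthetic villages: fatality counts rising with a vulnerability index.
x = [0.0, 1.0, 2.0, 3.0, 4.0]
y = [1, 2, 4, 9, 16]
b0, b1 = poisson_fit(x, y)
print(round(b1, 2))   # fitted log-rate slope; exp(b1) is the rate ratio
```

The exponentiated slope `exp(b1)` is the multiplicative change in the expected fatality count per unit increase of the covariate, the quantity such studies report.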

  2. Vulnerability of Buildings for Tbilisi City

    NASA Astrophysics Data System (ADS)

    Vepkhvadze, Sopio; Arabidze, Vakhtang; Arevadze, Nika; Mukhadze, Temur; Jangveladze, Shota

    2013-04-01

Risk always exists where cities are built. Population growth in cities and urbanization in seismic-prone zones lead to infrastructure expansion. The goal of society is to construct earthquake-resistant infrastructure and minimize the expected losses. The work presented here was initiated under work package WP5 of the regional project EMME (Earthquake Model for the Middle East Region). Its primary scientific objective was to combine analyses of contemporary elements-at-risk inventories, seismicity, and vulnerability in order to assess seismic hazard and seismic risk for the capital of Georgia, Tbilisi. The first step of this work was the creation of inventory databases of elements at risk (buildings, population) in a GIS. The inventory databases are based on two approaches: field monitoring, and expert analysis of photos and aerial photos. During the monitoring it became clear that there are many variants of roof type, material, and functionality. A special GIS program was therefore prepared that allows these attributes to be entered manually into the databases and assigned to each building. Depending on the chosen attributes, the program automatically assigns a code to the building, and on the basis of these codes the taxonomy of the building is calculated automatically. The European building taxonomy classification proposed by Giovinazzi (2005) was used for these buildings, and a taxonomy classification was carried out. On the basis of empirical data collected for the Racha earthquake (Ms = 6.9) of 29 April 1991 and the Tbilisi earthquake (Ms = 4.5) of 25 April 2002, intensity-based vulnerability studies were completed and regional vulnerability factors were developed for these typologies.

  3. Discrimination between induced and natural seismicity by means of nonlinear analysis

    NASA Astrophysics Data System (ADS)

    Turuntaev, S. B.; Melchaeva, O. Yu.; Vorohobina, S. V.

    2012-04-01

Uch-Terek Rivers in Kyrgyzstan; (3) the seismicity in the region of the Geysers geothermal complex in California, US; (4) the seismicity in the region of the Bishkek geophysical test site, Kyrgyzstan, recorded before and after strong electromagnetic discharges. Nonlinear analysis of the seismicity data sets showed that technogenic action on the geophysical medium increases the regularity of the seismic regime. This appears as the formation of stable states characterized by a finite fractal dimension of the attractor and a reasonably small dimension of the embedding space. The presence of the stable states opens the possibility of forecasting the development of induced seismic activity. We also present the results of nonlinear analysis of the rate-and-state model, which allows us to describe the mechanics of the studied phenomenon. In this context, the model of motion in fault zones that obey the two-parameter friction law suggests that if the external action causes the critical stresses to decrease, e.g. due to the growth of pore pressure or heating of the fault zone, we should expect the deterministic component of the seismic process to increase.
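
The "finite fractal dimension of the attractor" that distinguishes a regular regime from a stochastic one is typically estimated with the Grassberger-Procaccia correlation sum. A minimal two-radius version on a delay-embedded periodic signal (illustrative, not the paper's data or exact procedure) is:

```python
import numpy as np

def correlation_dimension(x, dim, tau, r1, r2):
    """Grassberger-Procaccia estimate of the correlation (fractal)
    dimension from a delay embedding: the slope of log C(r) vs log r,
    evaluated here from just two radii. A low, finite value signals a
    deterministic, low-dimensional regime (illustrative implementation)."""
    n = len(x) - (dim - 1) * tau
    pts = np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])
    d = np.linalg.norm(pts[:, None] - pts[None, :], axis=-1)
    d = d[np.triu_indices(n, 1)]            # distinct pairs only
    c1 = np.mean(d < r1); c2 = np.mean(d < r2)
    return (np.log(c2) - np.log(c1)) / (np.log(r2) - np.log(r1))

t = np.linspace(0, 40 * np.pi, 2000)
x = np.sin(t)                       # periodic signal: the attractor is a loop
D = correlation_dimension(x, dim=2, tau=25, r1=0.1, r2=0.3)
print(round(D, 1))                  # close to 1: a low-dimensional attractor
```

A white-noise series embedded the same way would fill the embedding space and return a dimension near `dim` rather than near 1, which is the contrast used to argue that induced seismicity has become more regular.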

  4. Caucasus Seismic Information Network: Data and Analysis Final Report

    SciTech Connect

    Randolph Martin; Mary Krasovec; Spring Romer; Timothy O'Connor; Emanuel G. Bombolakis; Youshun Sun; Nafi Toksoz

    2007-02-22

The geology and tectonics of the Caucasus region (Armenia, Azerbaijan, and Georgia) are highly variable. Consequently, generating a structural model and characterizing seismic wave propagation in the region require data from local seismic networks. As of eight years ago, there was only one broadband digital station operating in the region – an IRIS station at Garni, Armenia – and few analog stations. The Caucasus Seismic Information Network (CauSIN) project is part of a multi-national effort to build a knowledge base of seismicity and tectonics in the region. During this project, three major tasks were completed: 1) collection of seismic data, both event catalogs and phase arrival-time picks; 2) development of a 3-D P-wave velocity model of the region obtained through crustal tomography; 3) advances in geological and tectonic models of the region. The first two tasks are interrelated. A large suite of historical and recent seismic data was collected for the Caucasus. These data were mainly analog prior to 2000; more recently, in Georgia and Azerbaijan, the data are digital. Based on the most reliable data from regional networks, a crustal model was developed using 3-D tomographic inversion. The results of the inversion are presented, and the supporting seismic data are reported. The third task was carried out on several fronts. Geologically, the goal of obtaining an integrated geological map of the Caucasus at a scale of 1:500,000 was initiated. The map for Georgia has been completed and serves as a guide for the final incorporation of the data from Armenia and Azerbaijan. The description of geological units across borders has been worked out, and formation boundaries across borders have been agreed upon. Currently, Armenia and Azerbaijan are working with scientists in Georgia to complete this task. The successful integration of the geologic data also required addressing and mapping active faults throughout the greater Caucasus. Each of the major

  5. Analysis of vulnerability factors that control nitrate occurrence in natural springs (Osona Region, NE Spain).

    PubMed

    Menció, Anna; Boy, Mercè; Mas-Pla, Josep

    2011-07-15

    Nitrate pollution is one of the main concerns of groundwater management in most of the world's agricultural areas. In the Osona region of NE Spain, high concentrations of nitrates have been reported in wells. This study uses the occurrence of this pollutant in natural springs as an indicator of the sub-surface dynamics of the water cycle and shows how groundwater quality is affected by crop fertilization, as an approach to determine the aquifer vulnerability. Nitrate concentration and other hydrochemical parameters based on a biannual database are reported for approximately 80 springs for the period 2004-2009. The background concentration of nitrate is first determined to distinguish polluted areas from natural nitrate occurrence. A statistical treatment using logistic regression and ANOVA is then performed to identify the significance of the effect of vulnerability factors such as the geological setting of the springs, land use in recharge areas, sampling periods, and chemical parameters like pH and EC, on groundwater nitrate pollution. The results of the analysis identify a threshold value of 7-8 mg NO(3)(-)/L for nitrate pollution in this area. Logistic regression and ANOVA results show that an increase in EC or a decrease in pH values is linked to the possibility of higher nitrate concentrations in springs. These analyses also show that nitrate pollution is more dependent on land use than the geological setting of springs or sampling periods. Indeed, the specific geological and soil features of the uppermost layers in their recharge areas do not contribute to the buffering of nitrate impacts on aquifers as measured in natural springs. Land use, and particularly fertilization practices, are major factors in groundwater vulnerability.
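
Logistic regression of a binary pollution status on a continuous covariate, as used in the study, can be sketched with a minimal gradient-ascent fit; the conductivity values and labels below are synthetic, not the Osona measurements:

```python
import math

def logistic_fit(x, y, lr=0.5, iters=5000):
    """Minimal logistic regression by gradient ascent: models the
    probability that a spring is nitrate-polluted as a function of one
    covariate (here electrical conductivity), echoing the study's use of
    logistic regression. Data are synthetic, not the Osona measurements."""
    b0 = b1 = 0.0
    n = len(x)
    for _ in range(iters):
        p = [1.0 / (1.0 + math.exp(-(b0 + b1 * xi))) for xi in x]
        b0 += lr * sum(yi - pi for yi, pi in zip(y, p)) / n
        b1 += lr * sum((yi - pi) * xi for yi, pi, xi in zip(y, p, x)) / n
    return b0, b1

# Springs: electrical conductivity (mS/cm) vs. polluted (1) or clean (0).
ec = [0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0]
polluted = [0, 0, 0, 1, 0, 1, 1, 1]
b0, b1 = logistic_fit(ec, polluted)
print(b1 > 0)   # True: higher EC raises the modeled odds of pollution
```

A positive fitted slope corresponds to the study's finding that an increase in EC is linked to higher odds of nitrate pollution; `exp(b1)` is the odds ratio per unit EC.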

  6. Sources of Error and the Statistical Formulation of MS:mb Seismic Event Screening Analysis

    NASA Astrophysics Data System (ADS)

    Anderson, D. N.; Patton, H. J.; Taylor, S. R.; Bonner, J. L.; Selby, N. D.

    2014-03-01

The Comprehensive Nuclear-Test-Ban Treaty (CTBT), a global ban on nuclear explosions, is currently in a ratification phase. Under the CTBT, an International Monitoring System (IMS) of seismic, hydroacoustic, infrasonic and radionuclide sensors is operational, and the data from the IMS are analysed by the International Data Centre (IDC). The IDC provides CTBT signatories basic seismic event parameters and a screening analysis indicating whether an event exhibits explosion characteristics (for example, shallow depth). An important component of the screening analysis is a statistical test of the null hypothesis H0: explosion characteristics, using empirical measurements of seismic energy (magnitudes). The established magnitude used for event size is the body-wave magnitude (denoted mb), computed from the initial segment of a seismic waveform. IDC screening analysis is applied to events with mb greater than 3.5. The Rayleigh wave magnitude (denoted MS) is a measure of later-arriving surface wave energy. Magnitudes are measurements of seismic energy that include adjustments (a physical correction model) for path and distance effects between event and station. Relative to mb, earthquakes generally have a larger MS magnitude than explosions. This article proposes a hypothesis test (screening analysis) using MS and mb that expressly accounts for physical correction model inadequacy in the standard error of the test statistic. With this hypothesis test formulation, the 2009 Democratic People's Republic of Korea announced nuclear weapon test fails to reject the null hypothesis H0: explosion characteristics.
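
The shape of such a test can be sketched as follows; the discriminant line, error components, and thresholds below are illustrative placeholders, not the IDC's or the paper's actual values:

```python
import math

def ms_mb_screen(ms, mb, se_model=0.2, se_meas=0.15, offset=1.25):
    """One-sided test of H0 'explosion characteristics' with the MS:mb
    discriminant: an event screens out as earthquake-like when MS is
    significantly larger than mb - offset. The standard error combines
    measurement scatter with correction-model inadequacy -- the paper's
    point -- but every numeric value here is illustrative, not the IDC's."""
    se = math.hypot(se_model, se_meas)        # total standard error
    z = (ms - (mb - offset)) / se
    p = 0.5 * math.erfc(z / math.sqrt(2))     # one-sided p-value under H0
    return z, p < 0.05                        # True -> reject H0

print(ms_mb_screen(4.8, 4.5))   # strong surface waves: earthquake-like
print(ms_mb_screen(3.6, 4.5))   # weak surface waves: H0 not rejected
```

Folding the model-inadequacy term into `se` widens the confidence band, which is why an event near the discriminant line (like the second call) fails to screen out even though its MS deficit looks suggestive.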

  7. Slope Stability Analysis In Seismic Areas Of The Northern Apennines (Italy)

    SciTech Connect

    Lo Presti, D.; Fontana, T.; Marchetti, D.

    2008-07-08

Several research works have been published on slope stability in northern Tuscany (central Italy), particularly in the seismic areas of Garfagnana and Lunigiana (Lucca and Massa-Carrara districts), aimed at analysing slope stability under static and dynamic conditions and mapping the landslide hazard. In addition, in situ and laboratory investigations are available for the study area, thanks to the activities undertaken by the Tuscany Seismic Survey. Based on this wealth of information, the co-seismic stability of a few idealized slope profiles has been analysed by means of the limit equilibrium method (LEM, pseudo-static) and Newmark sliding-block analysis (pseudo-dynamic). The analysis results gave indications about the most appropriate seismic coefficient to be used in pseudo-static analysis once an allowable permanent displacement has been established. These indications are commented on in the light of the Italian and European prescriptions for seismic stability analysis with the pseudo-static approach. The stability conditions obtained from these analyses could be used to define microzonation criteria for the study area.
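
The Newmark sliding-block method referred to here integrates ground acceleration in excess of the slope's yield acceleration to get a permanent displacement. A minimal one-way-sliding version with a hypothetical pulse-train excitation (not the Garfagnana records) is:

```python
import numpy as np

def newmark_displacement(acc, dt, ky):
    """Newmark sliding-block analysis: the block starts sliding whenever
    ground acceleration exceeds the yield acceleration ky (both in m/s^2),
    decelerates at ky once the pulse ends, and the accumulated slip is the
    permanent displacement. One-way (downslope-only) sliding is assumed."""
    v = 0.0   # relative sliding velocity
    d = 0.0   # accumulated permanent displacement
    for a in acc:
        if a > ky or v > 0.0:
            v += (a - ky) * dt
            if v < 0.0:
                v = 0.0          # block re-locks to the ground
            d += v * dt
    return d

# Hypothetical excitation: 0.5 s pulses of 3 m/s^2 every 2 s, ky = 1 m/s^2.
dt = 0.01
t = np.arange(0.0, 6.0, dt)
acc = np.where(t % 2.0 < 0.5, 3.0, 0.0)
d_perm = newmark_displacement(acc, dt, ky=1.0)
print(round(d_perm, 2))   # about 2.25 m of cumulative slip
```

Raising `ky` (a stronger slope) or lowering the pulse amplitude shrinks the displacement; comparing the result against an allowable permanent displacement is exactly the calibration step the abstract describes for choosing a pseudo-static seismic coefficient.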

  8. Magma intrusion near Volcan Tancitaro: Evidence from seismic analysis

    SciTech Connect

    Pinzon, Juan I.; Nunez-Cornu, Francisco J.; Rowe, Charlotte Anne

    2016-11-17

    Between May and June 2006, an earthquake swarm occurred near Volcan Tancítaro in Mexico, which was recorded by a temporary seismic deployment known as the MARS network. We located ~1000 events from this seismic swarm. Previous earthquake swarms in the area were reported in the years 1997, 1999 and 2000. We relocate and analyze the evolution and properties of the 2006 earthquake swarm, employing a waveform cross-correlation-based phase repicking technique. Hypocenters from 911 events were located and divided into eighteen families having a correlation coefficient at or above 0.75. 90% of the earthquakes provide at least sixteen phase picks. We used the single-event location code Hypo71 and the P-wave velocity model used by the Jalisco Seismic and Accelerometer Network to improve hypocenters based on the correlation-adjusted phase arrival times. We relocated 121 earthquakes, which show clearly two clusters, between 9–10 km and 3–4 km depth respectively. The average location error estimates are <1 km epicentrally, and <2 km in depth, for the largest event in each cluster. Depths of seismicity migrate upward from 16 to 3.5 km and exhibit a NE-SW trend. The swarm first migrated toward Paricutin Volcano but by mid-June began propagating back toward Volcán Tancítaro. In addition to its persistence, noteworthy aspects of this swarm include a quasi-exponential increase in the rate of activity within the first 15 days; a b-value of 1.47; a jug-shaped hypocenter distribution; a shoaling rate of ~5 km/month within the deeper cluster, and a composite focal mechanism solution indicating largely reverse faulting. As a result, these features of the swarm suggest a magmatic source elevating the crustal strain beneath Volcan Tancítaro.
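
The cross-correlation repicking step that underpins the relocation can be reduced to finding the lag that maximizes the correlation between a reference waveform and a new trace; the synthetic wavelet below is illustrative, not the MARS data:

```python
import numpy as np

def repick(ref, trace, dt):
    """Cross-correlation repicking: the lag maximizing the correlation
    between a reference waveform and a new trace gives the relative
    arrival-time adjustment used to tighten swarm hypocenters."""
    cc = np.correlate(trace, ref, mode="full")
    lag = int(np.argmax(cc)) - (len(ref) - 1)
    return lag * dt

# Synthetic 8 Hz wavelet and a copy arriving 7 samples (0.07 s) later.
dt = 0.01
t = np.arange(0.0, 2.0, dt)
ref = np.exp(-((t - 0.5) / 0.05) ** 2) * np.sin(2 * np.pi * 8.0 * (t - 0.5))
shifted = np.roll(ref, 7)
lag_s = repick(ref, shifted, dt)
print(lag_s)   # recovers the 0.07 s delay
```

Applying such sub-sample-consistent adjustments across a family of highly similar events (correlation coefficient ≥ 0.75, as in the abstract) is what sharpens the relative locations enough to reveal the two depth clusters.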

  9. Synergy of seismic, acoustic, and video signals in blast analysis

    SciTech Connect

    Anderson, D.P.; Stump, B.W.; Weigand, J.

    1997-09-01

The range of mining applications from hard rock quarrying to coal exposure to mineral recovery leads to a great variety of blasting practices. A common characteristic of many of the sources is that they are detonated at or near the earth's surface and thus can be recorded by camera or video. Although the primary interest is in the seismic waveforms that these blasts generate, the visual observations of the blasts provide important constraints that can be applied to the physical interpretation of the seismic source function. In particular, high speed images can provide information on detonation times of individual charges, the timing and amount of mass movement during the blasting process and, in some instances, evidence of wave propagation away from the source. All of these characteristics can be valuable in interpreting the equivalent seismic source function for a set of mine explosions and quantifying the relative importance of the different processes. This paper documents work done at the Los Alamos National Laboratory and Southern Methodist University to take standard Hi-8 video of mine blasts, recover digital images from them, and combine them with ground motion records for interpretation. The steps in the data acquisition, processing, display, and interpretation are outlined. The authors conclude that the combination of video with seismic and acoustic signals can be a powerful diagnostic tool for the study of blasting techniques and seismology. A low cost system for generating similar diagnostics using a consumer-grade video camera and direct-to-disk video hardware is proposed. Application is to verification of the Comprehensive Test Ban Treaty.

  10. Magma intrusion near Volcan Tancitaro: Evidence from seismic analysis

    DOE PAGES

    Pinzon, Juan I.; Nunez-Cornu, Francisco J.; Rowe, Charlotte Anne

    2016-11-17

    Between May and June 2006, an earthquake swarm occurred near Volcan Tancítaro in Mexico, which was recorded by a temporary seismic deployment known as the MARS network. We located ~1000 events from this seismic swarm. Previous earthquake swarms in the area were reported in the years 1997, 1999 and 2000. We relocate and analyze the evolution and properties of the 2006 earthquake swarm, employing a waveform cross-correlation-based phase repicking technique. Hypocenters from 911 events were located and divided into eighteen families having a correlation coefficient at or above 0.75. 90% of the earthquakes provide at least sixteen phase picks. We used the single-event location code Hypo71 and the P-wave velocity model used by the Jalisco Seismic and Accelerometer Network to improve hypocenters based on the correlation-adjusted phase arrival times. We relocated 121 earthquakes, which clearly show two clusters, between 9–10 km and 3–4 km depth respectively. The average location error estimates are <1 km epicentrally, and <2 km in depth, for the largest event in each cluster. Depths of seismicity migrate upward from 16 to 3.5 km and exhibit a NE-SW trend. The swarm first migrated toward Paricutin Volcano but by mid-June began propagating back toward Volcán Tancítaro. In addition to its persistence, noteworthy aspects of this swarm include a quasi-exponential increase in the rate of activity within the first 15 days; a b-value of 1.47; a jug-shaped hypocenter distribution; a shoaling rate of ~5 km/month within the deeper cluster; and a composite focal mechanism solution indicating largely reverse faulting. These features of the swarm suggest a magmatic source elevating the crustal strain beneath Volcan Tancítaro.
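The Gutenberg-Richter b-value quoted for the swarm (1.47) is conventionally estimated with Aki's maximum-likelihood formula; a minimal sketch on a synthetic catalogue (the completeness magnitude and sample are illustrative assumptions, not the Tancítaro data):

```python
import math
import random

def b_value_aki(magnitudes, m_c):
    """Aki (1965) maximum-likelihood b-value estimate.

    Uses events at or above the completeness magnitude m_c; the usual
    half-bin correction for binned magnitudes is omitted for brevity.
    """
    above = [m for m in magnitudes if m >= m_c]
    mean_m = sum(above) / len(above)
    return math.log10(math.e) / (mean_m - m_c)

# Synthetic catalogue drawn from a Gutenberg-Richter law with b = 1.0
# (illustrative, not the Tancitaro data):
random.seed(0)
mags = [2.0 - math.log10(random.random()) for _ in range(5000)]
print(round(b_value_aki(mags, 2.0), 2))  # close to 1.0
```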

  11. Princeton Plasma Physics Laboratory (PPPL) seismic hazard analysis

    SciTech Connect

    Savy, J.

    1989-10-01

    New design and evaluation guidelines for Department of Energy facilities subjected to natural phenomena hazards are being finalized. Although still in draft form at this time, the document describing those guidelines should be considered an update of previously available guidelines. The recommendations in the guidelines document mentioned above, simply referred to as the "guidelines" hereafter, are based on the best information available at the time of its development. In particular, the seismic hazard model for the Princeton site was based on a study performed in 1981 for Lawrence Livermore National Laboratory (LLNL), which relied heavily on the results of the NRC's Systematic Evaluation Program and was based on a methodology and data sets developed in 1977 and 1978. Considerable advances have been made in the last ten years in the domain of seismic hazard modeling. Thus, it is recommended to update the estimate of the seismic hazard at the DOE sites whenever possible. The major differences between previous estimates and the ones proposed in this study for PPPL are in the modeling of the strong ground motion at the site, and in the treatment of the total uncertainty in the estimates to include knowledge uncertainty, random uncertainty, and expert opinion diversity as well. 28 refs.

  12. Numerical simulation of bubble plumes and an analysis of their seismic attributes

    NASA Astrophysics Data System (ADS)

    Li, Canping; Gou, Limin; You, Jiachun

    2017-04-01

    To study the bubble plume's seismic response characteristics, the model of a plume water body has been built in this article using the bubble-contained medium acoustic velocity model and stochastic medium theory, based on an analysis of both the acoustic characteristics of a bubble-contained water body and the actual features of a plume. The finite difference method is used for forward modelling, and the single-shot seismic record exhibits the characteristics of a scattered wave field generated by a plume. A meaningful conclusion is obtained by extracting seismic attributes from the pre-stack shot gather record of a plume. The values of the amplitude-related seismic attributes increase greatly as the bubble content rises, while changes in bubble radius do not cause the seismic attributes to change, primarily because the bubble content has a strong impact on the plume's acoustic velocity whereas the bubble radius has a weak impact on it. The above conclusion provides a theoretical reference for identifying hydrate plumes using seismic methods and contributes to further study of hydrate decomposition and migration, as well as of the distribution of methane bubbles in seawater.
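The forward modelling described above can be illustrated with a minimal second-order finite-difference scheme for the 1-D acoustic wave equation; the grid, velocities and plume geometry below are illustrative assumptions, not the paper's 2-D stochastic model:

```python
# Second-order finite-difference scheme for the 1-D acoustic wave equation
# u_tt = c(x)^2 u_xx, with a low-velocity zone standing in for the
# bubble-laden plume. Grid sizes and velocities are illustrative.
nx, nt, dx, dt = 300, 600, 1.0, 4e-4      # CFL number c*dt/dx = 0.6
c = [1500.0] * nx                          # seawater velocity (m/s)
for i in range(140, 160):                  # plume: strongly reduced velocity
    c[i] = 300.0

prev = [0.0] * nx
cur = [0.0] * nx
cur[50] = 1.0                              # impulsive source at grid point 50
for _ in range(nt):
    nxt = [0.0] * nx                       # fixed (zero) boundaries
    for i in range(1, nx - 1):
        r = (c[i] * dt / dx) ** 2
        nxt[i] = 2 * cur[i] - prev[i] + r * (cur[i + 1] - 2 * cur[i] + cur[i - 1])
    prev, cur = cur, nxt
# 'cur' now holds the wavefield, including energy scattered back from the
# low-velocity zone; recording it at a fixed index gives a synthetic trace.
```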

  13. Spatiotemporal sequence of Himalayan debris flow from analysis of high-frequency seismic noise

    NASA Astrophysics Data System (ADS)

    Burtin, A.; Bollinger, L.; Cattin, R.; Vergne, J.; Nábělek, J. L.

    2009-10-01

    During the 2003 summer monsoon, the Hi-CLIMB seismological stations deployed across the Himalayan Range detected bursts of high-frequency seismic noise that lasted several hours to days. On the basis of the cross correlation of seismic envelopes recorded at 11 stations, we show that the largest transient event on 15 August was located near a village partially destroyed on that day by a devastating debris flow. This consistency in both space and time suggests that high-frequency seismic noise analysis can be used to monitor debris flow generation as well as the evacuation of the sediment. A systematic study of one year of seismic noise, focusing on the detection of similar events, provides information on the spatial and temporal occurrence of mass movements at the front of the Himalayas. With a 50% probability of occurrence of a daily event, a total of 46 debris flows are seismically detected. Most of them were generated in regions of steep slopes, large gullies, and loose soils during the 2003 summer monsoon storms. These events are compared to local meteorological data to determine rainfall thresholds for slope failures, including the cumulative rainfall needed to bring the soil moisture content to failure capacity. The inferred thresholds are consistent with previous estimates deduced from soil studies as well as sediment supply investigations in the area. These results point out the potential of using seismic noise as a dedicated tool for monitoring the spatiotemporal occurrence of landslides and debris flows on a regional scale.
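Locating a noise burst from station envelopes, as in the cross correlation of seismic envelopes described above, reduces in its simplest form to finding the inter-station delay that maximizes the envelope cross-correlation. A sketch under simplified assumptions (synthetic signals, crude rectify-and-smooth envelope):

```python
import math
import random

def envelope(trace, win):
    """Crude seismic envelope: rectify, then trailing moving average."""
    rect = [abs(x) for x in trace]
    out = []
    for i in range(len(rect)):
        lo = max(0, i - win)
        out.append(sum(rect[lo:i + 1]) / (i + 1 - lo))
    return out

def best_lag(a, b, max_lag):
    """Lag (in samples) of b relative to a maximizing cross-correlation."""
    def xcorr(lag):
        lo, hi = max(-lag, 0), min(len(a), len(b) - lag)
        return sum(a[i] * b[i + lag] for i in range(lo, hi))
    return max(range(-max_lag, max_lag + 1), key=xcorr)

# Station B records the same burst 25 samples later than station A:
random.seed(1)
sig = [math.exp(-((i - 100) / 20.0) ** 2) for i in range(400)]
a = [s + 0.05 * random.gauss(0, 1) for s in sig]
b = [0.0] * 25 + a[:-25]
print(best_lag(envelope(a, 10), envelope(b, 10), 60))  # ~25 samples
```

With several station pairs, the set of inter-station delays constrains the source position, which is the basis of the envelope-correlation location used above.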

  14. Region-specific deterministic and probabilistic seismic hazard analysis of Kanpur city

    NASA Astrophysics Data System (ADS)

    Anbazhagan, P.; Bajaj, Ketan; Dutta, Nairwita; Moustafa, Sayed S. R.; Al-Arifi, Nassir S. N.

    2017-02-01

    A seismic hazard map of Kanpur city has been developed considering the region-specific seismotectonic parameters within a 500-km radius by deterministic and probabilistic approaches. The maximum probable earthquake magnitude (Mmax) for each seismic source has been estimated by considering the regional rupture characteristics method and has been compared with the maximum observed magnitude (Mmax(obs)), Mmax(obs) + 0.5, and the Kijko method. The best suitable ground motion prediction equations (GMPEs) were selected from 27 applicable GMPEs based on the 'efficacy test'. Furthermore, different weight factors were assigned to the different Mmax values and the selected GMPEs to calculate the final hazard value. Peak ground acceleration and spectral acceleration at 0.2 and 1 s were estimated and mapped for the worst-case scenario and for 2 and 10% probability of exceedance in 50 years. Peak ground acceleration (PGA) showed a variation from 0.04 to 0.36 g for DSHA, and from 0.02 to 0.32 g and 0.092 to 0.1525 g for 2 and 10% probability in 50 years, respectively. A normalised site-specific design spectrum has been developed considering three vulnerable sources based on deaggregation at the city center, and the results are compared with the recent 2011 Sikkim and 2015 Nepal earthquakes, and with the Indian seismic code IS 1893.
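The 2% and 10% probabilities of exceedance in 50 years quoted above correspond, under the usual Poisson assumption, to return periods of about 2475 and 475 years:

```python
import math

def return_period(p_exceed, t_years):
    """Return period T such that a Poisson process with rate 1/T gives
    exceedance probability p_exceed over t_years: p = 1 - exp(-t/T)."""
    return -t_years / math.log(1.0 - p_exceed)

print(round(return_period(0.10, 50)))  # 475
print(round(return_period(0.02, 50)))  # 2475
```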

  15. Seismic fragility analysis of typical pre-1990 bridges due to near- and far-field ground motions

    NASA Astrophysics Data System (ADS)

    Mosleh, Araliya; Razzaghi, Mehran S.; Jara, José; Varum, Humberto

    2016-03-01

    Bridge damage during past earthquakes has caused several physical and economic impacts to transportation systems. Many of the existing bridges in earthquake-prone areas are pre-1990 bridges and were designed with out-of-date regulation codes. The occurrence of strong motions in different parts of the world shows every year the vulnerability of these structures. Nonlinear dynamic time history analyses were conducted to assess the seismic vulnerability of typical pre-1990 bridges. A family of existing concrete bridges representative of the most common bridges in the highway system in Iran is studied. The seismic demand consists of a set of far-field and near-field strong motions to evaluate the likelihood of exceeding the seismic capacity of the mentioned bridges. The peak ground accelerations (PGAs) were scaled and applied incrementally to the 3D models to evaluate the seismic performance of the bridges. The superstructure was assumed to remain elastic and the nonlinear behavior in piers was modeled by assigning plastic hinges in columns. In this study the displacement ductility and the PGA are selected as the seismic performance indicator and intensity measure, respectively. The results show that pre-1990 bridges subjected to near-fault ground motions reach minor and moderate damage states.
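Fragility results of this kind are commonly summarized as lognormal fragility curves giving the probability of exceeding a damage state at a given PGA; a sketch with illustrative parameters (not the paper's fitted values):

```python
import math

def fragility(pga, median, beta):
    """Lognormal fragility curve: P(damage state exceeded | PGA).

    median (g) and beta (lognormal standard deviation) are illustrative
    assumptions, not the fitted values from the bridge study.
    """
    return 0.5 * (1.0 + math.erf(math.log(pga / median) / (beta * math.sqrt(2.0))))

print(round(fragility(0.3, 0.3, 0.5), 2))  # 0.5: 50% chance at the median
```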

  16. A non-extensive statistical physics analysis of the Hellenic subduction zone seismicity

    NASA Astrophysics Data System (ADS)

    Vallianatos, F.; Papadakis, G.; Michas, G.; Sammonds, P.

    2012-04-01

    The Hellenic subduction zone is the most seismically active region in Europe [Becker & Meier, 2010]. The spatial and temporal distribution of seismicity, as well as the magnitude distribution of earthquakes in the Hellenic subduction zone, has been studied using the concept of Non-Extensive Statistical Physics (NESP) [Tsallis, 1988; Tsallis, 2009]. Non-Extensive Statistical Physics, which is a generalization of Boltzmann-Gibbs statistical physics, seems to be a suitable framework for studying complex systems [Vallianatos, 2011]. Using this concept, Abe & Suzuki (2003; 2005) investigated the spatial and temporal properties of the seismicity in California and Japan, and recently Darooneh & Dadashinia (2008) did so in Iran. Furthermore, Telesca (2011) calculated the thermodynamic parameter q of the magnitude distribution of earthquakes in the southern California earthquake catalogue. Using the external seismic zones of 36 seismic sources of shallow earthquakes in the Aegean and the surrounding area [Papazachos, 1990], we formed a dataset of shallow earthquakes (focal depth ≤ 60 km) of the subduction zone, based on the instrumental data of the Geodynamic Institute of the National Observatory of Athens (http://www.gein.noa.gr/, period 1990-2011). The catalogue consists of 12800 seismic events which correspond to 15 polygons of the aforementioned external seismic zones. These polygons define the subduction zone, as they are associated with the compressional stress field which characterizes a subducting regime. For each event, moment magnitude was calculated from ML according to the suggestions of Papazachos et al. (1997). The cumulative distribution functions of the inter-event times and the inter-event distances, as well as the magnitude distribution for each seismic zone, have been estimated, presenting a variation in the q-triplet along the Hellenic subduction zone. The models used fit rather well to the observed
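The NESP framework above rests on the Tsallis q-exponential, which generalizes the exponential distribution; a minimal sketch with illustrative parameter values (not fitted to the Hellenic catalogue):

```python
import math

def q_exponential(x, q):
    """Tsallis q-exponential exp_q(x); reduces to exp(x) as q -> 1."""
    if abs(q - 1.0) < 1e-9:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0.0 else 0.0

# Survival function of inter-event times tau: P(>tau) = exp_q(-tau/tau0).
# tau0 and q below are illustrative, not values fitted to the catalogue.
tau0, q = 100.0, 1.3
print(q_exponential(-50.0 / tau0, q))   # heavier tail than exp(-0.5) for q > 1
```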

  17. Seismic isolation of an electron microscope

    SciTech Connect

    Godden, W.G.; Aslam, M.; Scalise, D.T.

    1980-01-01

    A unique two-stage dynamic-isolation problem is presented by the conflicting design requirements for the foundations of an electron microscope in a seismic region. Under normal operational conditions the microscope must be isolated from ambient ground noise; this creates a system extremely vulnerable to seismic ground motions. Under earthquake loading the internal equipment forces must be limited to prevent damage or collapse. An analysis of the proposed design solution is presented. This study was motivated by the 1.5 MeV High Voltage Electron Microscope (HVEM) to be installed at the Lawrence Berkeley Laboratory (LBL) located near the Hayward Fault in California.

  18. Preliminary Analysis of Remote Triggered Seismicity in Northern Baja California Generated by the 2011, Tohoku-Oki, Japan Earthquake

    NASA Astrophysics Data System (ADS)

    Wong-Ortega, V.; Castro, R. R.; Gonzalez-Huizar, H.; Velasco, A. A.

    2013-05-01

    We analyze possible variations of seismicity in northern Baja California due to the passage of seismic waves from the 2011, M9.0, Tohoku-Oki, Japan earthquake. The northwestern area of Baja California is characterized by a mountain range composed of crystalline rocks. The Peninsular Ranges of Baja California exhibit high microseismic activity and moderate-size earthquakes. In the eastern region of Baja California, shearing between the Pacific and North American plates takes place, and the Imperial and Cerro Prieto faults generate most of the seismicity. The seismicity in these regions is monitored by the seismic network RESNOM, operated by the Centro de Investigación Científica y de Educación Superior de Ensenada (CICESE). This network consists of 13 three-component seismic stations. We use the seismic catalog of RESNOM to search for changes in local seismic rates after the passage of the surface waves generated by the Tohoku-Oki earthquake. When we compare one month of seismicity before and after the M9.0 earthquake, the preliminary analysis shows an absence of triggered seismicity in the northern Peninsular Ranges and an increase of seismicity south of the Mexicali valley, where the Imperial fault jumps southwest and the Cerro Prieto fault continues.

  19. Tracking Socioeconomic Vulnerability Using Network Analysis: Insights from an Avian Influenza Outbreak in an Ostrich Production Network

    PubMed Central

    Moore, Christine; Cumming, Graeme S.; Slingsby, Jasper; Grewar, John

    2014-01-01

    Background The focus of management in many complex systems is shifting towards facilitation, adaptation, building resilience, and reducing vulnerability. Resilience management requires the development and application of general heuristics and methods for tracking changes in both resilience and vulnerability. We explored the emergence of vulnerability in the South African domestic ostrich industry, an animal production system which typically involves 3–4 movements of each bird during its lifetime. This system has experienced several disease outbreaks, and the aim of this study was to investigate whether these movements have contributed to the vulnerability of the system to large disease outbreaks. Methodology/Principal Findings The ostrich production system requires numerous movements of birds between different farm types associated with growth (i.e., hatchery to juvenile rearing farm to adult rearing farm). We used 5 years of movement records between 2005 and 2011, prior to an outbreak of Highly Pathogenic Avian Influenza (H5N2). These data were analyzed using a network analysis in which farms were represented as nodes and movements of birds as links. We tested the hypothesis that increasing economic efficiency in the domestic ostrich industry in South Africa made the system more vulnerable to an outbreak of Highly Pathogenic Avian Influenza. Our results indicated that as time progressed, the network became increasingly vulnerable to pathogen outbreaks. The farms that became infected during the outbreak displayed network qualities, such as significantly higher connectivity and centrality, which predisposed them to be more vulnerable to disease outbreak. Conclusions/Significance Taken in the context of previous research, our results provide strong support for the application of network analysis to track vulnerability, while also providing useful practical implications for system monitoring and management. PMID:24498004
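The network construction described above (farms as nodes, bird movements as links) and a degree-based connectivity ranking can be sketched as follows; the movement records are toy data, not the South African registry:

```python
from collections import defaultdict

# Toy movement records (origin farm, destination farm) standing in for the
# registry data; each record is one movement of birds between farm types.
movements = [("hatchery1", "juvenile1"), ("hatchery1", "juvenile2"),
             ("juvenile1", "adult1"), ("juvenile2", "adult1"),
             ("adult1", "abattoir")]

degree = defaultdict(int)           # undirected degree centrality
for src, dst in movements:
    degree[src] += 1
    degree[dst] += 1

# Farms ranked by connectivity; in the study, highly connected and central
# farms were the ones that proved more vulnerable to infection.
ranked = sorted(degree, key=degree.get, reverse=True)
print(ranked[0])  # adult1 (the most connected node in this toy network)
```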

  20. Vulnerabilities to Rock-Slope Failure Impacts from Christchurch, NZ Case History Analysis

    NASA Astrophysics Data System (ADS)

    Grant, A.; Wartman, J.; Massey, C. I.; Olsen, M. J.; Motley, M. R.; Hanson, D.; Henderson, J.

    2015-12-01

    Rock-slope failures during the 2010/11 Canterbury (Christchurch), New Zealand Earthquake Sequence resulted in 5 fatalities and caused an estimated US$400 million of damage to buildings and infrastructure. Reducing losses from rock-slope failures requires consideration of both hazard (i.e. likelihood of occurrence) and risk (i.e. likelihood of losses given an occurrence). Risk assessment thus requires information on the vulnerability of structures to rock or boulder impacts. Here we present 32 case histories of structures impacted by boulders triggered during the 2010/11 Canterbury earthquake sequence in the Port Hills region of Christchurch, New Zealand. The consequences of rock fall impacts on structures, taken as penetration distance into structures, are shown to follow a power-law distribution with impact energy. Detailed mapping of rock fall sources and paths from field mapping, aerial lidar digital elevation model (DEM) data, and high-resolution aerial imagery produced 32 well-constrained runout paths of boulders that impacted structures. Impact velocities used for the structural analysis were derived from lumped-mass 2-D rock fall runout models built on 1-m resolution lidar elevation data, with model inputs based on surface parameters calibrated against the mapped runout paths of 198 additional boulder runouts. Terrestrial lidar scans and structure-from-motion (SfM) imagery generated 3-D point cloud data used to measure structural damage and impacting boulders. Combining velocity distributions from the 2-D analysis with high-precision boulder dimensions, kinetic energy distributions were calculated for all impacts. Calculated impact energy versus penetration distance for all cases suggests a power-law relationship between damage and impact energy. These case histories and the resulting fragility curve should serve as a foundation for future risk analysis of rock fall hazards by linking vulnerability data to the predicted energy distributions from the hazard analysis.
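A power-law relationship between penetration distance and impact energy can be fitted by least squares in log-log space; a sketch on synthetic data (the coefficients are illustrative, not the Port Hills fit):

```python
import math

def fit_power_law(energies, penetrations):
    """Least-squares fit of d = a * E**k in log-log space."""
    xs = [math.log(e) for e in energies]
    ys = [math.log(d) for d in penetrations]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    k = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = math.exp(my - k * mx)
    return a, k

# Synthetic impacts following d = 0.01 * E**0.7 exactly (illustrative
# coefficients, not the fitted Port Hills values):
E = [1e3, 1e4, 1e5, 1e6]
d = [0.01 * e ** 0.7 for e in E]
a, k = fit_power_law(E, d)
print(round(k, 2))  # 0.7
```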

  1. Real time magma transport imaging and earthquake localization using seismic amplitude ratio analysis

    NASA Astrophysics Data System (ADS)

    Taisne, B.; Brenguier, F.; Nercessian, A.; Beauducel, F.; Smith, P. J.

    2011-12-01

    Seismic amplitude ratio analysis (SARA) has been used successfully to track the sub-surface migration of magma prior to an eruption at Piton de la Fournaise volcano, La Réunion. The methodology is based on the temporal analysis of the seismic amplitude ratio between different pairs of stations, along with a model of seismic wave attenuation. This method has already highlighted the complexity of magma migration in the shallower part of the volcanic edifice during a seismic crisis using continuous records. We show that this method can also be applied to the localization of individual earthquakes triggered by monitoring systems, prior to human intervention such as phase picking. As examples, the analysis is performed on two kinds of seismic events observed at Soufrière Hills Volcano, Montserrat during the last 15 years, namely hybrid events and volcano-tectonic earthquakes. Finally, we present the implementation of a fully automatic SARA method for monitoring of Piton de la Fournaise volcano using continuous data in real time.
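The core of SARA is comparing an observed amplitude ratio against the ratio predicted for a candidate source location from geometrical spreading and attenuation; a sketch under simplified assumptions (body waves, homogeneous attenuation, illustrative constants):

```python
import math

def expected_ratio(r1, r2, alpha):
    """Predicted amplitude ratio A1/A2 for body waves: 1/r geometrical
    spreading times anelastic attenuation exp(-alpha * r). alpha lumps
    together frequency, Q and velocity; the value used is illustrative."""
    return (r2 / r1) * math.exp(-alpha * (r1 - r2))

# A source migrating toward station 1 drives the predicted ratio up;
# comparing observed ratios with such predictions constrains the source.
for r1 in (4000.0, 2000.0, 1000.0):
    r2 = 5000.0 - r1          # source on the line between the two stations
    print(round(expected_ratio(r1, r2, 3e-4), 2))
```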

  2. Low carbon technology performance vs infrastructure vulnerability: analysis through the local and global properties space.

    PubMed

    Dawson, David A; Purnell, Phil; Roelich, Katy; Busch, Jonathan; Steinberger, Julia K

    2014-11-04

    Renewable energy technologies, necessary for low-carbon infrastructure networks, are being adopted to help reduce fossil fuel dependence and meet carbon mitigation targets. The evolution of these technologies has progressed based on the enhancement of technology-specific performance criteria, without explicitly considering the wider system (global) impacts. This paper presents a methodology for simultaneously assessing local (technology) and global (infrastructure) performance, allowing key technological interventions to be evaluated with respect to their effect on the vulnerability of wider infrastructure systems. We use the exposure of low-carbon infrastructure to critical material supply disruption (criticality) to demonstrate the methodology. A series of local performance changes is analyzed and, by extension of this approach, a method for assessing the combined criticality of multiple materials for one specific technology is proposed. Via a case study of wind turbines at both the material (magnets) and technology (turbine generators) levels, we demonstrate that analysis of a given intervention at different levels can lead to differing conclusions regarding its effect on vulnerability. Infrastructure design decisions should take a systemic approach; without these multilevel considerations, strategic goals aimed at helping to meet low-carbon targets, that is, through long-term infrastructure transitions, could be significantly jeopardized.

  3. Arctic indigenous youth resilience and vulnerability: comparative analysis of adolescent experiences across five circumpolar communities.

    PubMed

    Ulturgasheva, Olga; Rasmus, Stacy; Wexler, Lisa; Nystad, Kristine; Kral, Michael

    2014-10-01

    Arctic peoples today find themselves on the front line of rapid environmental change brought about by globalizing forces, shifting climates, and destabilizing physical conditions. The weather is not the only thing undergoing rapid change here. Social climates are intrinsically connected to physical climates, and changes within each have profound effects on the daily life, health, and well-being of circumpolar indigenous peoples. This paper describes a collaborative effort between university researchers and community members from five indigenous communities in the circumpolar north aimed at comparing the experiences of indigenous Arctic youth in order to come up with a shared model of indigenous youth resilience. The discussion introduces a sliding scale model that emerged from the comparative data analysis. It illustrates how a "sliding scale" of resilience captures the inherent dynamism of youth strategies for "doing well" and what forces represent positive and negative influences that slide towards either personal and communal resilience or vulnerability. The model of the sliding scale is designed to reflect the contingency and interdependence of resilience and vulnerability and their fluctuations between lowest and highest points based on timing, local situation, larger context, and meaning.

  4. Socio-geographical factors in vulnerability to dengue in Thai villages: a spatial regression analysis.

    PubMed

    Tipayamongkholgul, Mathuros; Lisakulruk, Sunisa

    2011-05-01

    Focusing on the socio-geographical factors that influence local vulnerability to dengue at the village level, spatial regression methods were applied to analyse, over a 5-year period, the village-specific, cumulative incidence of all reported dengue cases among 437 villages in Prachuap Khiri Khan, a semi-urban province of Thailand. The K-order nearest neighbour method was used to define the range of neighbourhoods. Analysis showed a significant neighbourhood effect (ρ = 0.405, P <0.001), which implies that villages with geographical proximity shared a similar level of vulnerability to dengue. The two independent social factors, associated with a higher incidence of dengue, were a shorter distance to the nearest urban area (β = -0.133, P <0.05) and a smaller average family size (β = -0.102, P <0.05). These results indicate that the trend of increasing dengue occurrence in rural Thailand arose in areas under stronger urban influence rather than in remote rural areas.
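The K-order nearest neighbour definition of neighbourhoods and the spatial lag term (ρWy) of such a regression can be sketched as follows, with toy village coordinates:

```python
import math

def knn_weights(coords, k):
    """Row-standardized k-nearest-neighbour spatial weights matrix W."""
    n = len(coords)
    W = [[0.0] * n for _ in range(n)]
    for i in range(n):
        order = sorted(range(n), key=lambda j: math.dist(coords[i], coords[j]))
        for j in order[1:k + 1]:          # skip self (distance 0)
            W[i][j] = 1.0 / k
    return W

# Toy villages: the spatial lag Wy is each village's neighbourhood-average
# dengue incidence, the term multiplied by rho in the spatial regression.
coords = [(0, 0), (1, 0), (0, 1), (5, 5)]
y = [10.0, 20.0, 30.0, 40.0]
W = knn_weights(coords, 2)
lag = [sum(W[i][j] * y[j] for j in range(len(y))) for i in range(len(y))]
print(lag[0])  # 25.0: mean incidence of the first village's two neighbours
```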

  5. Use of cartography in historical seismicity analysis: a reliable tool to better apprehend the contextualization of the historical documents

    NASA Astrophysics Data System (ADS)

    Thibault, Fradet; Grégory, Quenet; Kevin, Manchuel

    2014-05-01

    Historical studies, including historical seismicity analysis, deal with historical documents. Numerous factors, such as culture, social conditions, demography, political situations and opinions, or religious ones, influence the way events are transcribed in the archives. As a consequence, it is crucial to contextualize and compare the historical documents reporting on a given event in order to reduce the uncertainties affecting their analysis and interpretation. When studying historical seismic events it is often tricky to get a global view of all the information provided by the historical documents. It is also difficult to extract cross-correlated information from the documents and draw a precise historical context. Cartographic and geographic tools in GIS software are the best means for the synthesis, interpretation and contextualization of the historical material. The main goal is to produce the most complete dataset of available information, in order to take into account all the components of the historical context and consequently improve the macroseismic analysis. The Entre-Deux-Mers earthquake (1759, Iepc = VII-VIII) [SISFRANCE 2013 - EDF-IRSN-BRGM] is well documented but has never benefited from a cross-analysis of historical documents and historical context elements. The map of available intensity data from SISFRANCE highlights a gap in macroseismic information within the estimated epicentral area. The aim of this study is to understand the origin of this gap by making a cartographic compilation of both archive information and historical context elements. The results support the hypothesis that the lack of documents and macroseismic data in the epicentral area is related to low human activity rather than to low seismic effects in this zone. Topographic features, geographical position, flood hazard, the location of roads and pathways, the distribution of vineyards and the forest coverage, mentioned in the archives and reported on Cassini's map, confirm this hypothesis.

  6. Statistical analysis in the Natural Time Domain of the seismicity in México

    NASA Astrophysics Data System (ADS)

    Ramírez-Rojas, A.; Moreno-Torres, L. R.; Flores-Marquez, E. L.

    2012-04-01

    This work presents a statistical analysis, performed in the natural time domain, of the seismicity that occurred in México within the period 2000-2011. The data set corresponds to the seismic activity recorded by the National Seismological Service (SSN). Our study covers the Mexican Pacific coast, comprising the states of Baja California, Jalisco, Michoacán, Guerrero, and Oaxaca. The preliminary results of the analysis of power spectrum, order parameter and entropy fluctuations in the natural time domain show good consistency with natural time theory and with the latest reported tectonic findings; this fact supports the suspicion that different tectonic mechanisms act as earthquake triggers.
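In natural time analysis, a standard summary statistic is the variance κ1 of natural time χ weighted by event energies; a minimal sketch (for equal energies κ1 tends to the uniform value 1/12 ≈ 0.083):

```python
def kappa1(energies):
    """Natural-time variance kappa_1 = <chi^2> - <chi>^2, where
    chi_k = k/N and the weights p_k are normalized event energies."""
    n = len(energies)
    total = sum(energies)
    p = [e / total for e in energies]
    chi = [(k + 1) / n for k in range(n)]
    m1 = sum(pk * ck for pk, ck in zip(p, chi))
    m2 = sum(pk * ck ** 2 for pk, ck in zip(p, chi))
    return m2 - m1 ** 2

# For equal event energies kappa_1 approaches 1/12 ~ 0.0833, the variance
# of a uniform distribution; deviations from this flag ordered seismicity.
print(round(kappa1([1.0] * 1000), 3))  # 0.083
```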

  7. BNL nonlinear pre-test seismic analysis for the NUPEC ultimate strength piping test program.

    SciTech Connect

    Degrassi, G.; Hofmayer, C.; Murphy, C.; Suzuki, K.; Namita, Y.

    2003-08-17

    The Nuclear Power Engineering Corporation (NUPEC) of Japan has been conducting a multi-year research program to investigate the behavior of nuclear power plant piping systems under large seismic loads. The objectives of the program are: to develop a better understanding of the elasto-plastic response and ultimate strength of nuclear piping; to ascertain the seismic safety margin of current piping design codes; and to assess new piping code allowable stress rules. Under this program, NUPEC has performed a large-scale seismic proving test of a representative nuclear power plant piping system. In support of the proving test, a series of materials tests, static and dynamic piping component tests, and seismic tests of simplified piping systems have also been performed. As part of collaborative efforts between the United States and Japan on seismic issues, the US Nuclear Regulatory Commission (USNRC) and its contractor, the Brookhaven National Laboratory (BNL), are participating in this research program by performing pre-test and post-test analyses, and by evaluating the significance of the program results with regard to safety margins. This paper describes BNL's pre-test analysis to predict the elasto-plastic response for one of NUPEC's simplified piping system seismic tests. The capability to simulate the anticipated ratcheting response of the system was of particular interest. Analyses were performed using classical bilinear and multilinear kinematic hardening models as well as a nonlinear kinematic hardening model. Comparisons of analysis results for each plasticity model against test results for a static cycling elbow component test and for a simplified piping system seismic test are presented in the paper.
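The bilinear kinematic hardening model mentioned above can be sketched as a 1-D return-mapping stress update; the material constants are illustrative, not the NUPEC pipe steel values:

```python
def bilinear_kinematic_step(strain_inc, state, E=200e3, H=20e3, sy=250.0):
    """One return-mapping update for 1-D bilinear kinematic hardening.

    E: elastic modulus, H: hardening modulus, sy: yield stress (MPa);
    the constants are illustrative, not the NUPEC pipe steel values.
    state = (stress, backstress, plastic_strain).
    """
    stress, back, ep = state
    trial = stress + E * strain_inc            # elastic predictor
    f = abs(trial - back) - sy                 # yield function
    if f <= 0.0:
        return trial, back, ep                 # purely elastic step
    dgamma = f / (E + H)                       # plastic multiplier
    sign = 1.0 if trial - back > 0.0 else -1.0
    stress = trial - E * dgamma * sign         # plastic corrector
    back += H * dgamma * sign                  # yield surface translates
    return stress, back, ep + dgamma * sign

# Strain cycling: the translating backstress is what lets kinematic models
# follow ratcheting-style response under unsymmetric load cycles.
state = (0.0, 0.0, 0.0)
for d_eps in (0.002, -0.004, 0.004):
    state = bilinear_kinematic_step(d_eps, state)
```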

  8. LAVA (Los Alamos Vulnerability and Risk Assessment Methodology): A conceptual framework for automated risk analysis

    SciTech Connect

    Smith, S.T.; Lim, J.J.; Phillips, J.R.; Tisinger, R.M.; Brown, D.C.; FitzGerald, P.D.

    1986-01-01

    At Los Alamos National Laboratory, we have developed an original methodology for performing risk analyses on subject systems characterized by a general set of asset categories, a general spectrum of threats, a definable system-specific set of safeguards protecting the assets from the threats, and a general set of outcomes resulting from threats exploiting weaknesses in the safeguards system. The Los Alamos Vulnerability and Risk Assessment Methodology (LAVA) models complex systems having large amounts of "soft" information about both the system itself and occurrences related to the system. Its structure lends itself well to automation on a portable computer, making it possible to analyze numerous similar but geographically separated installations consistently and in as much depth as the subject system warrants. LAVA is based on hierarchical systems theory, event trees, fuzzy sets, natural-language processing, decision theory, and utility theory. LAVA's framework is a hierarchical set of fuzzy event trees that relate the results of several embedded (or sub-) analyses: a vulnerability assessment providing information about the presence and efficacy of system safeguards, a threat analysis providing information about static (background) and dynamic (changing) threat components coupled with an analysis of asset "attractiveness" to the dynamic threat, and a consequence analysis providing information about the outcome spectrum's severity measures and impact values. By using LAVA, we have modeled our widely used computer security application as well as LAVA/CS systems for physical protection, transborder data flow, contract awards, and property management. It is presently being applied for modeling risk management in embedded systems, survivability systems, and weapons systems security. LAVA is especially effective in modeling subject systems that include a large human component.

  9. Dynamics of the Bingham Canyon Mine landslides from seismic signal analysis

    NASA Astrophysics Data System (ADS)

    Hibert, Clément; Ekström, Göran; Stark, Colin P.

    2014-07-01

    Joint interpretation of long- and short-period seismic signals generated by landslides sheds light on the dynamics of slope failure, providing constraints on landslide initiation and termination and on the main phases of acceleration and deceleration. We carry out a combined analysis of the seismic signals generated by two massive landslides that struck the Bingham Canyon Mine pit on 10 April 2013. Inversion of the long-period waveforms yields time series for the bulk landslide forces and momenta, from which we deduce runout trajectories consistent with the deposit morphology. Comparing these time series with the short-period seismic data, we are able to infer when and where major changes take place in landslide momentum along the runout path. This combined analysis points to a progressive fracturing of the masses during acceleration, indicates that deceleration starts the moment they reach the pit floor, and suggests that the bulk movement is stopped by a topographic barrier.

  10. Statistical analysis and modeling of seismicity related to the exploitation of geothermal energy

    NASA Astrophysics Data System (ADS)

    Dinske, Carsten; Langenbruch, Cornelius; Shapiro, Serge

    2016-04-01

    catalogs of the considered reservoirs contain approximately 50 per cent of the number of events in the original catalogs. Furthermore, we perform ETAS modeling (Epidemic Type Aftershock Sequence model; Ogata, 1985, 1988) for two reasons. First, we want to understand whether the different reservoirs are also comparable in their earthquake interaction patterns and hence in the aftershock triggering that follows larger-magnitude induced events. Second, if we identify systematic patterns, ETAS modeling can contribute to the forecast and consequently to the mitigation of seismicity during production of geothermal energy. We find that stationary ETAS models cannot accurately capture the observed seismicity rate changes. One reason for this finding is that the rate of induced events (the background activity in the ETAS model) is not constant with time. We therefore apply non-stationary ETAS modeling, which results in good agreement between observation and model. However, the required non-stationarity of the process complicates the application of ETAS modeling to forecasting seismicity during production. Thus, its implementation in so-called traffic-light systems for the mitigation of possible seismic hazard requires further detailed analysis.
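
The conditional intensity at the heart of the ETAS modeling described above can be sketched as follows; the parameter values and toy catalogue are purely illustrative, not the values fitted to the geothermal reservoirs in the study.

```python
import numpy as np

def etas_rate(t, event_times, event_mags, mu, K, alpha, c, p, m_c):
    """Conditional intensity of the (stationary) ETAS model:
    lambda(t) = mu + sum_i K * exp(alpha * (M_i - m_c)) / (t - t_i + c)**p,
    summed over past events with t_i < t (Ogata-type parameterization);
    mu is the background (induced-event) rate."""
    past = event_times < t
    dt = t - event_times[past]
    return mu + np.sum(K * np.exp(alpha * (event_mags[past] - m_c)) / (dt + c) ** p)

# toy induced-seismicity catalogue (times in days; all values illustrative)
times = np.array([1.0, 2.5, 4.0])
mags = np.array([2.1, 3.0, 2.4])
rate = etas_rate(5.0, times, mags, mu=0.2, K=0.05, alpha=1.5, c=0.01, p=1.1, m_c=2.0)
```

The non-stationary variant discussed in the abstract would replace the constant `mu` with a time-dependent background rate.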

  11. Seismic fragility evaluation of a piping system in a nuclear power plant by shaking table test and numerical analysis

    SciTech Connect

    Kim, M. K.; Kim, J. H.; Choi, I. K.

    2012-07-01

    In this study, a seismic fragility evaluation of a piping system in a nuclear power plant was performed. The evaluation proceeded in three steps. First, several piping-element capacity tests were performed. Monotonic and cyclic loading tests were conducted at the internal pressure level of actual nuclear power plants to evaluate the performance. Cracks and wall thinning were considered as degradation factors of the piping system. Second, a shaking table test was performed to evaluate the seismic capacity of a selected piping system. Multi-support seismic excitation was applied to account for the difference in support elevations. Finally, a numerical analysis was performed to assess the seismic fragility of the piping system. As a result, the seismic fragility of a piping system of an NPP in Korea was evaluated by using a shaking table test and numerical analysis. (authors)
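
Seismic fragility results of this kind are commonly summarized as a lognormal curve giving the failure probability as a function of ground-motion level. A minimal sketch follows; the median capacity and logarithmic standard deviation are hypothetical, not the study's fitted values.

```python
import math

def fragility(a, a_m, beta):
    """Lognormal fragility curve: probability of failure at peak ground
    acceleration `a` (g), given median capacity `a_m` (g) and composite
    logarithmic standard deviation `beta`."""
    z = math.log(a / a_m) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

# hypothetical capacity parameters (not the study's fitted values)
p_at_median = fragility(0.9, a_m=0.9, beta=0.4)   # 0.5 by construction
p_low = fragility(0.3, a_m=0.9, beta=0.4)         # well below median capacity
```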

  12. Fuzzy decision analysis for integrated environmental vulnerability assessment of the mid-Atlantic Region.

    PubMed

    Tran, Liem T; Knight, C Gregory; O'Neill, Robert V; Smith, Elizabeth R; Riitters, Kurt H; Wickham, James

    2002-06-01

    A fuzzy decision analysis method for integrating ecological indicators was developed. This was a combination of a fuzzy ranking method and the analytic hierarchy process (AHP). The method was capable of ranking ecosystems in terms of environmental conditions and suggesting cumulative impacts across a large region. Using data on land cover, population, roads, streams, air pollution, and topography of the Mid-Atlantic region, we were able to point out areas that were in relatively poor condition and/or vulnerable to future deterioration. The method offered an easy and comprehensive way to combine the strengths of fuzzy set theory and the AHP for ecological assessment. Furthermore, the suggested method can serve as a building block for the evaluation of environmental policies.
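
The AHP step described above derives indicator weights from the principal eigenvector of a pairwise comparison matrix. A minimal sketch with a hypothetical 3x3 Saaty-scale matrix (the study's actual indicators and judgments are not reproduced here):

```python
import numpy as np

# Hypothetical Saaty-scale pairwise comparison matrix for three
# ecological indicators (entry [i, j]: importance of i relative to j).
A = np.array([[1.0, 3.0, 5.0],
              [1.0 / 3.0, 1.0, 2.0],
              [1.0 / 5.0, 1.0 / 2.0, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                 # index of principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                                # AHP priority weights

# Consistency ratio: CI = (lambda_max - n) / (n - 1); RI = 0.58 for n = 3.
# Judgments are usually accepted when CR < 0.1.
n = A.shape[0]
CR = ((eigvals.real[k] - n) / (n - 1)) / 0.58
```

The fuzzy-ranking half of the method would then combine these weights with fuzzy indicator scores rather than crisp values.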

  13. Analysis of bathymetric surveys to identify coastal vulnerabilities at Cape Canaveral, Florida

    USGS Publications Warehouse

    Thompson, David M.; Plant, Nathaniel G.; Hansen, Mark E.

    2015-10-07

    The purpose of this work is to describe an updated bathymetric dataset collected in 2014 and compare it to previous datasets. The updated data focus on the bathymetric features and sediment transport pathways that connect the offshore regions to the shoreline and, therefore, are related to the protection of other portions of the coastal environment, such as dunes, that support infrastructure and ecosystems. Previous survey data include National Oceanic and Atmospheric Administration’s (NOAA) National Ocean Service (NOS) hydrographic survey from 1956 and a USGS survey from 2010 that is augmented with NOS surveys from 2006 and 2007. The primary result of this analysis is documentation and quantification of the nature and rates of bathymetric changes that are near (within about 2.5 km) the current Cape Canaveral shoreline and interpretation of the impact of these changes on future erosion vulnerability.

  14. Analysis of the seismicity in the region of Mirovo salt mine after 8 years monitoring

    NASA Astrophysics Data System (ADS)

    Dimitrova, Liliya; Solakov, Dimcho; Simeonova, Stela; Aleksandrova, Irena; Georgieva, Gergana

    2015-04-01

    The Mirovo salt deposit is situated in the NE part of Bulgaria, 5 kilometers away from the town of Provadiya. The mine has been in operation since 1956. The salt is produced by dilution and extraction of the brine to the surface. A system of chambers and pillars has formed within the salt body as a result of the applied technology. The mine is situated in a seismically quiet part of the state. The region is characterized by a complex geological structure and several faults. During the last 3 decades a large number of small and moderate earthquakes (M<4.5) have occurred in the close vicinity of the salt deposit. A local seismological network (LSN) is deployed in the region to monitor the local seismicity. It consists of 6 three-component digital stations. Real-time data transfer from the LSN stations to the National Data Center (in Sofia) is implemented using the VPN and MAN networks of the Bulgarian Telecommunication Company. Common processing and interpretation of the data from the LSN and the national seismic network is performed. Real-time and interactive data processing are performed by the Seismic Network Data Processor (SNDP) software package. More than 700 earthquakes were registered by the LSN within a 30-km region around the mine during the 8 years of monitoring. We first processed the data and compiled a catalogue of the earthquakes that occurred within the studied region (30 km around the salt mine). The spatial pattern of seismicity was then analyzed. A large number of the seismic events occurred within the northern and north-western part of the salt body. Several earthquakes occurred in the close vicinity of the mine. Because the earthquakes could be tectonic and/or induced, an attempt is made to find criteria to distinguish natural from induced seismicity. To characterize and distinguish the main processes active in the area, we also performed waveform and spectral analysis of a number of earthquakes.

  15. The effect analysis of strain rate on power transmission tower-line system under seismic excitation.

    PubMed

    Tian, Li; Wang, Wenming; Qian, Hui

    2014-01-01

    The effect of strain rate on a power transmission tower-line system under seismic excitation is studied in this paper. A three-dimensional finite element model of a transmission tower-line system is created based on a real project. Using theoretical analysis and numerical simulation, incremental dynamic analysis of the power transmission tower-line system is conducted to investigate the effect of strain rate on the nonlinear responses of the transmission tower and line. The results show that the effect of strain rate on the transmission tower generally decreases the maximum top displacements, but it would increase the maximum base shear forces, and thus it is necessary to consider the effect of strain rate in the seismic analysis of the transmission tower. The effect of strain rate could be ignored for the seismic analysis of the conductors and ground lines, but the responses of the ground lines considering the strain rate effect are larger than those of the conductors. The results could provide a reference for the seismic design of the transmission tower-line system.

  16. The Effect Analysis of Strain Rate on Power Transmission Tower-Line System under Seismic Excitation

    PubMed Central

    Wang, Wenming

    2014-01-01

    The effect of strain rate on a power transmission tower-line system under seismic excitation is studied in this paper. A three-dimensional finite element model of a transmission tower-line system is created based on a real project. Using theoretical analysis and numerical simulation, incremental dynamic analysis of the power transmission tower-line system is conducted to investigate the effect of strain rate on the nonlinear responses of the transmission tower and line. The results show that the effect of strain rate on the transmission tower generally decreases the maximum top displacements, but it would increase the maximum base shear forces, and thus it is necessary to consider the effect of strain rate in the seismic analysis of the transmission tower. The effect of strain rate could be ignored for the seismic analysis of the conductors and ground lines, but the responses of the ground lines considering the strain rate effect are larger than those of the conductors. The results could provide a reference for the seismic design of the transmission tower-line system. PMID:25105157

  17. Effects of surface topography on ground shaking prediction: implications for seismic hazard analysis and recommendations for seismic design

    NASA Astrophysics Data System (ADS)

    Barani, Simone; Massa, Marco; Lovati, Sara; Spallarossa, Daniele

    2014-06-01

    This study examines the role of topographic effects on the prediction of earthquake ground motion. Ground motion prediction equations (GMPEs) are mathematical models that estimate the shaking level induced by an earthquake as a function of several parameters, such as magnitude, source-to-site distance, style of faulting and ground type. However, little importance is given to the effects of topography, which, as known, may play a significant role on the level, duration and frequency content of ground motion. Ridges and crests are often lost inside the large number of sites considered in the definition of a GMPE. Hence, it is presumable that current GMPEs are unable to accurately predict the shaking level at the top of a relief. The present work, which follows the article by Massa et al. on topographic effects, aims at overcoming this limitation by amending an existing GMPE with an additional term to account for the effects of surface topography at a specific site. First, experimental ground motion values and ground motions predicted by the attenuation model of Bindi et al. for five case studies are compared and contrasted in order to quantify their discrepancy and to identify anomalous behaviours of the sites investigated. Secondly, for the site of Narni (Central Italy), amplification factors derived from experimental measurements and numerical analyses are compared and contrasted, pointing out their impact on probabilistic seismic hazard analysis and design norms. In particular, with reference to the Italian building code, our results have highlighted the inadequacy of the national provisions concerning the definition of the seismic load at the top of ridges and crests, evidencing a significant underestimation of ground motion around the site resonance frequency.

  18. Magma migration at the onset of the 2012-13 Tolbachik eruption revealed by Seismic Amplitude Ratio Analysis

    NASA Astrophysics Data System (ADS)

    Caudron, Corentin; Taisne, Benoit; Kugaenko, Yulia; Saltykov, Vadim

    2015-12-01

    In contrast to the 1975-76 Tolbachik eruption, the 2012-13 Tolbachik eruption was not preceded by any striking change in seismic activity. By processing the Klyuchevskoy volcano group seismic data with the Seismic Amplitude Ratio Analysis (SARA) method, we gain insights into the dynamics of magma movement prior to this important eruption. A clear seismic migration within the seismic swarm started 20 hours before the reported eruption onset (05:15 UTC, 26 November 2012). This migration proceeded in different phases and ended when eruptive tremor, corresponding to lava flows, was recorded (at ~11:00 UTC, 27 November 2012). In order to get a first-order approximation of the magma location, we compare the calculated seismic amplitude ratios with theoretical ones. As expected, the observations suggest that the seismicity migrated toward the eruption location. However, we explain the pre-eruptive observed ratios by a vertical migration under the northern slope of Plosky Tolbachik volcano followed by a lateral migration toward the eruptive vents. Another migration is also captured by this technique and coincides with a seismic swarm that started 16-20 km to the south of Plosky Tolbachik at 20:31 UTC on 28 November and lasted for more than 2 days. This seismic swarm is very similar to the seismicity preceding the 1975-76 Tolbachik eruption and can be considered a possible aborted eruption.
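
The theoretical amplitude ratios that SARA compares against observations can be sketched under standard body-wave assumptions (geometrical spreading plus anelastic attenuation). The frequency, quality factor, and wave speed below are illustrative, not the values used in the study.

```python
import math

def theoretical_ratio(d1, d2, f=5.0, Q=100.0, beta=2000.0, n=1.0):
    """Theoretical seismic amplitude ratio between two stations at source
    distances d1 and d2 (m), assuming geometrical spreading d**-n and
    anelastic attenuation exp(-pi * f * d / (Q * beta)), with frequency f
    (Hz), quality factor Q, and wave speed beta (m/s)."""
    B = math.pi * f / (Q * beta)
    a1 = math.exp(-B * d1) / d1 ** n
    a2 = math.exp(-B * d2) / d2 ** n
    return a1 / a2

# a source migrating toward station 1 drives the inter-station ratio upward
far = theoretical_ratio(8000.0, 3000.0)    # source far from station 1
near = theoretical_ratio(2000.0, 9000.0)   # source near station 1
```

Tracking how observed ratios evolve relative to such curves is what lets the method locate a migrating seismic source without picking individual phases.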

  19. First seismic shear wave velocity profile of the lunar crust as extracted from the Apollo 17 active seismic data by wavefield gradient analysis

    NASA Astrophysics Data System (ADS)

    Sollberger, David; Schmelzbach, Cedric; Robertsson, Johan O. A.; Greenhalgh, Stewart A.; Nakamura, Yosio; Khan, Amir

    2016-04-01

    We present a new seismic velocity model of the shallow lunar crust, including, for the first time, shear wave velocity information. So far, the shear wave velocity structure of the lunar near-surface was effectively unconstrained due to the complexity of lunar seismograms. Intense scattering and low attenuation in the lunar crust lead to characteristic long-duration reverberations on the seismograms. The reverberations obscure later arriving shear waves and mode conversions, rendering them impossible to identify and analyze. Additionally, only vertical component data were recorded during the Apollo active seismic experiments, which further compromises the identification of shear waves. We applied a novel processing and analysis technique to the data of the Apollo 17 lunar seismic profiling experiment (LSPE), which involved recording seismic energy generated by several explosive packages on a small areal array of four vertical component geophones. Our approach is based on the analysis of the spatial gradients of the seismic wavefield and yields key parameters such as apparent phase velocity and rotational ground motion as a function of time (depth), which cannot be obtained through conventional seismic data analysis. These new observables significantly enhance the data for interpretation of the recorded seismic wavefield and allow, for example, for the identification of S wave arrivals based on their lower apparent phase velocities and distinct higher amount of generated rotational motion relative to compressional (P-) waves. Using our methodology, we successfully identified pure-mode and mode-converted refracted shear wave arrivals in the complex LSPE data and derived a P- and S-wave velocity model of the shallow lunar crust at the Apollo 17 landing site. 
The extracted elastic-parameter model supports the current understanding of the lunar near-surface structure, suggesting a thin layer of low-velocity lunar regolith overlying a heavily fractured crust of basaltic

  20. Prioritizing health: a human rights analysis of disaster, vulnerability, and urbanization in New Orleans and Port-au-Prince.

    PubMed

    Carmalt, Jean

    2014-06-14

    Climate change prompts increased urbanization and vulnerability to natural hazards. Urbanization processes are relevant to a right to health analysis of natural hazards because they can exacerbate pre-disaster inequalities that create vulnerability. The 2010 earthquake in Port-au-Prince and the 2005 hurricane in New Orleans provide vivid illustrations of the relationship between spatial inequality and the threats associated with natural hazards. The link between urbanization processes, spatial inequality, and vulnerability to natural hazards is important in terms of an analysis of the right to health; in particular, it provides a basis for arguing that states should prioritize equitable land use and development as a matter of human rights. This article draws on work by geographers, disaster specialists, and international legal scholars to argue that inequitable urbanization processes violate the obligations to respect, protect, and fulfill the human right to health in disaster-prone regions.

  1. Analysis of the seismic performance of isolated buildings according to life-cycle cost.

    PubMed

    Dang, Yu; Han, Jian-Ping; Li, Yong-Tao

    2015-01-01

    This paper proposes an indicator of seismic performance based on life-cycle cost of a building. It is expressed as a ratio of lifetime damage loss to life-cycle cost and determines the seismic performance of isolated buildings. Major factors are considered, including uncertainty in hazard demand and structural capacity, initial costs, and expected loss during earthquakes. Thus, a high indicator value indicates poor building seismic performance. Moreover, random vibration analysis is conducted to measure structural reliability and evaluate the expected loss and life-cycle cost of isolated buildings. The expected loss of an actual, seven-story isolated hospital building is only 37% of that of a fixed-base building. Furthermore, the indicator of the structural seismic performance of the isolated building is much lower in value than that of the structural seismic performance of the fixed-base building. Therefore, isolated buildings are safer and less risky than fixed-base buildings. The indicator based on life-cycle cost assists owners and engineers in making investment decisions in consideration of structural design, construction, and expected loss. It also helps optimize the balance between building reliability and building investment.
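
The proposed indicator is a simple ratio, which can be sketched with hypothetical cost figures; only the 37% expected-loss relationship between the two designs is taken from the abstract, the absolute numbers are invented for illustration.

```python
def performance_indicator(lifetime_damage_loss, life_cycle_cost):
    """Indicator from the abstract: ratio of expected lifetime damage
    loss to total life-cycle cost; a higher value indicates poorer
    seismic performance."""
    return lifetime_damage_loss / life_cycle_cost

# hypothetical figures in arbitrary currency units; the isolated design's
# expected loss is set to 37% of the fixed-base design's, per the abstract
fixed_base = performance_indicator(lifetime_damage_loss=100.0, life_cycle_cost=1200.0)
isolated = performance_indicator(lifetime_damage_loss=37.0, life_cycle_cost=1300.0)
```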

  2. Areal distribution of sedimentary facies determined from seismic facies analysis and models of modern depositional systems

    SciTech Connect

    Seramur, K.C.; Powell, R.D.; Carpenter, P.J.

    1988-02-01

    Seismic facies analysis was applied to 3.5-kHz single-channel analog reflection profiles of the sediment fill within Muir Inlet, Glacier Bay, southeast Alaska. Nine sedimentary facies have been interpreted from seven seismic facies identified on the profiles. The interpretations are based on reflection characteristics and structural features of the seismic facies. The following reflection characteristics and structural features are used: reflector spacing, amplitude and continuity of reflections, internal reflection configurations, attitude of reflection terminations at a facies boundary, body geometry of a facies, and the architectural associations of seismic facies within each basin. The depositional systems are reconstructed by determining the paleotopography, bedding patterns, sedimentary facies, and modes of deposition within the basin. Muir Inlet is a recently deglaciated fjord for which successive glacier terminus positions and consequent rates of glacial retreat are known. In this environment the depositional processes and sediment characteristics vary with distance from a glacier terminus, such that during a retreat a record of these variations is preserved in the aggrading sediment fill. Sedimentary facies within the basins of lower Muir Inlet are correlated with observed depositional processes near the present glacier terminus in the upper inlet. The areal distribution of sedimentary facies within the basins is interpreted using the seismic facies architecture and inferences from known sediment characteristics proximal to present glacier termini.

  3. Unique problems associated with seismic analysis of partially gas-saturated unconsolidated sediments

    USGS Publications Warehouse

    Lee, M.W.; Collett, T.S.

    2009-01-01

    Gas hydrate stability conditions restrict the occurrence of gas hydrate to unconsolidated and high water-content sediments at shallow depths. Because of these host-sediment properties, seismic and well log data acquired for the detection of free gas and associated gas hydrate-bearing sediments often require nonconventional analysis. For example, a conventional method of identifying free gas using the compressional/shear-wave velocity (Vp/Vs) ratio at the logging frequency will not work unless the free-gas saturations are more than about 40%. The P-wave velocity dispersion of partially gas-saturated sediments causes a problem in interpreting well log velocities and seismic data. Using the White, J.E. [1975. Computed seismic speeds and attenuation in rocks with partial gas saturation. Geophysics 40, 224-232] model for partially gas-saturated sediments, the difference between well log and seismic velocities can be reconciled. The inclusion of P-wave velocity dispersion in interpreting well log data is, therefore, essential to identify free gas and to tie surface seismic data to synthetic seismograms.
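
The saturation effect behind this difficulty can be illustrated with the simpler Wood (Reuss) fluid-mixing bound rather than White's frequency-dependent patchy-saturation model; the moduli below are typical textbook values, not data from the study.

```python
def wood_fluid_modulus(s_gas, k_water=2.25e9, k_gas=0.01e9):
    """Reuss/Wood lower-bound bulk modulus (Pa) of a uniform water-gas
    mixture: 1/K_f = S_w/K_w + S_g/K_g. Even a few percent of free gas
    collapses the fluid modulus, which is why P-velocity (and hence
    Vp/Vs) changes little with further increases in gas saturation."""
    s_w = 1.0 - s_gas
    return 1.0 / (s_w / k_water + s_gas / k_gas)

k_wet = wood_fluid_modulus(0.0)     # fully water-saturated
k_5pct = wood_fluid_modulus(0.05)   # 5% free gas already dominates
```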

  4. Analysis of the Seismic Performance of Isolated Buildings according to Life-Cycle Cost

    PubMed Central

    Dang, Yu; Han, Jian-ping; Li, Yong-tao

    2015-01-01

    This paper proposes an indicator of seismic performance based on life-cycle cost of a building. It is expressed as a ratio of lifetime damage loss to life-cycle cost and determines the seismic performance of isolated buildings. Major factors are considered, including uncertainty in hazard demand and structural capacity, initial costs, and expected loss during earthquakes. Thus, a high indicator value indicates poor building seismic performance. Moreover, random vibration analysis is conducted to measure structural reliability and evaluate the expected loss and life-cycle cost of isolated buildings. The expected loss of an actual, seven-story isolated hospital building is only 37% of that of a fixed-base building. Furthermore, the indicator of the structural seismic performance of the isolated building is much lower in value than that of the structural seismic performance of the fixed-base building. Therefore, isolated buildings are safer and less risky than fixed-base buildings. The indicator based on life-cycle cost assists owners and engineers in making investment decisions in consideration of structural design, construction, and expected loss. It also helps optimize the balance between building reliability and building investment. PMID:25653677

  5. Singular spectrum analysis and its applications in mapping mantle seismic structure

    NASA Astrophysics Data System (ADS)

    Dokht, Ramin M. H.; Gu, Yu Jeffrey; Sacchi, Mauricio D.

    2017-03-01

    Seismic discontinuities are fundamental to the understanding of mantle composition and dynamics. Their depths and impedance contrasts are generally determined using secondary phases such as SS precursors and P-to-S converted waves. However, analysing and interpreting these weak signals often suffer from incomplete data coverage, high noise levels and interfering seismic arrivals, especially near tectonically complex regions such as subduction zones. To overcome these pitfalls, we adopt a singular spectrum analysis (SSA) method to remove random noise, reconstruct missing traces and enhance the robustness of SS precursors and P-to-S conversions from mantle seismic discontinuities. Our method takes advantage of the predictability of time series in the frequency-space domain and performs rank reduction using a singular value decomposition of the trajectory matrix. We apply SSA to synthetic record sections as well as the observations of (1) SS precursors beneath the northwestern Pacific subduction zones, and (2) P-to-S converted waves from southwestern Canada. In comparison with raw or interpolated data, the SSA enhanced seismic sections exhibit greater resolution due to the suppression of random noise (which reduces signal amplitude during standard averaging procedures) through rank reduction. SSA also enables an effective separation of the SS precursors from the postcursors of S-wave core diffractions. This method will greatly benefit future analyses of weak crustal and mantle seismic phases, especially when data coverages are less than ideal.
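
The core SSA operations named above (Hankel embedding of the trace, SVD rank reduction, and anti-diagonal averaging back to a series) can be sketched on a synthetic noisy trace; the window length and rank below are illustrative choices, not the study's settings.

```python
import numpy as np

def ssa_denoise(x, L, rank):
    """Basic singular spectrum analysis: embed the series in an L x K
    Hankel trajectory matrix, keep the leading singular components
    (rank reduction), and reconstruct by anti-diagonal averaging."""
    N = len(x)
    K = N - L + 1
    X = np.column_stack([x[i:i + L] for i in range(K)])   # trajectory matrix
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    Xr = (U[:, :rank] * s[:rank]) @ Vt[:rank]             # rank-reduced matrix
    # Hankelize: average each anti-diagonal back into a 1-D series
    out = np.zeros(N)
    cnt = np.zeros(N)
    for j in range(K):
        out[j:j + L] += Xr[:, j]
        cnt[j:j + L] += 1
    return out / cnt

# noisy sinusoid: a rank-2 signal subspace recovers the oscillation
t = np.linspace(0, 10, 500)
rng = np.random.default_rng(0)
noisy = np.sin(2 * np.pi * t) + 0.3 * rng.standard_normal(500)
clean = ssa_denoise(noisy, L=100, rank=2)
```

The same machinery extends to trace interpolation by iteratively re-inserting reconstructed values at missing positions.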

  6. Azimuthal anisotropy analysis using P-wave multiazimuth seismic data in Rock Springs Uplift, Wyoming, US

    NASA Astrophysics Data System (ADS)

    Skelly, Klint T.

    Coal is an important source of energy, but combustion of coal releases a significant amount of carbon dioxide (CO2) into the atmosphere. Consequently, developing efficient carbon capture and sequestration strategies to mitigate global warming is of great practical significance. Characterization of reservoirs proposed for carbon capture and sequestration is important for efficient injection of CO2 and monitoring reservoir performance over time. The efficiency and long-term effectiveness of CO2 storage is largely governed by the presence and orientation of fractures within a reservoir and its associated seal. The presence of natural fractures, which can act as conduits for CO2 leakage, gives rise to seismic anisotropy that is related to the fracture orientation and fracture density, and this relation can be studied through anisotropy analysis. Estimation of fracture orientation and fracture density is essential for long-term CO2 storage and monitoring. Well logs, cores and well tests provide information about stress fields and fractures at the well location, but away from the well one has to rely on seismic data. Seismic-derived attributes like semblance and curvature provide useful tools for qualitative analysis of fractures, but they do not provide a direct measure of fracture orientation and fracture density. Moreover, such analyses depend on the quality of stacked seismic data. Multiazimuth seismic data, on the other hand, provide information about the variations in the seismic velocity in different azimuths and can thus provide a direct estimate of fracture orientation and fracture density. This research, which focuses on the Rock Springs Uplift, Wyoming, USA, used single-component (P-wave) multiazimuth seismic data and well data to create flattened angle gathers for different azimuths using prestack waveform inversion. 
Here, an advanced waveform technique, prestack waveform inversion, was used to obtain suitable velocities for proper offset-to-angle conversion as

  7. Vulnerability of Karangkates dams area by means of zero crossing analysis of data magnetic

    SciTech Connect

    Sunaryo, E-mail: sunaryo.geofis.ub@gmail.com; Susilo, Adi

    2015-04-24

    A study entitled Vulnerability of Karangkates Dams Area by Means of Zero Crossing Analysis of Magnetic Data has been carried out. The study was aimed at obtaining information on the vulnerability of two parts of the Karangkates dams area, i.e. the Lahor dam, inaugurated in 1977, and the Sutami dam, inaugurated in 1981. Three important reasons for this study are: 1) the age of the dams, 36 years for the Lahor dam and 32 years for the Sutami dam; 2) geologically, the dams are located close to the Pohgajih local shear fault, the Selorejo local fault, and the Selorejo limestone-andesite rock contact plane; and 3) Karangkates is one of the important hydroelectric power plants (PLTA), generating about 400 million kWh per year out of a total of about 29,373 MW installed in Indonesia. Geographically, the magnetic data acquisition was conducted at coordinates (112.4149°E; -8.2028°S) to (112.4839°E; -8.0989°S) using a Proton Precession Magnetometer G-856. Magnetic data were acquired in the radial direction from the dams over a diameter of about 10 km, with a distance between measurements of about 500 m. The acquisition yielded total magnetic field values in the range of 44450 nT to 45800 nT. Residual anomalies were obtained by applying several corrections, including the diurnal correction, the International Geomagnetic Reference Field (IGRF) correction, and reductions, giving values in the range of -650 nT to 700 nT. The residual anomalies indicate the presence of 2 zones of closed dipole-pair closures, out of 5 zones in total, located to the west of the Sutami dam and the northwest of the Lahor dam. Overlaying the local geological map indicated alignment of the zero-crossing patterns in the residual anomaly contours with the Pohgajih shear fault, located approximately 4 km to the west of the Sutami dam, and the andesite-limestone rock contact, located
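
The zero-crossing step itself can be sketched on a synthetic residual-anomaly profile; in magnetic interpretation, such crossings often mark boundaries like faults or lithological contacts. The profile values below are illustrative, not the survey data.

```python
import numpy as np

def zero_crossings(profile, positions):
    """Locate zero crossings along a residual-anomaly profile by linear
    interpolation between adjacent samples of opposite sign."""
    s = np.sign(profile)
    idx = np.where(s[:-1] * s[1:] < 0)[0]     # sign change between i and i+1
    frac = profile[idx] / (profile[idx] - profile[idx + 1])
    return positions[idx] + frac * (positions[idx + 1] - positions[idx])

# synthetic residual anomaly (nT) sampled every 500 m along a profile
x = np.arange(0.0, 5000.0, 500.0)
anomaly = np.array([650., 400., 120., -80., -300., -150., 40., 210., 90., -60.])
crossings = zero_crossings(anomaly, x)
```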

  8. Best Estimate Method vs Evaluation Method: a comparison of two techniques in evaluating seismic analysis and design

    SciTech Connect

    Bumpus, S.E.; Johnson, J.J.; Smith, P.D.

    1980-05-01

    The concept of how two techniques, the Best Estimate Method and the Evaluation Method, may be applied to the traditional seismic analysis and design of a nuclear power plant is introduced. Only the four links of the seismic analysis and design methodology chain (SMC) - seismic input, soil-structure interaction, major structural response, and subsystem response - are considered. The objective is to evaluate the compounding of conservatisms in the seismic analysis and design of nuclear power plants, to provide guidance for judgments in the SMC, and to concentrate the evaluation on that part of the seismic analysis and design which is familiar to the engineering community. An example applies three-dimensional excitations to a model of a nuclear power plant structure. The example demonstrates how conservatisms accrue by coupling two links in the SMC and comparing those results to the effects of one link alone. The utility of employing the Best Estimate Method vs the Evaluation Method is also demonstrated.

  9. An enhancement of NASTRAN for the seismic analysis of structures. [nuclear power plants

    NASA Technical Reports Server (NTRS)

    Burroughs, J. W.

    1980-01-01

    New modules, bulk data cards, and a DMAP sequence were added to NASTRAN to aid in the seismic analysis of nuclear power plant structures. These allow input consisting of acceleration time histories and result in the generation of acceleration floor response spectra. The resulting system contains numerous user-convenience features and is reasonably efficient.
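
Generating an acceleration response spectrum from an input time history, as the capability above does for floor spectra, can be sketched generically with unit-mass SDOF oscillators and the Newmark average-acceleration method; this is an illustrative sketch, not the NASTRAN implementation.

```python
import numpy as np

def accel_spectrum(ag, dt, freqs, zeta=0.05):
    """Absolute-acceleration response spectrum of a base acceleration
    history `ag`, computed with unit-mass damped SDOF oscillators and
    the Newmark average-acceleration method (gamma=1/2, beta=1/4)."""
    sa = []
    for f in freqs:
        wn = 2.0 * np.pi * f
        k, c = wn ** 2, 2.0 * zeta * wn              # unit mass: m = 1
        keff = k + 2.0 * c / dt + 4.0 / dt ** 2      # effective stiffness
        u = v = 0.0
        a = -ag[0]                                   # relative acceleration
        peak = abs(a + ag[0])
        for i in range(1, len(ag)):
            dp = -(ag[i] - ag[i - 1])                # incremental load
            dpe = dp + (4.0 / dt + 2.0 * c) * v + 2.0 * a
            du = dpe / keff
            dv = 2.0 * du / dt - 2.0 * v
            da = 4.0 * du / dt ** 2 - 4.0 * v / dt - 2.0 * a
            u, v, a = u + du, v + dv, a + da
            peak = max(peak, abs(a + ag[i]))         # absolute acceleration
        sa.append(peak)
    return np.array(sa)

# resonant 1 Hz base motion: the 1 Hz oscillator amplifies by roughly
# 1/(2*zeta) = 10, while a stiff 20 Hz oscillator just tracks the ground
t = np.arange(0.0, 20.0, 0.005)
ag = np.sin(2.0 * np.pi * 1.0 * t)
sa = accel_spectrum(ag, 0.005, [1.0, 20.0])
```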

  10. An analysis of seismic risk from a tourism point of view.

    PubMed

    Mäntyniemi, Päivi

    2012-07-01

    Global awareness of natural calamities increased after the destructive Indian Ocean tsunami of December 2004, largely because many foreigners lost their lives, especially in Thailand. This paper explores how best to communicate the seismic risk posed by different travel destinations to crisis management personnel in tourists' home countries. The analysis of seismic risk should be straightforward enough for non-specialists, yet powerful enough to identify the travel destinations that are most at risk. The output for each location is a point in 3D space composed of the natural and built-up environment and local tourism. The tourism-specific factors can be tailored according to the tourists' nationality. The necessary information can be collected from various directories and statistics, much of it available over the Internet. The output helps to illustrate the overall seismic risk conditions of different travel destinations, allows for comparison across destinations, and identifies the places that are most at risk.

  11. Critical Soil-Structure Interaction Analysis Considerations for Seismic Qualification of Safety Equipment

    SciTech Connect

    Hossain, Q A

    2004-03-04

    While developing seismic analysis models for buildings that support safety-related equipment, a number of issues should be considered to ensure that the input motions for performing seismic qualification of safety-related equipment are properly defined. These considerations are listed and discussed here with special attention to the effect and importance of the interaction among the foundation soil, the building structure, the equipment anchors, and the equipment structure. Typical industry practices are critically examined to assess their adequacy for determining the input motions for equipment seismic qualification. The features that are considered essential in a soil-structure interaction (SSI) model are described. Also, the effects of inappropriate treatment or representation of these features are discussed.

  12. Spatial Analysis of the Vulnerability Assessment to Meteorological Hazards in Korea

    NASA Astrophysics Data System (ADS)

    Jung, J.; Lee, J.; Kim, I.; Park, K.; Shin, J.; Kim, B.

    2013-12-01

    As climate change progresses, damages and casualties are expected to increase with the growing number and intensity of natural disasters. Many governments have already tried to reduce the adverse impacts of natural disasters by identifying regions susceptible to them. However, so many variables are involved that finding an appropriate index is difficult. In this study, vulnerable areas are investigated with four main factors (demographic, socioeconomic, climatological and geographic, and technological). Each factor shows a distinct spatial pattern, and some regions present low vulnerability values (demographic factor: south-west regions; climatological and geographic factor: south-east, north-east, and west coast; socioeconomic and technological factors: south-west and north-east). In particular, the demographic and the climatological and geographic factors show high spatial autocorrelation, reflecting spatially imbalanced economic ability and development. The extracted vulnerability index correlates strongly with the amount of damage per person (r = -0.607, 1-p = 0.999). Based on these results, we conclude that customized policies and strategies are required for each region to reduce vulnerability, and that the extracted vulnerability index provides a reliable representative measure for assessing vulnerability. Fig. 1. Spatial distribution of each of the four factors; darker colors indicate low vulnerability and brighter colors high vulnerability. Each region has its own values except for the technological factor. Fig. 2. Total vulnerability (left) and total vulnerability excluding the technological factor (right); most vulnerability values lie between -0.5 and 0.5.
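
The aggregation behind such a composite index is not spelled out in the abstract. Below is a minimal sketch, assuming z-score normalization of each factor and an equal-weight mean, together with the Pearson correlation used to compare the index against per-person damages; all function names are hypothetical.

```python
import numpy as np

def vulnerability_index(factors, weights=None):
    """Composite vulnerability index over regions: z-score each factor
    column, then take a weighted mean.  `factors` is (n_regions, n_factors)."""
    z = (factors - factors.mean(axis=0)) / factors.std(axis=0)
    if weights is None:
        weights = np.full(factors.shape[1], 1.0 / factors.shape[1])
    return z @ np.asarray(weights)

def pearson_r(x, y):
    """Pearson correlation, e.g. index vs. damages per person."""
    x, y = x - x.mean(), y - y.mean()
    return float(x @ y / np.sqrt((x @ x) * (y @ y)))
```

Z-scoring keeps a factor with large raw units (e.g. monetary damages) from dominating factors measured on small scales.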

  13. Shallow prospect evaluation in Shahbazpur structure using seismic attributes analysis, Southern Bangladesh.

    NASA Astrophysics Data System (ADS)

    Rahman, M.

    2015-12-01

    The Shahbazpur structure is located within the Hatia trough, a southern extension of the prolific Surma Basin, where all of the largest gas fields of Bangladesh lie. A method is established to map the structure precisely by interpreting four 2D seismic lines acquired over it. Moreover, direct hydrocarbon indicator (DHI)-related attributes were analyzed for further confirmation of the presence of hydrocarbons. To do this, synthetic generation, seismic-to-well tie, velocity modelling, and depth conversion were performed. The seismic attribute analysis used in this study is mostly related to bright-spot identification in the reservoir zones, as well as to identifying similar responses both below and above those zones. Seismic interpretation shows that the Shahbazpur structure is a roughly oval-shaped anticline with a simple four-way dip closure, which would be a good trap for hydrocarbon accumulation. A limited number of seismic attribute functions available in an academic version of the Petrel software were applied. Taking possible interpretation pitfalls into consideration, the attribute analysis confirmed that bright spots exist in the shallower part of the structure, above the present reservoir zones, which might constitute a potential shallow gas reserve. The bright spots are located within Shahbazpur sequence I of the Dupi Tila Group of Pleistocene age and Shahbazpur sequence II of the Tipam Group of Pleistocene-Pliocene age. This signature will play a very important role in planning the next well on the same structure to test the shallow accumulation of hydrocarbons. For a better understanding of this shallow reserve, it is suggested to acquire 3D seismic data over the Shahbazpur structure, which will help evaluate the hydrocarbon accumulation and identify gas migration pathways.

  14. Seismic Risk Perception compared with seismic Risk Factors

    NASA Astrophysics Data System (ADS)

    Crescimbene, Massimo; La Longa, Federica; Pessina, Vera; Pino, Nicola Alessandro; Peruzza, Laura

    2016-04-01

    The communication of natural hazards and their consequences is one of the more relevant ethical issues faced by scientists. In recent years, social studies have provided evidence that risk communication is strongly influenced by people's risk perception. In order to develop effective information and risk communication strategies, the perception of risks and its influencing factors should be known. A theory that offers an integrative approach to understanding and explaining risk perception is still missing. To explain risk perception, several perspectives and their interactions must be considered: social, psychological and cultural. This paper presents the results of a CATI survey on seismic risk perception in Italy, conducted by INGV researchers with funding from the DPC. We built a questionnaire to assess seismic risk perception, with particular attention to comparing perceived hazard, vulnerability and exposure with the real data on the same factors. The Seismic Risk Perception Questionnaire (SRP-Q) uses the semantic differential method, with opposite terms on a seven-point Likert scale. The questionnaire yields scores for five risk indicators: Hazard, Exposure, Vulnerability, People and Community, and Earthquake Phenomenon. It was administered by telephone interview (CATI) to a nationally representative sample of over 4,000 people in January-February 2015. Results show that risk perception appears to be underestimated for all the indicators considered. In particular, scores on the seismic Vulnerability factor are extremely low compared with the housing information provided by the respondents. Other data collected by the questionnaire concern earthquake information level, sources of information, earthquake occurrence with respect to other natural hazards, participation in risk reduction activities, and level of involvement. Research on risk perception aims to aid risk analysis and policy-making by
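
A sketch of how such semantic-differential responses might be scored into indicators. The paper's exact item-to-indicator mapping and coding are not given, so the names and the reverse-coding convention here are illustrative assumptions.

```python
import numpy as np

def indicator_scores(responses, item_map, reverse_items=()):
    """Score a semantic-differential questionnaire: items on a 1-7 scale,
    reverse-coded where the 'risky' pole sits on the left, averaged per
    indicator.  `responses` is (n_respondents, n_items); `item_map` maps
    an indicator name to a list of item column indices."""
    r = np.array(responses, dtype=float)      # copy so the caller's data is untouched
    for j in reverse_items:
        r[:, j] = 8.0 - r[:, j]               # flip a 1-7 item
    return {name: r[:, cols].mean(axis=1) for name, cols in item_map.items()}
```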

  15. Hydra—The National Earthquake Information Center’s 24/7 seismic monitoring, analysis, catalog production, quality analysis, and special studies tool suite

    USGS Publications Warehouse

    Patton, John M.; Guy, Michelle R.; Benz, Harley M.; Buland, Raymond P.; Erickson, Brian K.; Kragness, David S.

    2016-08-18

    This report provides an overview of the capabilities and design of Hydra, the global seismic monitoring and analysis system used for earthquake response and catalog production at the U.S. Geological Survey National Earthquake Information Center (NEIC). Hydra supports the NEIC’s worldwide earthquake monitoring mission in areas such as seismic event detection, seismic data insertion and storage, seismic data processing and analysis, and seismic data output. The Hydra system automatically identifies seismic phase arrival times and detects the occurrence of earthquakes in near-real time. The system integrates and inserts parametric and waveform seismic data into discrete events in a database for analysis. Hydra computes seismic event parameters, including locations, multiple magnitudes, moment tensors, and depth estimates. Hydra supports the NEIC’s 24/7 analyst staff with a suite of seismic analysis graphical user interfaces. In addition to the NEIC’s monitoring needs, the system supports the processing of aftershock and temporary deployment data, and supports the NEIC’s quality assurance procedures. The Hydra system continues to be developed to expand its seismic analysis and monitoring capabilities.

  16. Romanian Educational Seismic Network Project

    NASA Astrophysics Data System (ADS)

    Tataru, Dragos; Ionescu, Constantin; Zaharia, Bogdan; Grecu, Bogdan; Tibu, Speranta; Popa, Mihaela; Borleanu, Felix; Toma, Dragos; Brisan, Nicoleta; Georgescu, Emil-Sever; Dobre, Daniela; Dragomir, Claudiu-Sorin

    2013-04-01

    Romania is one of the most seismically active countries in Europe, with more than 500 earthquakes occurring every year. The seismic hazard of Romania is relatively high, and thus understanding earthquake phenomena and their effects at the Earth's surface represents an important step toward educating the population in the earthquake-affected regions of the country and raising awareness about earthquake risk and possible mitigation actions. In this direction, the first national educational project in the field of seismology has recently started in Romania: the ROmanian EDUcational SEISmic NETwork (ROEDUSEIS-NET) project. It involves four partners: the National Institute for Earth Physics as coordinator, the National Institute for Research and Development in Construction, Urban Planning and Sustainable Spatial Development "URBAN - INCERC" Bucharest, the Babeş-Bolyai University (Faculty of Environmental Sciences and Engineering) and the software firm "BETA Software". The project has many educational, scientific and social goals. The main educational objectives are: training students and teachers in the analysis and interpretation of seismological data, preparing several comprehensive educational materials, and designing and testing didactic activities using informatics and web-oriented tools. The scientific objective is to introduce into schools the use of advanced instruments and experimental methods that are usually restricted to research laboratories, with the main product being the creation of an earthquake waveform archive; a large amount of such data will be used by students and teachers for educational purposes. As for the social objectives, the project represents an effective instrument for informing and creating awareness of seismic risk, for experimenting with the efficacy of scientific communication, and for increasing the direct involvement of schools and the general public.
A network of nine seismic stations with SEP seismometers

  17. Vulnerability analysis in terms of food insecurity and poverty using GIS and remote sensing technology applied to Sri Lanka

    NASA Astrophysics Data System (ADS)

    Shahriar, Pervez M.; Ramachandran, Mahadevan; Mutuwatte, Lal

    2003-03-01

    It is becoming increasingly recognized that computer methods such as models and Geographic Information Systems (GIS) can be valuable tools for analyzing a geographical area in terms of its hazard vulnerability. Vulnerability is an important aspect of households' experience of poverty. The measurement and analysis of poverty, inequality and vulnerability are crucial for cognitive purposes (to know what the situation is), for analytical purposes (to understand the factors determining this situation), for policy-making purposes (to design interventions best adapted to the issues), and for monitoring and evaluation purposes (to assess whether current policies are effective and whether the situation is changing). Here, vulnerability is defined as the probability or risk today of being in poverty, or of falling deeper into poverty, in the future. Vulnerability is a key dimension of well-being, since it affects individuals' behavior (in terms of investment, production patterns and coping strategies) and their perception of their own situation. This study has been conducted in joint collaboration between the World Food Programme (WFP) and the International Water Management Institute (IWMI) in Sri Lanka, to identify regions and populations that are food insecure, to analyze the reasons for vulnerability to food insecurity in order to provide decision-makers with information on possible sectors of intervention, and to identify where and for whom food aid can be best utilized in Sri Lanka. This new approach integrates GIS and remote sensing with statistical packages, allowing consideration of more spatial and physical parameters - such as accessibility to economic resources, particularly land and the assets of the built environment, creating employment, and attracting investment to improve the quality and quantity of goods and services - so that the analysis represents the real scenario.
For this study, detailed topographic data are being used

  18. Illustrating the coupled human–environment system for vulnerability analysis: Three case studies

    PubMed Central

    Turner, B. L.; Matson, Pamela A.; McCarthy, James J.; Corell, Robert W.; Christensen, Lindsey; Eckley, Noelle; Hovelsrud-Broda, Grete K.; Kasperson, Jeanne X.; Kasperson, Roger E.; Luers, Amy; Martello, Marybeth L.; Mathiesen, Svein; Naylor, Rosamond; Polsky, Colin; Pulsipher, Alexander; Schiller, Andrew; Selin, Henrik; Tyler, Nicholas

    2003-01-01

    The vulnerability framework of the Research and Assessment Systems for Sustainability Program explicitly recognizes the coupled human–environment system and accounts for interactions in the coupling affecting the system's responses to hazards and its vulnerability. This paper illustrates the usefulness of the vulnerability framework through three case studies: the tropical southern Yucatán, the arid Yaqui Valley of northwest Mexico, and the pan-Arctic. Together, these examples illustrate the role of external forces in reshaping the systems in question and their vulnerability to environmental hazards, as well as the different capacities of stakeholders, based on their access to social and biophysical capital, to respond to the changes and hazards. The framework proves useful in directing attention to the interacting parts of the coupled system and helps identify gaps in information and understanding relevant to reducing vulnerability in the systems as a whole. PMID:12815106

  19. Illustrating the coupled human-environment system for vulnerability analysis: three case studies.

    PubMed

    Turner, B L; Matson, Pamela A; McCarthy, James J; Corell, Robert W; Christensen, Lindsey; Eckley, Noelle; Hovelsrud-Broda, Grete K; Kasperson, Jeanne X; Kasperson, Roger E; Luers, Amy; Martello, Marybeth L; Mathiesen, Svein; Naylor, Rosamond; Polsky, Colin; Pulsipher, Alexander; Schiller, Andrew; Selin, Henrik; Tyler, Nicholas

    2003-07-08

    The vulnerability framework of the Research and Assessment Systems for Sustainability Program explicitly recognizes the coupled human-environment system and accounts for interactions in the coupling affecting the system's responses to hazards and its vulnerability. This paper illustrates the usefulness of the vulnerability framework through three case studies: the tropical southern Yucatán, the arid Yaqui Valley of northwest Mexico, and the pan-Arctic. Together, these examples illustrate the role of external forces in reshaping the systems in question and their vulnerability to environmental hazards, as well as the different capacities of stakeholders, based on their access to social and biophysical capital, to respond to the changes and hazards. The framework proves useful in directing attention to the interacting parts of the coupled system and helps identify gaps in information and understanding relevant to reducing vulnerability in the systems as a whole.

  20. An Integrated Approach for Urban Earthquake Vulnerability Analyses

    NASA Astrophysics Data System (ADS)

    Düzgün, H. S.; Yücemen, M. S.; Kalaycioglu, H. S.

    2009-04-01

    -economical, structural, coastal, ground condition, organizational vulnerabilities, as well as accessibility to critical services within the framework. The proposed framework has the following eight components: seismic hazard analysis, soil response analysis, tsunami inundation analysis, structural vulnerability analysis, socio-economic vulnerability analysis, accessibility to critical services, GIS-based integrated vulnerability assessment, and visualization of vulnerabilities in a 3D virtual city model. The integrated model for the various vulnerabilities of the urban area is developed in a GIS environment, using the individual vulnerability assessments for the considered elements at risk, and serves as the backbone of the spatial decision support system. The stages followed in the model are: determination of a common mapping unit for each aspect of urban earthquake vulnerability; formation of a geo-database for the vulnerabilities; evaluation of urban vulnerability based on multi-attribute utility theory with various weighting algorithms; and mapping of the evaluated integrated earthquake risk in geographic information systems (GIS) at the neighborhood scale. The framework is also applicable to larger geographical mapping scales, for example the building scale. When illustrating the results at the building scale, 3D visualizations with remote sensing data are used so that decision-makers can easily interpret the outputs. The proposed vulnerability assessment framework is flexible and can easily be applied to urban environments at various geographical scales with different mapping units. The total vulnerability maps obtained for the urban area provide a baseline for decision makers to develop risk reduction strategies. Moreover, as several aspects of the elements at risk in an urban area are considered through the vulnerability analyses, the effect of changes in vulnerability conditions on the total can easily be determined.
The developed approach also enables decision makers to
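
The multi-attribute utility aggregation named in the abstract can be sketched as follows, under assumed min-max utilities and rank-sum weights (one of several weighting algorithms such a framework could use; the function names are illustrative, not the authors' implementation).

```python
import numpy as np

def minmax_utility(x, higher_is_worse=True):
    """Map raw attribute values for mapping units onto [0, 1] utilities."""
    u = (x - x.min()) / (x.max() - x.min())
    return u if higher_is_worse else 1.0 - u

def rank_sum_weights(ranks):
    """Rank-sum weights: w_j = (n - r_j + 1) / sum(...); rank 1 = most important."""
    n = len(ranks)
    raw = n - np.asarray(ranks, dtype=float) + 1.0
    return raw / raw.sum()

def integrated_vulnerability(attributes, ranks):
    """Weighted additive utility per mapping unit
    (rows = mapping units, columns = vulnerability attributes)."""
    utils = np.column_stack([minmax_utility(attributes[:, j])
                             for j in range(attributes.shape[1])])
    return utils @ rank_sum_weights(ranks)
```

Because every attribute is rescaled to [0, 1] and the weights sum to one, the integrated score stays in [0, 1], which makes the resulting maps directly comparable across mapping units.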

  1. Seismic analysis of the large 70-meter antenna, part 1: Earthquake response spectra versus full transient analysis

    NASA Technical Reports Server (NTRS)

    Kiedron, K.; Chian, C. T.

    1985-01-01

    As a check on structure safety aspects, two approaches in seismic analysis for the large 70-m antennas are presented. The first approach, commonly used by civil engineers, utilizes known recommended design response spectra. The second approach, which is the full transient analysis, is versatile and applicable not only to earthquake loading but also to other dynamic forcing functions. The results obtained at the fundamental structural frequency show that the two approaches are in good agreement with each other and both approaches show a safe design. The results also confirm past 64-m antenna seismic studies done by the Caltech Seismology Staff.

  2. HANFORD DOUBLE SHELL TANK (DST) THERMAL & SEISMIC PROJECT SEISMIC ANALYSIS IN SUPPORT OF INCREASED LIQUID LEVEL IN 241-AP TANK FARMS

    SciTech Connect

    MACKEY TC; ABBOTT FG; CARPENTER BG; RINKER MW

    2007-02-16

    The overall scope of the project is to complete an up-to-date comprehensive analysis of record of the DST System at Hanford. The "Double-Shell Tank (DST) Integrity Project - DST Thermal and Seismic Project" is in support of Tri-Party Agreement Milestone M-48-14.

  3. Frequency Dependent Polarization Analysis of Ambient Seismic Noise Recorded at Broadband Seismometers

    NASA Astrophysics Data System (ADS)

    Koper, K.; Hawley, V.

    2010-12-01

    Analysis of ambient seismic noise is becoming increasingly relevant to modern seismology. Advances in computational speed and storage have made it feasible to analyze years and even decades of continuous seismic data in short amounts of time. It is therefore now possible to perform longitudinal studies of station performance in order to identify degradation or mis-installation of seismic equipment. Long-term noise analysis also provides insight into the evolution of the ocean wave climate, specifically whether the frequency and intensity of storms have changed as global temperatures have changed. Here we present a new approach to polarization analysis of seismic noise recorded by three-component seismometers. Essentially, eigen-decomposition of the 3-by-3 Hermitian spectral matrix associated with a sliding window of data is applied to yield various polarization attributes as a function of time and frequency. This in turn yields fundamental information about the composition of seismic noise, such as the extent to which it is polarized, its mode of propagation, and the direction from which it arrives at the seismometer. The polarization attributes can be viewed as a function of time or binned over 2D frequency-time space to deduce regularities in the ambient noise that are unbiased by transient signals from earthquakes and explosions. We applied the algorithm to continuous data recorded in 2009 by the seismic station SLM, located in central North America. A rich variety of noise sources was observed. At low frequencies (<0.05 Hz) we observed a tilt-related signal that showed some elliptical motion in the horizontal plane. In the microseism band of 0.05-0.25 Hz, we observed Rayleigh energy arriving from the northeast, but with three distinct peaks instead of the classic single- and double-frequency peaks. At intermediate frequencies of 0.5-2.0 Hz, the noise was dominated by non-fundamental-mode Rayleigh energy, most likely P and Lg waves. 
At the highest frequencies (>3
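
A minimal sketch of the core computation described above (not the authors' code): build the 3-by-3 Hermitian spectral matrix from sliding windows of three-component data, eigen-decompose it per frequency, and reduce the eigenvalues to a Samson-style degree-of-polarization attribute. Window lengths and the specific attribute chosen here are assumptions.

```python
import numpy as np

def degree_of_polarization(z, n, e, fs, win=256, step=128):
    """Average the 3x3 Hermitian spectral matrix S_jk(f) = X_j(f) X_k(f)*
    over sliding Hann-tapered windows, then reduce its eigenvalues to a
    degree of polarization: 1 for a single polarized mode, near 0 for
    isotropic noise."""
    taper = np.hanning(win)
    S = 0.0
    count = 0
    for i0 in range(0, len(z) - win + 1, step):
        X = np.array([np.fft.rfft(taper * c[i0:i0 + win]) for c in (z, n, e)])
        S = S + np.einsum('if,jf->fij', X, X.conj())   # per-frequency outer products
        count += 1
    S = S / count
    freqs = np.fft.rfftfreq(win, 1.0 / fs)
    lam = np.linalg.eigvalsh(S)[:, ::-1]               # descending, real (Hermitian)
    tr = lam.sum(axis=1)
    num = ((lam[:, 0] - lam[:, 1])**2 + (lam[:, 0] - lam[:, 2])**2
           + (lam[:, 1] - lam[:, 2])**2)
    with np.errstate(invalid='ignore', divide='ignore'):
        dop = np.where(tr > 0.0, num / (2.0 * tr**2), 0.0)
    return freqs, dop
```

Averaging the outer products over many windows is what distinguishes a genuinely polarized mode (rank-one matrix, eigenvalues [L, 0, 0]) from isotropic noise, whose averaged matrix approaches a multiple of the identity.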

  4. Site specific seismic hazard analysis at the DOE Kansas City Plant

    SciTech Connect

    Lynch, D.T.; Drury, M.A.; Meis, R.C.; Bieniawski, A.; Savy, J.B.; Llopis, J.L.; Constantino, C.; Hashimoto, P.S.; Campbell, K.W.

    1995-10-01

    A site specific seismic hazard analysis is being conducted for the Kansas City Plant to support an on-going structural evaluation of existing buildings. This project is part of the overall review of facilities being conducted by DOE. The seismic hazard was probabilistically defined at the theoretical rock outcrop by Lawrence Livermore National Laboratory. The U.S. Army Engineer Waterways Experiment Station conducted a subsurface site investigation to characterize in situ S-wave velocities and other subsurface physical properties related to the geology in the vicinity of the Main Manufacturing Building (MMB) at the Bannister Federal Complex. The test program consisted of crosshole S-wave surveys, seismic cone penetrometer testing, and laboratory soil analyses. The information acquired from this investigation was used in a site response analysis by City College of New York to determine the earthquake motion at grade. Ground response spectra appropriate for design and evaluation of Performance Category 1 and 2 structures, systems, and components were recommended. Effects of seismic loadings on the buildings will be used to aid in designing any structural modifications.

  5. First results of an ambient seismic noise analysis in western Corinth Gulf (Greece)

    NASA Astrophysics Data System (ADS)

    Giannopoulos, Dimitrios; Paraskevopoulos, Paraskevas; Sokos, Efthimios; Tselentis, G.-Akis

    2015-04-01

    We present the preliminary results of an ambient seismic noise analysis performed in the western Corinth Gulf, Greece. The Corinth Gulf is a continental rift which separates the central Greek mainland from Peloponnese. The rift is approximately 120 km long and 10-20 km wide, with a WNW-ESE orientation, extending from the Gulf of Patras in the west, to the Gulf of Alkionides in the east. It is considered as one of the most active extensional intra-continental rifts in the world, with the geodetically measured rates of extension varying from ~5 mm/yr at the eastern part, to ~15 mm/yr at the western part. We used data from three-component broad-band seismic stations operated under the framework of the Hellenic Unified Seismological Network (HUSN) and the Corinth Rift Laboratory (CRL). After the classical processing of continuous ambient seismic noise recordings, we used both auto-correlation and cross-correlation functions of single stations and station pairs, respectively, in order to retrieve empirical Green's functions (EGFs) of surface waves and estimate relative velocity changes. For estimating the relative velocity changes we used the moving-window cross spectrum analysis (MWCS) technique. This is the first attempt to characterize the ambient seismic noise properties in the area and study the possible relation between the detected relative velocity changes and the occurrence of moderate or strong earthquakes in the study area.
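
A compact sketch of the MWCS idea used above (assumed parameters, not the authors' implementation): in sliding windows along a reference and a current correlation function, the cross-spectrum phase is regressed on frequency to give a delay per window, and the trend of those delays versus lag gives the relative velocity change.

```python
import numpy as np

def mwcs_dvv(ref, cur, fs, win=256, step=128, fmin=1.0, fmax=3.0):
    """Relative velocity change dv/v between a reference and a current
    correlation function via moving-window cross-spectrum analysis.
    With phi = angle(C R*), a homogeneous change stretching
    cur(t) ~ ref((1 + dv/v) t) makes the fitted per-window delay grow
    linearly as dv/v times the window's lag."""
    half = len(ref) // 2
    lags = (np.arange(len(ref)) - half) / fs
    taper = np.hanning(win)
    f = np.fft.rfftfreq(win, 1.0 / fs)
    band = (f >= fmin) & (f <= fmax)
    w2pf = 2.0 * np.pi * f[band]
    centers, shifts = [], []
    for i0 in range(0, len(ref) - win + 1, step):
        R = np.fft.rfft(taper * ref[i0:i0 + win])
        C = np.fft.rfft(taper * cur[i0:i0 + win])
        X = C[band] * np.conj(R[band])
        phi, wgt = np.angle(X), np.abs(X)
        # amplitude-weighted least squares of phi = 2*pi*f*m through the origin
        mi = np.sum(wgt * w2pf * phi) / np.sum(wgt * w2pf**2)
        centers.append(lags[i0 + win // 2])
        shifts.append(mi)
    t, m = np.array(centers), np.array(shifts)
    return np.sum(t * m) / np.sum(t * t)       # slope of delay vs. lag = dv/v
```

The band limits must keep the largest expected phase below pi at the longest lag, otherwise the cross-spectrum phase wraps and the per-window fit breaks down.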

  6. Seismic slope-performance analysis: from hazard map to decision support system

    USGS Publications Warehouse

    Miles, Scott B.; Keefer, David K.; Ho, Carlton L.

    1999-01-01

    In response to the growing recognition of engineers and decision-makers of the regional effects of earthquake-induced landslides, this paper presents a general approach to conducting seismic landslide zonation, based on the popular Newmark's sliding block analogy for modeling coherent landslides. Four existing models based on the sliding block analogy are compared. The comparison shows that the models forecast notably different levels of slope performance. Considering this discrepancy along with the limitations of static maps as a decision tool, a spatial decision support system (SDSS) for seismic landslide analysis is proposed, which will support investigations over multiple scales for any number of earthquake scenarios and input conditions. Most importantly, the SDSS will allow use of any seismic landslide analysis model and zonation approach. Developments associated with the SDSS will produce an object-oriented model for encapsulating spatial data, an object-oriented specification to allow construction of models using modular objects, and a direct-manipulation, dynamic user-interface that adapts to the particular seismic landslide model configuration.
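
Newmark's sliding-block analogy, on which the compared models build, reduces to a short integration loop. Below is a minimal rigid-block sketch (not any of the four models' calibrated regression forms): the block slides whenever ground acceleration exceeds the critical (yield) acceleration, and relative velocity is integrated until the block re-sticks.

```python
import numpy as np

def newmark_displacement(acc, dt, ac):
    """Cumulative Newmark rigid sliding-block displacement for a ground
    acceleration record `acc` (same units as the critical acceleration
    `ac`), sampled at interval `dt`.  One-directional (downslope) sliding."""
    v = 0.0   # relative sliding velocity
    d = 0.0   # cumulative downslope displacement
    for a in acc:
        if v > 0.0 or a > ac:
            v += (a - ac) * dt        # relative acceleration = a - ac while sliding
            v = max(v, 0.0)           # the block re-sticks; no backward sliding
            d += v * dt
    return d
```

A rectangular pulse of amplitude 2*ac and duration T yields a triangular velocity history and a closed-form displacement of ac*T^2, a standard check on implementations of this analogy.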

  7. An analysis of seismic hazard in the Upper Rhine Graben enlightened by the example of the New Madrid seismic zone.

    NASA Astrophysics Data System (ADS)

    Doubre, Cécile; Masson, Frédéric; Mazzotti, Stéphane; Meghraoui, Mustapha

    2014-05-01

    Seismic hazard in "stable" continental regions and low-level deformation zones is one of the most difficult issues to address in the Earth sciences. In these zones, instrumental and historical seismicity are not well known (sparse seismic networks, seismic cycles too long to be covered by human history, episodic seismic activity) and many active structures remain poorly characterized or unknown. This is the case of the Upper Rhine Graben, the central segment of the European Cenozoic rift system (ECRIS) of Oligocene age, which extends from the North Sea through Germany and France to the Mediterranean coast over a distance of some 1100 km. Even though this region has already experienced some destructive earthquakes, its present-day seismicity is moderate and the deformation observed by geodesy is very small (below the current measurement accuracy). The strain rate does not exceed 10^-10 yr^-1 and paleoseismic studies indicate an average return period of 2.5 to 3 x 10^3 years for large earthquakes. The largest earthquake known for this zone is the 1356 Basel earthquake, with a magnitude generally estimated at about 6.5 (Meghraoui et al., 2001) but recently re-evaluated to between 6.7 and 7.1 (Fäh et al., 2009). A comparison of the Upper Rhine Graben with equivalent regions around the world could help improve our evaluation of the seismic hazard of this region. This is the case of the New Madrid seismic zone, one of the best-studied intraplate systems in the central USA, which experienced M 7.0 - 7.5 earthquakes in 1811-1812 and shares several characteristics with the Upper Rhine Graben, i.e. the general framework of inherited geological structures (reactivation of a failed rift/graben), seismicity patterns (spatial variability of small and large earthquakes), the null or low rate of deformation, and the location in a "stable" continental interior. Looking at the Upper Rhine Graben as an analogue of the New Madrid seismic zone, we can re-evaluate its seismic hazard and consider the

  8. Analysis of ecological vulnerability based on landscape pattern and ecological sensitivity: a case of Duerbete County

    NASA Astrophysics Data System (ADS)

    Jiang, Miao; Gao, Wei; Chen, Xiuwan; Zhang, Xianfeng; Wei, Wenxia

    2008-08-01

    Ecological vulnerability evaluation has important practical significance and scientific value. In this study, with the support of remote sensing and Geographic Information Systems, we use TM images, distribution maps of sand desertification and soil salinization, and related geographic information, and adopt a combined landscape-pattern and ecosystem-sensitivity approach to assess the ecological vulnerability of Duerbete County. We consider the following five factors to develop the model: (1) reciprocal of fractal dimension (FD'), (2) isolation (FI), (3) fragmentation (FN), (4) sensitivity to sand desertification (SD), and (5) sensitivity to soil salinization (SA). We then build the evaluation model and calculate the vulnerability of each landscape type of Duerbete. Through Kriging interpolation, we obtain the regional eco-environment vulnerability of the whole county, and thereby evaluate this cropping-pastoral interlacing region, Duerbete County. The conclusions are: (1) the vulnerability of the landscape types is in the following decreasing order: grassland > cropland > unused area > water area > construction area > wattenmeer > reed bed > woodland > paddy field; (2) there are significant positive relationships between VI and FN, VI and SD, SD and FN, and SA and FN, suggesting that FN and SD have considerable impact on eco-environmental vulnerability; (3) with the combination of FN, SD and SA, the regional eco-environment vulnerability can be evaluated well. The result is reasonable and can support ecological construction.

  9. Seismic Background Noise Analysis of Brtr (PS-43) Array

    NASA Astrophysics Data System (ADS)

    Bakir, M. E.; Meral Ozel, N.; Semin, K. U.

    2014-12-01

    The seismic background noise variation of the BRTR array, composed of two sub-arrays located in Ankara and in Kırıkkale-Keskin, has been investigated by calculating Power Spectral Densities and Probability Density Functions for seasonal and diurnal noise variations between 2005 and 2011. PSDs were computed within the frequency range of 100 s - 10 Hz. The results show little change in noise conditions with time and location. In particular, noise-level changes were observed at 3-5 Hz in the diurnal variations at the Keskin array, and there is a 5-7 dB difference between day and night in the cultural noise band (1-10 Hz). On the other hand, noise levels of the medium-period array are high in the 1-2 Hz frequency range. High noise levels were observed during working hours compared to night-time in the cultural noise band. The seasonal background noise variations at the two sites are also very similar to each other. Since both arrays consist of borehole instruments and are located away from the coasts, we saw only a small change in noise levels caused by microseisms. Comparison between the Keskin short-period array and the Ankara medium-period array shows that the Keskin array is quieter than the Ankara array.
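
The PSD/PDF machinery referenced above (in the style of McNamara and Buland) can be sketched as Hann-tapered, 50%-overlapping windowed PSD estimates in dB, followed by a histogram per frequency bin. The normalization and segment lengths below are assumptions, not the authors' processing parameters.

```python
import numpy as np

def welch_psd_stack(x, fs, nperseg=1024):
    """Per-window one-sided PSD estimates in dB: Hann taper, 50% overlap,
    each segment demeaned.  Rows are time windows, columns frequencies."""
    taper = np.hanning(nperseg)
    norm = fs * (taper**2).sum()
    rows = []
    for i0 in range(0, len(x) - nperseg + 1, nperseg // 2):
        seg = x[i0:i0 + nperseg] - np.mean(x[i0:i0 + nperseg])
        P = np.abs(np.fft.rfft(taper * seg))**2 / norm
        P[1:-1] *= 2.0                          # one-sided spectrum
        rows.append(10.0 * np.log10(P + 1e-30))
    freqs = np.fft.rfftfreq(nperseg, 1.0 / fs)
    return freqs, np.array(rows)

def noise_pdf(psd_rows_db, db_edges):
    """Probability density of PSD values at each frequency, from the
    stack of per-window PSD estimates."""
    pdf = np.empty((psd_rows_db.shape[1], len(db_edges) - 1))
    for j in range(psd_rows_db.shape[1]):
        counts, _ = np.histogram(psd_rows_db[:, j], bins=db_edges)
        pdf[j] = counts / max(counts.sum(), 1)
    return pdf
```

Binning the whole stack of estimates, instead of averaging it, is what lets diurnal and seasonal modes show up as separate ridges in the PDF.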

  10. Seismic Background Noise Analysis of BRTR (PS-43) Array

    NASA Astrophysics Data System (ADS)

    Ezgi Bakir, Mahmure; Meral Ozel, Nurcan; Umut Semin, Korhan

    2015-04-01

    The seismic background noise variation of the BRTR array, composed of two sub-arrays located in Ankara and in Ankara-Keskin, has been investigated by calculating Power Spectral Densities and Probability Density Functions for seasonal and diurnal noise variations between 2005 and 2011. PSDs were computed within the frequency range of 100 s - 10 Hz. The results show little change in noise conditions over time and between locations. In particular, noise level changes were observed at 3-5 Hz in the diurnal variations at the Keskin array, with a 5-7 dB difference between day and night in the cultural noise band (1-10 Hz). On the other hand, noise levels of the medium-period array are higher in the 1-2 Hz range than those of the short-period array. High noise levels were observed during daily working hours compared to night-time in the cultural noise band. The seasonal background noise variations at the two sites are also very similar to each other. Since these stations are borehole installations located away from the coasts, only a small change in noise levels caused by microseisms was observed. Comparison between the Keskin short-period array and the Ankara medium-period array shows that the Keskin array is quieter than the Ankara array.

  11. A Comparison of seismic instrument noise coherence analysis techniques

    USGS Publications Warehouse

    Ringler, A.T.; Hutt, C.R.; Evans, J.R.; Sandoval, L.D.

    2011-01-01

    The self-noise of a seismic instrument is a fundamental characteristic used to evaluate the quality of the instrument. It is important to be able to measure this self-noise robustly, to understand how differences among test configurations affect the tests, and to understand how different processing techniques and isolation methods (from nonseismic sources) can contribute to differences in results. We compare two popular coherence methods used for calculating incoherent noise, which is widely used as an estimate of instrument self-noise (incoherent noise and self-noise are not strictly identical but in observatory practice are approximately equivalent; Holcomb, 1989; Sleeman et al., 2006). Beyond directly comparing these two coherence methods on similar models of seismometers, we compare how small changes in test conditions can contribute to incoherent-noise estimates. These conditions include timing errors, signal-to-noise ratio changes (ratios between background noise and instrument incoherent noise), relative sensor locations, misalignment errors, processing techniques, and different configurations of sensor types.
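
    Assuming a common coherent input and independent sensor noise, a minimal two-sensor incoherent-noise estimate in the spirit of Holcomb (1989) can be sketched with SciPy; the paper's comparisons use more elaborate variants and real test configurations, so this is an illustration only:

```python
import numpy as np
from scipy.signal import coherence, welch

def incoherent_noise(x, y, fs, nperseg=1024):
    """Two-sensor estimate: the part of sensor x's PSD that is not
    coherent with colocated sensor y (taken as x's self-noise)."""
    f, cxy = coherence(x, y, fs=fs, nperseg=nperseg)  # magnitude-squared
    _, pxx = welch(x, fs=fs, nperseg=nperseg)
    return f, pxx * (1.0 - np.sqrt(cxy))

# Two "sensors" recording the same ground motion plus independent noise
rng = np.random.default_rng(1)
common = rng.standard_normal(200_000)
x = common + 0.1 * rng.standard_normal(common.size)
y = common + 0.1 * rng.standard_normal(common.size)
f, nxx = incoherent_noise(x, y, fs=100.0)
```

    With a noise-to-signal amplitude ratio of 0.1, the recovered incoherent level approximates the known noise PSD of each sensor; timing or alignment errors of the kind the paper studies would bias the coherence and inflate this estimate.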

  12. Uncertainty analysis in vulnerability estimations for elements at risk- a review of concepts and some examples on landslides

    NASA Astrophysics Data System (ADS)

    Ciurean, R. L.; Glade, T.

    2012-04-01

    Decision under uncertainty is a constant of everyday life and an important component of risk management and governance. Recently, experts have emphasized the importance of quantifying uncertainty in all phases of landslide risk analysis. Due to its multi-dimensional and dynamic nature, (physical) vulnerability is inherently complex, and the "degree of loss" estimates are imprecise and to some extent even subjective. Uncertainty analysis introduces quantitative modeling approaches that allow for a more explicitly objective output, improving the risk management process as well as enhancing communication between various stakeholders for better risk governance. This study presents a review of concepts for uncertainty analysis in the vulnerability of elements at risk to landslides. Different semi-quantitative and quantitative methods are compared based on their feasibility in real-world situations, hazard dependency, process stage in vulnerability assessment (i.e. input data, model, output), and applicability within an integrated landslide hazard and risk framework. The resulting observations will help to identify current gaps and future needs in vulnerability assessment, including estimation of uncertainty propagation, transferability of the methods, and development of visualization tools, but also address basic questions such as what uncertainty is and how it can be quantified or treated in a reliable and reproducible way.
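
    As a toy illustration of quantitative uncertainty propagation (not a method from the review), a Monte Carlo sketch that pushes hazard and model uncertainty through a hypothetical degree-of-loss curve:

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Hypothetical inputs: uncertain hazard intensity and an uncertain
# shape parameter of a toy vulnerability (degree-of-loss) curve
intensity = rng.lognormal(mean=0.0, sigma=0.4, size=n)
steepness = rng.normal(loc=2.0, scale=0.3, size=n)

# Degree of loss in [0, 1]; both input uncertainties propagate to the output
loss = 1.0 - np.exp(-intensity ** steepness)
p5, p50, p95 = np.percentile(loss, [5, 50, 95])
```

    The spread between the 5th and 95th percentiles of the simulated loss is the propagated uncertainty band that a deterministic "degree of loss" point estimate would hide.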

  13. Development of adaptive seismic isolators for ultimate seismic protection of civil structures

    NASA Astrophysics Data System (ADS)

    Li, Jianchun; Li, Yancheng; Li, Weihua; Samali, Bijan

    2013-04-01

    Base isolation is the most popular seismic protection technique for civil engineering structures. However, research has revealed that the traditional base isolation system, due to its passive nature, is vulnerable to two kinds of earthquakes, i.e. near-fault and far-fault earthquakes. A great deal of effort has been dedicated to improving the performance of the traditional base isolation system for these two types of earthquakes. This paper presents a recent research breakthrough on the development of a novel adaptive seismic isolation system, in the quest for ultimate protection of civil structures, utilizing the field-dependent property of the magnetorheological elastomer (MRE). A novel adaptive seismic isolator was developed as the key element of the smart seismic isolation system. The isolator contains a unique laminated structure of steel and MR elastomer layers, which enables large-scale civil engineering applications, and a solenoid to provide a sufficient and uniform magnetic field for energizing the field-dependent property of the MR elastomers. With the controllable shear modulus/damping of the MR elastomer, the developed adaptive seismic isolator possesses a controllable lateral stiffness while maintaining adequate vertical loading capacity. In this paper, a comprehensive review of the development of the adaptive seismic isolator is presented, including the design, analysis and testing of two prototypical adaptive seismic isolators utilizing two different MRE materials. Experimental results show that the first prototypical MRE seismic isolator can provide a stiffness increase of up to 37.49%, while the second provides a remarkable lateral stiffness increase of up to 1630%. Such a range of controllable stiffness makes the seismic isolator highly practical for developing new adaptive base isolation systems utilizing either semi-active or smart passive controls.

  14. Seismic fragility assessment of RC frame structure designed according to modern Chinese code for seismic design of buildings

    NASA Astrophysics Data System (ADS)

    Wu, D.; Tesfamariam, S.; Stiemer, S. F.; Qin, D.

    2012-09-01

    Following several damaging earthquakes in China, research has been devoted to finding the causes of the collapse of reinforced concrete (RC) buildings and to studying the vulnerability of existing buildings. The Chinese Code for Seismic Design of Buildings (CCSDB) has evolved over time; however, earthquake-induced damage is still reported for newly designed RC buildings. Thus, to investigate the modern Chinese seismic design code, three low-, mid- and high-rise RC frames were designed according to the 2010 CCSDB, and the corresponding vulnerability curves were derived by computing a probabilistic seismic demand model (PSDM). The PSDM was computed by carrying out nonlinear time history analysis using thirty ground motions obtained from the Pacific Earthquake Engineering Research Center. Finally, the PSDM was used to generate fragility curves for immediate occupancy, significant damage, and collapse prevention damage levels. Results of the vulnerability assessment indicate that the seismic demands on the three frames designed according to the 2010 CCSDB meet the seismic requirements and are at approximately the same safety level.
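
    A cloud-analysis PSDM of the kind described (a log-linear fit of demand versus intensity, with a lognormal fragility built on it) can be sketched as follows; the synthetic data and parameter values are illustrative only, not those of the three frames:

```python
import numpy as np
from scipy.stats import norm

def fit_psdm(im, edp):
    """Cloud-analysis PSDM: ln(EDP) = ln(a) + b * ln(IM) + error."""
    b, ln_a = np.polyfit(np.log(im), np.log(edp), 1)
    resid = np.log(edp) - (ln_a + b * np.log(im))
    beta = resid.std(ddof=2)          # record-to-record dispersion
    return ln_a, b, beta

def fragility(im, ln_a, b, beta, capacity):
    """P(demand >= capacity | IM), with lognormal demand about the PSDM fit."""
    return norm.cdf((ln_a + b * np.log(im) - np.log(capacity)) / beta)

# Synthetic "cloud" of (IM, EDP) pairs, e.g. drift ratio vs Sa
rng = np.random.default_rng(4)
im = np.exp(rng.uniform(-2.0, 1.0, size=2000))
edp = 0.02 * im * np.exp(0.3 * rng.standard_normal(im.size))
ln_a, b, beta = fit_psdm(im, edp)
```

    Evaluating `fragility` at the drift capacities associated with immediate occupancy, significant damage, and collapse prevention yields the three fragility curves.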

  15. Development of hazard-compatible building fragility and vulnerability models

    USGS Publications Warehouse

    Karaca, E.; Luco, N.

    2008-01-01

    We present a methodology for transforming the structural and non-structural fragility functions in HAZUS into a format that is compatible with conventional seismic hazard analysis information. The methodology makes use of the building capacity (or pushover) curves and related building parameters provided in HAZUS. Instead of the capacity spectrum method applied in HAZUS, building response is estimated by inelastic response history analysis of corresponding single-degree-of-freedom systems under a large number of earthquake records. Statistics of the building response are used with the damage state definitions from HAZUS to derive fragility models conditioned on spectral acceleration values. Using the developed fragility models for structural and nonstructural building components, with corresponding damage state loss ratios from HAZUS, we also derive building vulnerability models relating spectral acceleration to repair costs. Whereas in HAZUS the structural and nonstructural damage states are treated as if they are independent, our vulnerability models are derived assuming "complete" nonstructural damage whenever the structural damage state is complete. We show the effects of considering this dependence on the final vulnerability models. The use of spectral acceleration (at selected vibration periods) as the ground motion intensity parameter, coupled with the careful treatment of uncertainty, makes the new fragility and vulnerability models compatible with conventional seismic hazard curves and hence useful for extensions to probabilistic damage and loss assessment.
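
    The combination of in-state probabilities with damage-state loss ratios described above can be sketched as follows; the fragility parameters and loss ratios below are hypothetical placeholders, not values from HAZUS:

```python
import numpy as np
from scipy.stats import norm

def mean_loss_ratio(sa, medians, betas, loss_ratios):
    """Expected repair-cost ratio at spectral acceleration `sa`, given
    lognormal fragilities for ordered damage states and per-state loss ratios."""
    medians = np.asarray(medians, float)
    betas = np.asarray(betas, float)
    p_exceed = norm.cdf(np.log(sa / medians) / betas)
    # probability of being exactly in each state = successive differences
    p_state = p_exceed - np.append(p_exceed[1:], 0.0)
    return float(p_state @ np.asarray(loss_ratios))

# Hypothetical slight/moderate/extensive/complete parameters
mlr = mean_loss_ratio(0.8, [0.2, 0.4, 0.8, 1.6], [0.6] * 4,
                      [0.02, 0.1, 0.5, 1.0])
```

    Repeating this over a grid of spectral accelerations traces out the vulnerability model; the paper's structural/nonstructural dependence assumption would additionally force the nonstructural state to "complete" whenever the structural one is.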

  16. Large-scale seismic signal analysis with Hadoop

    NASA Astrophysics Data System (ADS)

    Addair, T. G.; Dodge, D. A.; Walter, W. R.; Ruppert, S. D.

    2014-05-01

    In seismology, waveform cross correlation has been used for years to produce high-precision hypocenter locations and for sensitive detectors. Because correlated seismograms generally are found only at small hypocenter separation distances, correlation detectors have historically been reserved for spotlight purposes. However, many regions have been found to produce large numbers of correlated seismograms, and there is growing interest in building next-generation pipelines that employ correlation as a core part of their operation. In an effort to better understand the distribution and behavior of correlated seismic events, we have cross correlated a global dataset consisting of over 300 million seismograms. This was done using a conventional distributed cluster, and required 42 days. In anticipation of processing much larger datasets, we have re-architected the system to run as a series of MapReduce jobs on a Hadoop cluster. In doing so we achieved a factor of 19 performance increase on a test dataset. We found that fundamental algorithmic transformations were required to achieve the maximum performance increase. Whereas in the original IO-bound implementation, we went to great lengths to minimize IO, in the Hadoop implementation where IO is cheap, we were able to greatly increase the parallelism of our algorithms by performing a tiered series of very fine-grained (highly parallelizable) transformations on the data. Each of these MapReduce jobs required reading and writing large amounts of data. But, because IO is very fast, and because the fine-grained computations could be handled extremely quickly by the mappers, the net was a large performance gain.
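
    The core similarity measure behind such pipelines is the normalized cross-correlation of two seismograms; a minimal NumPy sketch (the production system's windowing, filtering, and detection logic is not shown):

```python
import numpy as np

def max_normalized_cc(a, b):
    """Peak of the normalized cross-correlation between two equal-length
    traces; 1.0 means identical waveforms up to scale and time shift."""
    a = (a - a.mean()) / (a.std() * len(a))
    b = (b - b.mean()) / b.std()
    return float(np.correlate(a, b, mode="full").max())

rng = np.random.default_rng(5)
sig = np.sin(2 * np.pi * 5.0 * np.arange(1000) / 100.0)
cc_same = max_normalized_cc(sig, sig)                      # ~1.0
cc_noise = max_normalized_cc(sig, rng.standard_normal(1000))
```

    In the MapReduce formulation, pairs of traces become the fine-grained work units distributed to mappers, which is what makes the computation highly parallelizable.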

  17. The ecological structures as components of flood and erosion vulnerability analysis in costal landscapes

    NASA Astrophysics Data System (ADS)

    Valentini, E.; Taramelli, A.; Martina, M.; Persichillo, M. G.; Casarotti, C.; Meisina, C.

    2014-12-01

    The direct and indirect changes of natural habitats for coastal development can affect the level of exposure to erosion and flooding (inundation). Although engineered structures are still preferred for coastal safety, there is an increasing number of applications of ecosystem-based solutions worldwide, such as building-with-nature approaches and the emerging evaluation of natural capital. A question we should address is whether the wide range of satellite data and the already available Earth Observation products can be used to make a synoptic structural and environmental vulnerability assessment. By answering it, we could also understand if, and how many, markers/signals can be identified in the landscape components to define transitions to and from nonlinear processes - to and from scale-invariant spatial distributions - characterizing the evolution of the environmental patch-size mosaic, the landscape. The Wadden Sea, for example, is a productive estuarine area in the south-eastern coastal zone of the North Sea. It is characterized by extensive tidal mud flats, saltmarshes and a tidal channel network between the mainland and the chain of islands along the North Sea side. The area has UNESCO World Heritage status and Natura 2000 status. Here, we identified thresholds to distinguish spatial and temporal patterns controlled by changes in environmental variables. These patterns are represented by the percent cover and the structural level of vegetation and sediment/soil in each identified patch. The environmental variables are those able to act on the patch size distribution, such as forcing factors from the sea (wind and wave fields) or climate and hydrology drivers. The Bayesian approach defines the dependencies of the spatial patch size distribution on the major flooding and erosion environmental variables. When the analysis is scaled up from the ecosystem units to the landscape level thanks to the satellite

  18. Cyber Threat and Vulnerability Analysis of the U.S. Electric Sector

    SciTech Connect

    Glenn, Colleen; Sterbentz, Dane; Wright, Aaron

    2016-12-20

    With utilities in the U.S. and around the world increasingly moving toward smart grid technology and other upgrades with inherent cyber vulnerabilities, threats from malicious cyber attacks on the North American electric grid continue to grow in frequency and sophistication. The potential for malicious actors to access and adversely affect physical electricity assets of U.S. electricity generation, transmission, or distribution systems via cyber means is a primary concern for utilities contributing to the bulk electric system. This paper seeks to illustrate the current cyber-physical landscape of the U.S. electric sector in the context of its vulnerabilities to cyber attacks, the likelihood of cyber attacks, and the impacts cyber events and threat actors can achieve on the power grid. In addition, this paper highlights utility perspectives, perceived challenges, and requests for assistance in addressing cyber threats to the electric sector. There have been no reported targeted cyber attacks carried out against utilities in the U.S. that have resulted in permanent or long-term damage to power system operations thus far, yet electric utilities throughout the U.S. have seen a steady rise in cyber and physical security related events that continue to raise concern. Asset owners and operators understand that the effects of a coordinated cyber and physical attack on a utility’s operations would threaten electric system reliability, and potentially result in large-scale power outages. Utilities are routinely faced with new challenges for dealing with these cyber threats to the grid and consequently maintain a set of best practices to keep systems secure and up to date. Among the greatest challenges is a lack of knowledge or strategy to mitigate new risks that emerge as a result of an exponential rise in complexity of modern control systems. This paper compiles an open-source analysis of cyber threats and risks to the electric grid, utility best practices

  19. Automatic Detection and Vulnerability Analysis of Areas Endangered by Heavy Rain

    NASA Astrophysics Data System (ADS)

    Krauß, Thomas; Fischer, Peter

    2016-08-01

    In this paper we present a new method for fully automatic detection and derivation of areas endangered by heavy rainfall, based only on digital elevation models. News coverage shows that the majority of occurring natural hazards are flood events, and many flood prediction systems have already been developed. However, most of these existing systems for deriving areas endangered by flooding events are based only on horizontal and vertical distances to existing rivers and lakes; typically, such systems do not take into account dangers arising directly from heavy rain events. In a study we conducted together with a German insurance company, a new approach for detecting areas endangered by heavy rain was shown to give a high correlation between the derived endangered areas and the losses claimed at the insurance company. Here we describe three methods for classification of digital terrain models, analyze their usability for automatic detection and vulnerability analysis of areas endangered by heavy rainfall, and evaluate the results using the available insurance data.
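
    As a toy illustration of purely DEM-based endangerment detection (not one of the authors' three classification methods), a NumPy sketch that flags closed depressions where rainwater can pond, independent of any distance to rivers or lakes:

```python
import numpy as np

def local_sinks(dem):
    """Flag interior DEM cells lower than all eight neighbours: a crude
    proxy for depressions where heavy rain can accumulate."""
    d = np.asarray(dem, float)
    core = d[1:-1, 1:-1]
    sink = np.ones(core.shape, dtype=bool)
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            if di == 0 and dj == 0:
                continue
            nb = d[1 + di:d.shape[0] - 1 + di, 1 + dj:d.shape[1] - 1 + dj]
            sink &= core < nb
    out = np.zeros(d.shape, dtype=bool)
    out[1:-1, 1:-1] = sink
    return out

dem = np.full((5, 5), 10.0)
dem[2, 2] = 1.0              # a pit in the middle of a plateau
sinks = local_sinks(dem)
```

    A production method would additionally consider contributing catchment area and flow paths rather than single-cell pits.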

  20. Analysis of Seismic Activity of the last 15 Years Nearby Puerto Rico and Caribbean Region.

    NASA Astrophysics Data System (ADS)

    Huerta-Lopez, C. I.; Torres-Ortíz, D. M.; Fernández-Heredia, A. I.; Martínez-Cruzado, J. A.

    2015-12-01

    An earthquake catalog of the seismicity that occurred during the last 15 years in the Caribbean region, in the vicinity of Puerto Rico Island (PRI), was compiled in order to capture the big picture of the regional seismic activity rate, and in particular at the epicentral regions of several historical and instrumentally recorded (during 2008-2015) large-to-moderate-magnitude earthquakes that occurred near PRI, onshore and offshore. These include the M6.4 earthquake of 01/13/2014, the largest earthquake recorded instrumentally near PRI. From the point of view of the joint temporal-spatial distribution of epicenters, episodic seismic activity is clearly seen as temporal-spatial concentrations during certain time intervals in different regions. These localized concentrations of epicenters, occurring during certain time intervals in well-localized regions, may suggest "seismic gaps" that show no regular time interval nor spatial pattern. In the epicentral region of the M6.4 01/13/2014 earthquake and of the historical Mona Passage M7.5 earthquake of 10/11/1918, episodic concentrations in time and space of small-magnitude earthquake epicenters are evident, but they show no temporal pattern. Preliminary results of an ongoing statistical analysis in terms of the parameter b (Gutenberg-Richter relationship) and the Omori law, with the aim of relating them to the tectonic framework of the region (or sub-regions), such as structural heterogeneity and stress, are presented and discussed.
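
    The Gutenberg-Richter parameter b mentioned above is commonly estimated with the Aki/Utsu maximum-likelihood formula; a minimal sketch with a synthetic catalog (completeness magnitude and catalog size are illustrative):

```python
import numpy as np

def b_value(mags, mc, dm=0.0):
    """Aki/Utsu maximum-likelihood b-value for magnitudes >= completeness mc.
    `dm` is the magnitude bin width (half-bin correction for binned catalogs)."""
    m = np.asarray(mags, float)
    m = m[m >= mc]
    return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))

# Synthetic Gutenberg-Richter catalog with true b = 1.0 above Mc = 2.5
rng = np.random.default_rng(6)
mags = 2.5 + rng.exponential(scale=np.log10(np.e) / 1.0, size=50_000)
b_hat = b_value(mags, mc=2.5)
```

    Regional differences in the estimated b are one way such studies relate seismicity statistics to structural heterogeneity and stress.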

  1. Areal distribution of sedimentary facies determined from seismic facies analysis and models of modern depositional systems

    SciTech Connect

    Seramur, K.C.; Powell, R.D.; Carpenter, P.J.

    1988-01-01

    Seismic facies analysis was applied to 3.5-kHz single-channel analog reflection profiles of the sediment fill within Muir Inlet, Glacier Bay, southeast Alaska. Nine sedimentary facies have been interpreted from seven seismic facies identified on the profiles. The interpretations are based on reflection characteristics and structural features of the seismic facies. The following reflection characteristics and structural features are used: reflector spacing, amplitude and continuity of reflections, internal reflection configurations, attitude of reflection terminations at a facies boundary, body geometry of a facies, and the architectural associations of seismic facies within each basin. The depositional systems are reconstructed by determining the paleotopography, bedding patterns, sedimentary facies, and modes of deposition within the basin. Muir Inlet is a recently deglaciated fjord for which successive glacier terminus positions and consequent rates of glacial retreat are known. In this environment the depositional processes and sediment characteristics vary with distance from a glacier terminus, such that during a retreat a record of these variations is preserved in the aggrading sediment fill. Sedimentary facies within the basins of lower Muir Inlet are correlated with observed depositional processes near the present glacier terminus in the upper inlet.

  2. Nonlinear Seismic Correlation Analysis of the JNES/NUPEC Large-Scale Piping System Tests.

    SciTech Connect

    Nie,J.; DeGrassi, G.; Hofmayer, C.; Ali, S.

    2008-06-01

    The Japan Nuclear Energy Safety Organization/Nuclear Power Engineering Corporation (JNES/NUPEC) large-scale piping test program has provided valuable new test data on high-level seismic elasto-plastic behavior and failure modes for typical nuclear power plant piping systems. The component and piping system tests demonstrated the strain ratcheting behavior that is expected to occur when a pressurized pipe is subjected to cyclic seismic loading. Under a collaboration agreement between the US and Japan on seismic issues, the US Nuclear Regulatory Commission (NRC)/Brookhaven National Laboratory (BNL) performed a correlation analysis of the large-scale piping system tests using detailed state-of-the-art nonlinear finite element models. Techniques are introduced to develop material models that can closely match the test data. The shaking table motions are examined. The analytical results are assessed in terms of the overall system responses and the strain ratcheting behavior at an elbow. The paper concludes with insights into the accuracy of the analytical methods for use in performance assessments of highly nonlinear piping systems under large seismic motions.

  3. Analysis of the October-November 2010 seismic swarm in the Sampeyre area (Piedmont, Italy)

    NASA Astrophysics Data System (ADS)

    Barani, S.; Spallarossa, D.; Scafidi, D.; Ferretti, G.; De Ferrari, R.; Pasta, M.

    2012-04-01

    To investigate possible correlations or similarities with previous earthquake activity, we analyzed the seismic history of the last 30 years. It reveals that the investigated area never experienced events comparable to the one under study. Only in 1989 did an intense aftershock sequence, though of shorter duration, take place in the Sampeyre area (approximately a tenth of the events were recorded in two days). The strongest instrumental earthquakes, which occurred in January 1994 (Melle earthquake) and April 1998 (Oncino earthquake) with magnitudes 4.3 and 4.1, were neither preceded nor followed by intense activity. Concerning historical seismicity, the area shows generally infrequent activity characterized by low-magnitude events. The major shocks felt in Sampeyre were the 1905 Alta Savoia (Io = VIII-VIII MCS) and the 1914 Tavernette (Io = VIII MCS) earthquakes, but they did not produce any damage. These observations point out the uniqueness of the 2010 swarm, which therefore deserves special attention. Following a general description of the swarm evolution and the performance of the RSNI network, the study presents the results of a waveform similarity analysis, aimed at identifying families of earthquakes belonging to common genetic sources. Results of a strain rate analysis are also discussed, focusing on the release of seismic energy with time. Finally, the analysis of the micro-seismicity detected through an STA/LTA (short-time average/long-time average) based algorithm is presented, revealing a b-value of around unity. The similarity between this value and that calculated for the regional seismicity hints that the causative process of micro-earthquakes is of the same nature as that generating larger events.

  4. Active seismic experiment

    NASA Technical Reports Server (NTRS)

    Kovach, R. L.; Watkins, J. S.; Talwani, P.

    1972-01-01

    The Apollo 16 active seismic experiment (ASE) was designed to generate and monitor seismic waves for the study of the lunar near-surface structure. Several seismic energy sources are used: an astronaut-activated thumper device, a mortar package that contains rocket-launched grenades, and the impulse produced by the lunar module ascent. Analysis of some seismic signals recorded by the ASE has provided data concerning the near-surface structure at the Descartes landing site. Two compressional seismic velocities have so far been recognized in the seismic data. The deployment of the ASE is described, and the significant results obtained are discussed.

  5. Study on seismic performance enhancement in bridges based on factorial analysis

    NASA Astrophysics Data System (ADS)

    Abey, E. Thomas; Somasundaran, T. P.; Sajith, A. S.

    2017-01-01

    The seismic performance of bridges depends on the ductile behavior of its column, as the deck and other substructural components except pile foundations are normally designed to be elastic to facilitate bridge retrofitting. Codes such as AASHTO, Caltrans, IRC: 112 etc. give guidelines for the seismic performance enhancement of columns through ductile detailing. In the present study, a methodology for the seismic performance enhancement of bridges is discussed by using a "Parameter-Based Influence Factor" (PIF) developed from factorial analysis. The parameters considered in the factorial analysis are: percentage of longitudinal reinforcement (Pt), compressive strength of concrete (f'c), yield strength of steel (fy), spacing of lateral ties (S) and column height (H). The influence of each parameter and their combination on the limit states considered is estimated. Pushover analysis is used to evaluate the capacity of columns, considering shear failure criteria. A total of 243 (3^5 combinations) analysis results are compiled to develop the PIF used in the performance enhancement process. The study also encompasses other sub-objectives such as evaluating the discrepancies in using the Importance Factor (I) in designing bridges of varied functional importance; and estimating the aspect ratio and slenderness ratio values of bridge columns for its initial sizing.
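
    The 243 cases correspond to a full 3^5 factorial design over the five parameters; a minimal sketch of its enumeration, with hypothetical level values that are not taken from the study:

```python
from itertools import product

# Illustrative three levels per parameter (values are placeholders)
levels = {
    "Pt (%)": [1.0, 2.0, 3.0],       # longitudinal reinforcement ratio
    "f'c (MPa)": [30, 40, 50],       # concrete compressive strength
    "fy (MPa)": [415, 500, 550],     # steel yield strength
    "S (mm)": [75, 100, 150],        # lateral tie spacing
    "H (m)": [5.0, 7.5, 10.0],       # column height
}
runs = list(product(*levels.values()))   # full 3^5 factorial design
```

    Each tuple in `runs` defines one pushover analysis; the factorial structure is what allows main effects and parameter interactions to be separated when building the PIF.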

  6. MSAT: a New Matlab Toolbox for the Analysis and Modelling of Seismic Anisotropy

    NASA Astrophysics Data System (ADS)

    Walker, A. M.; Wookey, J.

    2011-12-01

    Studies of seismic anisotropy rarely end with measurements of shear-wave splitting - instead an explanation of the physical origin of the anisotropy is sought in order to yield useful geological or geophysical information. We describe a new Matlab toolbox designed to aid the modelling needed for this interpretative step of the analysis of seismic anisotropy. Provision of key building blocks for modelling in this modern integrated development environment allows the rapid development and prototyping of explanations for measured anisotropy. The Matlab graphical environment also permits plotting of key anisotropic parameters. Furthermore, this work complements the SplitLab toolbox used for measuring shear wave splitting and the MTEX toolbox used for the analysis of textures in rocks. MSAT (the Matlab Seismic Anisotropy Toolbox) includes a wide range of functions which can be used to rapidly build models of seismic anisotropy. Available functions include: the determination of phase velocities as a function of wave propagation direction, the analysis of multi-layer splitting, a novel interpolation scheme for elastic constants tensors, the estimation of the anisotropy caused by the presence of aligned inclusions and the measurement of the degree of anisotropy exhibited by an elastic material. We include a database of elastic properties of rocks and minerals and functions to plot seismic anisotropy as a function of wave propagation direction in the form of pole figures or as three-dimensional plots. The toolbox includes extensive documentation and example applications which integrate with the Matlab documentation system alongside automated test cases for all functions. All code is open source and available freely to all. We encourage users to feed back any changes they may need to make. Key examples of the use of this software include: (1) Calculation of the pattern of backazimuthal variation of shear wave splitting caused by the interaction of two dipping layers of
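
    MSAT itself is a Matlab toolbox; as a language-neutral illustration of the phase-velocity calculation it provides, the Christoffel equation can be solved for a 6x6 Voigt stiffness matrix in a short Python sketch, verified here against an isotropic medium (all values illustrative):

```python
import numpy as np

def phase_velocities(c_voigt, rho, direction):
    """Three phase velocities (km/s) along `direction`, from a 6x6 Voigt
    stiffness matrix in GPa and a density in g/cm^3 (Christoffel equation)."""
    n = np.asarray(direction, float)
    n /= np.linalg.norm(n)
    vm = {(0, 0): 0, (1, 1): 1, (2, 2): 2,
          (1, 2): 3, (2, 1): 3, (0, 2): 4, (2, 0): 4, (0, 1): 5, (1, 0): 5}
    c = np.empty((3, 3, 3, 3))
    for i in range(3):
        for j in range(3):
            for k in range(3):
                for l in range(3):
                    c[i, j, k, l] = c_voigt[vm[i, j], vm[k, l]]
    gamma = np.einsum("j,ijkl,l->ik", n, c, n)   # Christoffel matrix
    v2 = np.linalg.eigvalsh(gamma) / rho         # GPa/(g/cm^3) = (km/s)^2
    return np.sqrt(v2)                           # sorted slow -> fast

# Isotropic sanity check: lambda = 70 GPa, mu = 80 GPa, rho = 3.3 g/cm^3
iso = np.zeros((6, 6))
iso[:3, :3] = 70.0
iso[np.diag_indices(3)] = 230.0                  # lambda + 2*mu
iso[3:, 3:] = np.diag([80.0] * 3)                # mu
vs_slow, vs_fast, vp = phase_velocities(iso, 3.3, [1, 0, 0])
```

    For an isotropic medium the two shear velocities coincide (no splitting); an anisotropic stiffness matrix would separate them, which is the quantity compared against measured shear-wave splitting.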

  7. Extended defense systems :I. adversary-defender modeling grammar for vulnerability analysis and threat assessment.

    SciTech Connect

    Merkle, Peter Benedict

    2006-03-01

    Vulnerability analysis and threat assessment require systematic treatments of adversary and defender characteristics. This work addresses the need for a formal grammar for the modeling and analysis of adversary and defender engagements of interest to the National Nuclear Security Administration (NNSA). Analytical methods treating both linguistic and numerical information should ensure that neither aspect has disproportionate influence on assessment outcomes. The adversary-defender modeling (ADM) grammar employs classical set theory and notation. It is designed to incorporate contributions from subject matter experts in all relevant disciplines, without bias. The Attack Scenario Space U_S is the set universe of all scenarios possible under physical laws. An attack scenario is a postulated event consisting of the active engagement of at least one adversary with at least one defended target. The Target Information Space I_S is the universe of information about targets and defenders. Adversary and defender groups are described by their respective Character super-sets, (A)_P and (D)_F. Each super-set contains six elements: Objectives, Knowledge, Veracity, Plans, Resources, and Skills. The Objectives are the desired end-state outcomes. Knowledge is comprised of empirical and theoretical a priori knowledge and emergent knowledge (learned during an attack), while Veracity is the correspondence of Knowledge with fact or outcome. Plans are ordered activity-task sequences (tuples) with logical contingencies. Resources are the a priori and opportunistic physical assets and intangible attributes applied to the execution of associated Plans elements. Skills for both adversary and defender include the assumed general and task competencies for the associated plan set, the realized value of competence in execution or exercise, and the opponent's planning assumption of the task competence.

  8. Neo-Deterministic and Probabilistic Seismic Hazard Assessments: a Comparative Analysis

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Magrin, Andrea; Nekrasova, Anastasia; Kossobokov, Vladimir; Panza, Giuliano F.

    2016-04-01

    Objective testing is the key issue in any reliable seismic hazard assessment (SHA). Different earthquake hazard maps must demonstrate their capability in anticipating ground shaking from future strong earthquakes before being used for different purposes - such as engineering design, insurance, and emergency management. Quantitative assessment of map performance is also an essential step in the scientific process of their revision and possible improvement. Cross-checking of probabilistic models against available observations and independent physics-based models is recognized as a major validation procedure. The existing maps from classical probabilistic seismic hazard analysis (PSHA), as well as those from the neo-deterministic analysis (NDSHA), which have already been developed for several regions worldwide (including Italy, India and North Africa), are considered to exemplify the possibilities of cross-comparative analysis in spotting the limits and advantages of the different methods. Where the data permit, a comparative analysis against the documented seismic activity observed in reality is carried out, showing how available observations about past earthquakes can contribute to assessing the performance of the different methods. Neo-deterministic refers to a scenario-based approach, which allows for consideration of a wide range of possible earthquake sources as the starting point for scenarios constructed via full waveform modeling. The method does not make use of empirical attenuation models (i.e. Ground Motion Prediction Equations, GMPE) and naturally supplies realistic time series of ground shaking (i.e. complete synthetic seismograms), readily applicable to complete engineering analysis and other mitigation actions. The standard NDSHA maps provide reliable envelope estimates of maximum seismic ground motion from a wide set of possible scenario earthquakes, including the largest deterministically or historically defined credible earthquake.
In addition

  9. The Singular Spectrum Analysis method and its application to seismic data denoising and reconstruction

    NASA Astrophysics Data System (ADS)

    Oropeza, Vicente E.

    Attenuating random and coherent noise is an important part of seismic data processing. Successful removal results in an enhanced image of the subsurface geology, which facilitates economic decisions in hydrocarbon exploration. This motivates the search for new and more efficient techniques for noise removal. The main goal of this thesis is to present an overview of the Singular Spectrum Analysis (SSA) technique, studying its potential application to seismic data processing. An overview of the application of SSA to time series analysis is presented. Subsequently, its applications to random and coherent noise attenuation, its expansion to multiple dimensions, and the recovery of unrecorded seismograms are described. To improve the performance of SSA, a faster implementation via a randomized singular value decomposition is proposed. Results obtained in this work show that SSA is a versatile method for both random and coherent noise attenuation, as well as for the recovery of missing traces.
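A minimal sketch of the basic SSA workflow the thesis builds on (Hankel embedding, rank-truncated SVD, anti-diagonal averaging); the embedding length `L` and target rank are illustrative choices, not the author's parameters:

```python
import numpy as np

def ssa_denoise(x, L, rank):
    """Denoise a 1-D signal by Singular Spectrum Analysis:
    embed into a Hankel (trajectory) matrix, truncate its SVD
    to `rank`, and recover a signal by anti-diagonal averaging."""
    N = len(x)
    K = N - L + 1
    # Trajectory matrix: column j is the window x[j:j+L]
    H = np.column_stack([x[j:j + L] for j in range(K)])
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    Hr = (U[:, :rank] * s[:rank]) @ Vt[:rank, :]
    # Average over anti-diagonals to get back a 1-D signal
    y = np.zeros(N)
    counts = np.zeros(N)
    for j in range(K):
        y[j:j + L] += Hr[:, j]
        counts[j:j + L] += 1
    return y / counts

# Example: a noisy sinusoid is well captured by a rank-2 reconstruction
t = np.linspace(0.0, 1.0, 200)
clean = np.sin(2 * np.pi * 5 * t)
rng = np.random.default_rng(0)
noisy = clean + 0.3 * rng.standard_normal(t.size)
denoised = ssa_denoise(noisy, L=50, rank=2)
```

A single sinusoid occupies a rank-2 subspace of the trajectory matrix, which is why rank 2 suffices in this toy example; real seismic sections require choosing the rank from the data.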

  10. Probabilistic seismic hazard analysis for offshore structures in the Santa Barbara Channel phase 2 report

    SciTech Connect

    Foxall, W; Savy, J

    1999-08-06

    This report summarizes progress through Phase 2 of the probabilistic seismic hazards analysis (PSHA) for the Santa Barbara Channel being carried out by the Lawrence Livermore National Laboratory (LLNL) for the Minerals Management Service (MMS) of the US Department of the Interior. The purpose of the PSHA is to provide a basis for development by MMS of regulations governing evaluation of applications to re-license existing oil platforms in federal waters within the Channel with respect to seismic loading. The final product of the analysis will be hazard maps of ground motion parameters at specified probability levels of exceedance. This report summarizes the characterization of local earthquake sources within the Channel and onshore areas of the Western Transverse Ranges and the development of a ground motion attenuation model for the region, and presents preliminary hazard results at three selected sites.

  11. Systematic re-analysis of 23 years of volcanic seismicity on Hawaii Island

    NASA Astrophysics Data System (ADS)

    Matoza, R. S.; Shearer, P. M.; Okubo, P.

    2014-12-01

    The analysis and interpretation of seismicity from mantle depths to the surface plays a key role in understanding how volcanoes work. We are developing and applying methods for the systematic reanalysis of waveforms from volcano-seismic networks, including high-precision earthquake relocation, spectral event classification, and robust focal mechanism and stress drop estimates. Our primary dataset is the ~50-station permanent network of the USGS Hawaiian Volcano Observatory (HVO), but we are extending our methods for application to other volcanic systems. We have converted the entire HVO digital waveform and phase-pick database from 1986 to 2009 (~260,000 events) to a uniform custom event format, greatly facilitating systematic analyses. A comprehensive multi-year catalog of high-precision relocated seismicity for all of Hawaii Island exhibits a dramatic sharpening of earthquake clustering along faults, streaks, and magmatic features, permitting a more detailed understanding of fault geometries and volcanic and tectonic processes. Automated spectral identification and relocation of long-period (LP, 0.5-5 Hz) seismicity near the summit region of Kilauea Volcano show that most intermediate-depth (5-15 km) LP events occur within a compact volume that has remained at a fixed location for over 23 years. An unanticipated result of our relocation work is the emergence of sharp ring-shaped seismicity features. We have so far identified two ring features: a full ring of diameter ~2 km on the northwest flank of Mauna Loa, and a half-ring feature of diameter ~0.5 km near Makaopuhi Crater. We are also performing comprehensive spectral analyses to estimate spatial variations in the stress drop of shear-failure earthquakes.

  12. Stability analysis of the Ischia Mt. Nuovo block, Italy, under extreme seismic shaking

    NASA Astrophysics Data System (ADS)

    Ausilia Paparo, Maria; Tinti, Stefano

    2016-04-01

    In this work we investigate the equilibrium conditions of the Mt. Nuovo block, a unit found on the northwestern flank of Mt. Epomeo on the island of Ischia, Italy, using the Minimum Lithostatic Deviation method (Tinti and Manucci 2006, 2008; Paparo et al. 2013). The block, involved in a deep-seated gravitational slope deformation (DSGSD, Della Seta et al., 2012) process, provides an interesting scenario for studying earthquake-induced instability because i) Ischia is a seismically active volcanic island; ii) the slopes of Mt. Epomeo are susceptible to mass movements; and iii) there exists an abundant literature on historical local seismicity and on slope geology. In our slope stability analysis, we account for seismic load by means of peak ground acceleration (PGA) values taken from Italian seismic hazard maps (Gruppo di Lavoro MPS, 2004) and integrated with estimates based on local seismicity and suitable (MCS) I - PGA regression laws. We find that the Mt. Nuovo block could not be destabilised by the 1883 Casamicciola earthquake (the largest known historical earthquake on the island, which took place on a fault to the north of the block), but also that if an earthquake of the same size occurred in the Mt. Nuovo zone, the block would be mobilised and would therefore generate a tsunami (Zaniboni et al., 2013), with disastrous consequences not only for Ischia but also for the surrounding region. This work was carried out in the frame of the EU project ASTARTE - Assessment, STrategy And Risk Reduction for Tsunamis in Europe (Grant 603839, 7th FP, ENV.2013.6.4-3).
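A simplified pseudo-static sketch of how a hazard-map PGA feeds into a block-equilibrium check (a stand-in illustration, not the Minimum Lithostatic Deviation method itself; all input numbers are invented):

```python
import math

def pseudo_static_fs(weight, slope_deg, cohesion, contact_area,
                     friction_deg, k_h):
    """Factor of safety of a rigid block on a planar sliding surface
    under a horizontal pseudo-static seismic coefficient k_h = PGA/g.
    FS < 1 indicates the block would be mobilised."""
    a = math.radians(slope_deg)
    phi = math.radians(friction_deg)
    # Horizontal inertia force k_h*W adds to driving, reduces normal force
    driving = weight * (math.sin(a) + k_h * math.cos(a))
    normal = weight * (math.cos(a) - k_h * math.sin(a))
    resisting = cohesion * contact_area + normal * math.tan(phi)
    return resisting / driving

# Illustrative block: static case vs. a PGA of 0.25 g
fs_static = pseudo_static_fs(1000.0, 30.0, 20.0, 50.0, 35.0, k_h=0.0)
fs_seismic = pseudo_static_fs(1000.0, 30.0, 20.0, 50.0, 35.0, k_h=0.25)
```

As expected, the seismic coefficient lowers the factor of safety, which is the mechanism by which a nearby earthquake of sufficient size could mobilise an otherwise stable block.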

  13. An Earthquake Source Ontology for Seismic Hazard Analysis and Ground Motion Simulation

    NASA Astrophysics Data System (ADS)

    Zechar, J. D.; Jordan, T. H.; Gil, Y.; Ratnakar, V.

    2005-12-01

    Representation of the earthquake source is an important element in seismic hazard analysis and earthquake simulations. Source models span a range of conceptual complexity - from simple time-independent point sources to extended fault slip distributions. Further computational complexity arises because the seismological community has established many source description formats and variations thereof; as a result, conceptually equivalent source models are often expressed in different ways. Despite the resultant practical difficulties, there exists a rich semantic vocabulary for working with earthquake sources. For these reasons, we feel it is appropriate to create a semantic model of earthquake sources using an ontology, a computer science tool from the field of knowledge representation. Unlike the domains of most ontology work to date, earthquake sources can be described within a very precise mathematical framework. Another unique aspect of developing such an ontology is that earthquake sources are often used as computational objects. A seismologist generally wants more than to simply construct a source and have it be well formed and properly described; the source will also be used for performing calculations. Representation and manipulation of complex mathematical objects presents a challenge to the ontology development community. In order to enable simulations involving many different types of source models, we have completed preliminary development of a seismic point source ontology. The use of an ontology to represent knowledge provides machine interpretability and the ability to validate logical consistency and completeness. Our ontology, encoded using the OWL Web Ontology Language - a standard from the World Wide Web Consortium - contains the conceptual definitions and relationships necessary for source translation services.
For example, specification of strike, dip, rake, and seismic moment will automatically translate into a double
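The abstract is truncated here; the translation presumably yields a double-couple source representation. As an illustration of the kind of source-translation service described, a sketch converting (strike, dip, rake, scalar moment) into double-couple moment-tensor components in the Aki & Richards convention (x = north, y = east, z = down):

```python
import math

def double_couple_tensor(strike_deg, dip_deg, rake_deg, m0):
    """Moment-tensor components of a double-couple point source,
    Aki & Richards convention (x = north, y = east, z = down)."""
    s, d, r = (math.radians(v) for v in (strike_deg, dip_deg, rake_deg))
    mxx = -m0 * (math.sin(d) * math.cos(r) * math.sin(2 * s)
                 + math.sin(2 * d) * math.sin(r) * math.sin(s) ** 2)
    mxy = m0 * (math.sin(d) * math.cos(r) * math.cos(2 * s)
                + 0.5 * math.sin(2 * d) * math.sin(r) * math.sin(2 * s))
    mxz = -m0 * (math.cos(d) * math.cos(r) * math.cos(s)
                 + math.cos(2 * d) * math.sin(r) * math.sin(s))
    myy = m0 * (math.sin(d) * math.cos(r) * math.sin(2 * s)
                - math.sin(2 * d) * math.sin(r) * math.cos(s) ** 2)
    myz = -m0 * (math.cos(d) * math.cos(r) * math.sin(s)
                 - math.cos(2 * d) * math.sin(r) * math.cos(s))
    mzz = m0 * math.sin(2 * d) * math.sin(r)
    return {"Mxx": mxx, "Mxy": mxy, "Mxz": mxz,
            "Myy": myy, "Myz": myz, "Mzz": mzz}

# A vertical strike-slip fault (strike 0, dip 90, rake 0)
# reduces to a pure Mxy couple.
m = double_couple_tensor(0.0, 90.0, 0.0, 1.0)
```

This is exactly the sort of deterministic mapping that an ontology-backed translation service can expose: the same conceptual source, re-expressed in a different concrete format.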

  14. Analysis and models of pre-injection surface seismic array noise recorded at the Aquistore carbon storage site

    NASA Astrophysics Data System (ADS)

    Birnie, Claire; Chambers, Kit; Angus, Doug; Stork, Anna L.

    2016-08-01

    Noise is a persistent feature in seismic data and so poses challenges in extracting increased accuracy in seismic images and physical interpretation of the subsurface. In this paper, we analyse passive seismic data from the Aquistore carbon capture and storage pilot project permanent seismic array to characterise, classify and model seismic noise. We perform noise analysis for a three-month subset of passive seismic data from the array and provide conclusive evidence that the noise field is not white, stationary, or Gaussian; characteristics commonly yet erroneously assumed in most conventional noise models. We introduce a novel noise modelling method that provides a significantly more accurate characterisation of real seismic noise compared to conventional methods, which is quantified using the Mann-Whitney-White statistical test. This method is based on a statistical covariance modelling approach created through the modelling of individual noise signals. The identification of individual noise signals, broadly classified as stationary, pseudo-stationary and non-stationary, provides a basis on which to build an appropriate spatial and temporal noise field model. Furthermore, we have developed a workflow to incorporate realistic noise models within synthetic seismic data sets providing an opportunity to test and analyse detection and imaging algorithms under realistic noise conditions.

  15. Using Fuzzy Analytic Hierarchy Process multicriteria and Geographical information system for coastal vulnerability analysis in Morocco: The case of Mohammedia

    NASA Astrophysics Data System (ADS)

    Tahri, Meryem; Maanan, Mohamed; Hakdaoui, Mustapha

    2016-04-01

    This paper presents a method to assess vulnerability to coastal risks such as coastal erosion or marine submersion by applying the Fuzzy Analytic Hierarchy Process (FAHP) and spatial analysis techniques with a Geographic Information System (GIS). The coast of Mohammedia, Morocco, was chosen as the study site to implement and validate the proposed framework by applying a GIS-FAHP based methodology. The coastal risk vulnerability mapping is based on multiple causative factors: sea level rise, significant wave height, tidal range, coastal erosion, elevation, geomorphology and distance to urban areas. The Fuzzy Analytic Hierarchy Process methodology enables the calculation of the corresponding criteria weights. The results show that the coastline of Mohammedia is characterized by moderate, high and very high levels of vulnerability to coastal risk. The high-vulnerability areas are situated in the east, at the Monika and Sablette beaches. This technical approach relies on the efficiency of the GIS tool combined with the Fuzzy Analytic Hierarchy Process to help decision makers find optimal strategies to minimize coastal risks.
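The criteria-weighting step can be sketched with the crisp geometric-mean variant of AHP (a simplified stand-in for the fuzzy FAHP computation; the pairwise judgments below are hypothetical, not the paper's):

```python
import numpy as np

def ahp_weights(pairwise):
    """Criteria weights from a pairwise comparison matrix via the
    geometric-mean method (a crisp approximation of the FAHP step)."""
    A = np.asarray(pairwise, dtype=float)
    gm = A.prod(axis=1) ** (1.0 / A.shape[1])  # row geometric means
    return gm / gm.sum()                       # normalize to sum to 1

# Hypothetical judgments for three criteria: coastal erosion rated
# 3x as important as elevation and 5x as important as distance to
# urban areas (reciprocals fill the lower triangle).
A = [[1.0, 3.0, 5.0],
     [1.0 / 3.0, 1.0, 2.0],
     [1.0 / 5.0, 1.0 / 2.0, 1.0]]
w = ahp_weights(A)
```

Each weight then multiplies its rasterized factor layer in the GIS before the layers are summed into the vulnerability index.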

  16. A Vulnerability-Benefit Analysis of Fossil Fuel CO2 Emissions

    NASA Astrophysics Data System (ADS)

    Delman, E. M.; Stephenson, S. R.; Davis, S. J.; Diffenbaugh, N. S.

    2015-12-01

    Although we can anticipate continued improvements in our understanding of future climate impacts, the central challenge of climate change is not scientific, but rather political and economic. In particular, international climate negotiations center on how to share the burden of uncertain mitigation and adaptation costs. We expose the relative economic interests of different countries by assessing and comparing their vulnerability to climate impacts and the economic benefits they derive from the fossil fuel-based energy system. Vulnerability refers to the propensity of humans and their assets to suffer when impacted by hazards, and we draw upon the results of a number of prior studies that have quantified vulnerability using multivariate indices. As a proxy for benefit, we average the CO2 related to each country's extraction of fossil fuels, production of CO2 emissions, and consumption of goods and services (Davis et al., 2011), which should reflect benefits accrued in proportion to national economic dependence on fossil fuels. We define a nondimensional vulnerability-benefit ratio for each nation and find a large range across countries. In general, we confirm that developed and emerging economies such as the U.S., Western Europe, and China rely heavily on fossil fuels and have substantial resources to respond to the impacts of climate change, while smaller, less-developed economies such as Sierra Leone and Vanuatu benefit little from current CO2 emissions and are much more vulnerable to adverse climate impacts. In addition, we identify some countries with both high vulnerability and high benefit, such as Iraq and Nigeria; conversely, some nations, such as New Zealand, exhibit both low vulnerability and low benefit. In most cases, the ratios reflect the nature of energy-climate policies in each country, although certain nations - such as the United Kingdom and France - assume a level of responsibility incongruous with their ratio and commit to mitigation policy despite
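The ratio construction described above can be sketched as follows; the country figures are invented for illustration, and this sketch assumes all inputs have already been normalized to comparable nondimensional scales:

```python
def vulnerability_benefit_ratio(vulnerability, co2_extraction,
                                co2_production, co2_consumption):
    """Nondimensional vulnerability-to-benefit ratio, with benefit
    proxied by the mean of the extraction-, production- and
    consumption-based CO2 accounts (all inputs pre-normalized)."""
    benefit = (co2_extraction + co2_production + co2_consumption) / 3.0
    return vulnerability / benefit

# Invented, pre-normalized figures for two hypothetical countries:
# a highly vulnerable low emitter vs. a resilient high emitter.
high_vuln_low_emitter = vulnerability_benefit_ratio(0.9, 0.01, 0.02, 0.03)
low_vuln_high_emitter = vulnerability_benefit_ratio(0.2, 0.8, 0.9, 1.0)
```

Countries like the hypothetical first case sit at the high end of the ratio range the abstract describes; fossil-fuel-dependent developed economies sit at the low end.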

  17. A parametric study of nonlinear seismic response analysis of transmission line structures.

    PubMed

    Tian, Li; Wang, Yanming; Yi, Zhenhua; Qian, Hui

    2014-01-01

    The nonlinear seismic response of transmission line structures subjected to earthquake loading is studied parametrically in this paper. The transmission lines are modeled, based on a real project, by cable elements which account for the nonlinearity of the cable. Nonuniform ground motions are generated using a stochastic approach based on random vibration analysis. The effects of multicomponent ground motions, correlations among multicomponent ground motions, wave travel, coherency loss, and local site conditions on the responses of the cables are investigated using the nonlinear time history analysis method. The results show that multicomponent seismic excitations should be considered, but that the correlations among multicomponent ground motions can be neglected. The wave passage effect has a significant influence on the responses of the cables. Changing the degree of coherency loss has little influence on the response of the cables, but the responses are affected significantly by the presence of coherency loss. The responses of the cables change little as the difference in site conditions changes. The effects of multicomponent ground motions, wave passage, coherency loss, and local site conditions should therefore be considered in the seismic design of transmission line structures.

  18. Dynamics of the Oso-Steelhead landslide from broadband seismic analysis

    NASA Astrophysics Data System (ADS)

    Hibert, C.; Stark, C. P.; Ekström, G.

    2015-06-01

    We carry out a combined analysis of the short- and long-period seismic signals generated by the devastating Oso-Steelhead landslide that occurred on 22 March 2014. The seismic records show that the Oso-Steelhead landslide was not a single slope failure, but a succession of multiple failures distinguished by two major collapses that occurred approximately 3 min apart. The first generated long-period surface waves that were recorded at several proximal stations. We invert these long-period signals for the forces acting at the source, and obtain estimates of the first failure's runout and kinematics, as well as its mass after calibration against the mass-centre displacement estimated from remote-sensing imagery. Short-period analysis of both events suggests that the source dynamics of the second event are more complex than those of the first. No distinct long-period surface waves were recorded for the second failure, which prevents inversion for its source parameters. However, by comparing the seismic energy of the short-period waves generated by both events we are able to estimate the volume of the second. Our analysis suggests that the volume of the second failure is about 15-30% of the total landslide volume, giving a total volume mobilized by the two events of between 7 × 10⁶ and 10 × 10⁶ m³, in agreement with estimates from ground observations and lidar mapping.

  19. A Parametric Study of Nonlinear Seismic Response Analysis of Transmission Line Structures

    PubMed Central

    Wang, Yanming; Yi, Zhenhua

    2014-01-01

    The nonlinear seismic response of transmission line structures subjected to earthquake loading is studied parametrically in this paper. The transmission lines are modeled, based on a real project, by cable elements which account for the nonlinearity of the cable. Nonuniform ground motions are generated using a stochastic approach based on random vibration analysis. The effects of multicomponent ground motions, correlations among multicomponent ground motions, wave travel, coherency loss, and local site conditions on the responses of the cables are investigated using the nonlinear time history analysis method. The results show that multicomponent seismic excitations should be considered, but that the correlations among multicomponent ground motions can be neglected. The wave passage effect has a significant influence on the responses of the cables. Changing the degree of coherency loss has little influence on the response of the cables, but the responses are affected significantly by the presence of coherency loss. The responses of the cables change little as the difference in site conditions changes. The effects of multicomponent ground motions, wave passage, coherency loss, and local site conditions should therefore be considered in the seismic design of transmission line structures. PMID:25133215

  20. Outgoing longwave radiation anomalies analysis associated with different types of seismic activity

    NASA Astrophysics Data System (ADS)

    Xiong, Pan; Shen, Xuhui

    2017-03-01

    This paper develops a statistical analysis method, based on the Robust Satellite data analysis technique, to detect seismic anomalies within the NOAA OLR dataset using spatial/temporal continuity analysis. The proposed method has been applied to statistically analyze 3376 earthquake cases from September 01, 2007 to May 23, 2015. For statistical purposes, all these events were divided into different types on the basis of seismic parameters, including Southern or Northern Hemisphere earthquakes, earthquakes at different magnitude levels, and earthquakes at different depth levels. The results show that the intensity of the anomalies increases with increasing magnitude; anomalies are more easily observed for shallow earthquakes than for deep ones; more obvious anomalies can be detected for earthquakes occurring in the Northern Hemisphere; and the anomalies increase significantly near the epicenter one day before and on the day of the earthquake. Overall, anomalies appear near the epicenters before earthquakes and bear some relation to earthquake preparation across all types of seismic activity. These statistical results can help create a better understanding of the earthquake preparation process.
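The core of a robust-satellite-style anomaly index is a pixel-wise standardized deviation of the current field against a multi-temporal reference set; a minimal sketch (function name and thresholds are illustrative, not the paper's exact formulation):

```python
import numpy as np

def rst_anomaly_index(current_field, reference_fields):
    """Pixel-wise anomaly index in the spirit of the Robust Satellite
    Technique: (V - mean_ref) / std_ref, computed over a set of
    reference fields for the same location and calendar period."""
    ref = np.asarray(reference_fields, dtype=float)
    mu = ref.mean(axis=0)
    sigma = ref.std(axis=0, ddof=1)
    return (np.asarray(current_field, dtype=float) - mu) / sigma

# Toy example: a 2-pixel field with a 3-epoch reference history.
reference = [[0.0, 0.0],
             [1.0, 1.0],
             [2.0, 2.0]]           # per-pixel mean 1.0, sample std 1.0
current = [1.0, 4.0]               # second pixel is anomalously high
idx = rst_anomaly_index(current, reference)
```

Pixels whose index exceeds a chosen threshold (e.g. 2 standard deviations) would then be examined for spatial and temporal continuity near the epicenter.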

  1. Analysis of Cardiovascular Tissue Components for the Diagnosis of Coronary Vulnerable Plaque from Intravascular Ultrasound Images

    PubMed Central

    Hwang, Yoo Na; Kim, Ga Young; Shin, Eun Seok

    2017-01-01

    The purpose of this study was to characterize cardiovascular tissue components and analyze their different tissue properties for predicting coronary vulnerable plaque from intravascular ultrasound (IVUS) images. For this purpose, sequential IVUS image frames were obtained from human coronary arteries using 20 MHz catheters. The plaque regions between the intima and media-adventitial borders were manually segmented in all IVUS images. Tissue components of the plaque regions were classified as fibrous tissue (FT), fibrofatty tissue (FFT), necrotic core (NC), or dense calcium (DC). The media area and lumen diameter were also estimated simultaneously. In addition, the external elastic membrane (EEM) was computed to predict vulnerable plaque after the tissue characterization. The reliability of the manual segmentation was validated in terms of inter- and intraobserver agreement. The quantitative results indicate that the FT and the media, as well as the NC, would be good indicators for predicting vulnerable plaques in IVUS images. In addition, the lumen was not suitable for early diagnosis of vulnerable plaque because of its low significance compared to the other vessel parameters. To predict vulnerable plaque rupture, future studies should include additional experiments using various tissue components, such as the EEM, FT, NC, and media.

  2. Structural Analysis Results of Thermal, Operating and Seismic Analysis for Hanford Single-Shell Tank Integrity - 12261

    SciTech Connect

    Pilli, Siva P.; Rinker, Michael W.

    2012-07-01

    Since Hanford's 149 Single-Shell Tanks (SSTs) are well beyond their design life, the U.S. Department of Energy has commissioned a state-of-the-art engineering analysis to assess the structural integrity of the tanks and ensure that they are fit for service during the cleanup and closure phase. The structural integrity analysis has several challenging factors. There are four different tank sizes in various configurations that require analysis. Within each tank type there are different waste level and temperature histories, soil overburden depths, tank floor arrangements, riser sizes and locations, and other on-tank structures that need to be addressed. Furthermore, soil properties vary throughout the tank farms. This paper describes the structural integrity analysis that was performed for the SSTs using finite element models that incorporate the detailed design features of each tank type. The analysis was performed with two different models: an ANSYS static model for the Thermal and Operating Loads Analysis (TOLA), and an ANSYS dynamic model for the seismic analysis. The TOLA analyses simulate the waste level and thermal history and include a matrix of analysis cases that bounds the material property uncertainties. The TOLA also predicts the occurrence of thermal degradation and cracking of the concrete, reinforcement yielding, and soil plasticity. The seismic analysis matrix included uncertainty in the waste properties, waste height and soil modulus. In the seismic analysis the tank concrete was modeled as a linear elastic material adjusted for present-day degraded conditions. The soil was also treated as a linear elastic material, while special modeling techniques were used to avoid soil arching and achieve proper soil pressure on the tank walls. Seismic time histories in both the horizontal and vertical directions were applied to the seismic model.
Structural demands from both Thermal and Operating Loads Analysis and seismic models were extracted in the form of

  3. HANFORD DOUBLE SHELL TANK (DST) THERMAL & SEISMIC PROJECT DYTRAN ANALYSIS OF SEISMICALLY INDUCED FLUID STRUCTURE INTERACTION IN A HANFORD DOUBLE SHELL PRIMARY TANK

    SciTech Connect

    MACKEY, T.C.

    2006-03-14

    M&D Professional Services, Inc. (M&D) is under subcontract to Pacific Northwest National Laboratory (PNNL) to perform seismic analysis of the Hanford Site Double-Shell Tanks (DSTs) in support of a project entitled ''Double-Shell Tank (DST) Integrity Project - DST Thermal and Seismic Analyses''. The overall scope of the project is to complete an up-to-date comprehensive analysis of record of the DST System at Hanford in support of Tri-Party Agreement Milestone M-48-14. The work described herein was performed in support of the seismic analysis of the DSTs. The thermal and operating loads analysis of the DSTs is documented in Rinker et al. (2004). The overall seismic analysis of the DSTs is being performed with the general-purpose finite element code ANSYS. The global model used for the seismic analysis of the DSTs includes the DST structure, the contained waste, and the surrounding soil. The seismic analysis of the DSTs must address the fluid-structure interaction behavior and sloshing response of the primary tank and contained liquid. ANSYS has demonstrated capabilities for structural analysis, but has more limited capabilities for fluid-structure interaction analysis. The purpose of this study is to demonstrate the capabilities and investigate the limitations of the finite element code MSC.Dytran for performing a dynamic fluid-structure interaction analysis of the primary tank and contained waste. To this end, the Dytran solutions are benchmarked against theoretical solutions appearing in BNL 1995, where such theoretical solutions exist. When theoretical solutions were not available, comparisons were made to theoretical solutions of similar problems and to the results of ANSYS simulations. Both rigid-tank and flexible-tank configurations were analyzed with Dytran.
The response parameters of interest that are evaluated in this study are the total hydrodynamic reaction forces, the impulsive and convective mode frequencies, the waste pressures, and slosh heights.
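For the convective (sloshing) mode frequencies, the rigid-tank benchmark can be approximated by linear potential-flow theory for an upright cylinder; the tank dimensions below are illustrative, not the actual DST geometry:

```python
import math

def convective_frequency_hz(radius, liquid_height, g=9.81):
    """First convective (sloshing) mode frequency of liquid in a rigid
    upright cylindrical tank, from linear potential-flow theory:
        omega^2 = (g * xi1 / R) * tanh(xi1 * H / R)
    where xi1 = 1.841 is the first root of J1'(x) = 0."""
    xi1 = 1.841
    omega_sq = (g * xi1 / radius) * math.tanh(xi1 * liquid_height / radius)
    return math.sqrt(omega_sq) / (2.0 * math.pi)

# Illustrative large flat tank: 12 m radius, 10 m liquid depth.
f1 = convective_frequency_hz(12.0, 10.0)
```

This kind of closed-form value is the sort of theoretical solution a Dytran convective-mode frequency would be benchmarked against; note that deeper liquid raises the sloshing frequency toward its deep-liquid limit.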

  4. Cluster analysis of Landslide Vulnerable region on an urban Area in South Korea

    NASA Astrophysics Data System (ADS)

    Moon, Yonghee; Lee, Sangeun; Kim, Myoungsoo; Baek, Jongrak

    2016-04-01

    Mountain areas occupy about 65% of the territory of South Korea. Due to rapid population growth and urbanization, many cities suffer from limited space, and hence commercial buildings, educational facilities, and housing settlement areas continue to stretch to the foot of the mountains. As a result, residents become more and more vulnerable to landslides and debris flow. This led the central government to perceive the need to strengthen regulations relevant to urban planning. In order to consider risks due to landslides and debris flow at the urban planning stage, the present authors suggested several strategies: first, selecting priority areas where landslide-related disasters must be managed strictly; second, establishing an integrated management system to offer technical assistance to persons in charge of urban planning in those areas; third, promoting disaster awareness programs with those persons along with the central government. As a first step, this study mainly discusses the GIS-application procedure by which the authors selected the priority areas, summarized as follows: 1. Collect landslide historical data for the period 1999 - 2012, when the disasters particularly threatened the whole country. 2. Define areas with a one-kilometer radius around the landslide occurrence places. 3. Exclude areas where the population is less than 100 persons per km2. 4. Exclude areas where the portion of mountains with Grade I or II landslide risk (announced by the Korea Forest Service) falls below a certain threshold. 5. Carry out cluster analysis on the remaining areas. 6. Classify the types from the standpoint of landslide disaster risk management. Through these procedures, this study obtained a total of 86 priority areas, which were classified into 24 areas - Type A (high population exposure and mid landslide occurrence likelihood) -, 25 areas - Type B (mid population exposure and high landslide occurrence
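The exclusion steps of the screening procedure can be sketched as a simple filter; the field names and the risk-grade fraction threshold are illustrative assumptions (the abstract leaves the exact portion unspecified):

```python
def select_priority_areas(areas, min_population=100,
                          min_risk_fraction=0.2):
    """Screen candidate 1-km-radius areas around historical landslide
    sites. The population threshold follows the abstract; the
    `min_risk_fraction` threshold is an assumed placeholder for the
    'certain portion' of Grade I/II risk mountains."""
    kept = []
    for a in areas:
        if a["population_per_km2"] < min_population:
            continue  # step 3: too few exposed residents
        if a["high_risk_mountain_fraction"] < min_risk_fraction:
            continue  # step 4: insufficient high-risk mountain area
        kept.append(a)
    return kept

# Hypothetical candidates: only the third survives both screens.
candidates = [
    {"population_per_km2": 50, "high_risk_mountain_fraction": 0.5},
    {"population_per_km2": 500, "high_risk_mountain_fraction": 0.05},
    {"population_per_km2": 500, "high_risk_mountain_fraction": 0.5},
]
priority = select_priority_areas(candidates)
```

The surviving areas would then feed into the cluster analysis and the exposure/likelihood classification (Type A, Type B, ...).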

  5. Quantifying Distribution of Recent Sediment Using XRF Analysis and Seismic Data in the Hudson River Estuary

    NASA Astrophysics Data System (ADS)

    Haberman, M.; Nitsche, F. O.; Kenna, T. C.; Sands, E.; Bell, R. E.; Ryan, W. B.

    2006-12-01

    estimate average sedimentation rates. In general, we found good correspondence between results of the sediment core analysis and the layers identified in the seismic data. In several cases, the availability of the lead concentration information improved interpretation of the seismic data, allowing us to avoid over/underestimation of the thickness of the recent layer. The use of both seismic and sediment core analysis provides a more detailed and reliable map of the distribution of recent sediment deposition and more accurate volume estimates.

  6. Numerical Analysis of JNES Seismic Tests on Degraded Combined Piping System

    SciTech Connect

    Zhang T.; Nie J.; Brust, F.; Wilkowski, G.; Hofmayer, C.; Ali, S.; Shim, D-J.

    2012-02-02

    Nuclear power plant safety under seismic conditions is an important consideration. In aged plants, piping systems may have defects caused by fatigue, stress corrosion cracking, etc. These cracks may not only affect the seismic response but also grow and break through, causing loss of coolant. Therefore, an evaluation method needs to be developed to predict crack growth behavior under seismic excitation. This paper describes efforts to analyze and better understand a series of degraded pipe tests under seismic loading conducted by the Japan Nuclear Energy Safety Organization (JNES). A special 'cracked-pipe element' (CPE) concept, in which the element represents the global moment-rotation response due to the crack, was developed. This approach significantly simplifies dynamic finite element analysis in the fracture mechanics field. In this paper, model validation was conducted by comparison with a series of pipe tests with circumferential through-wall and surface cracks under different excitation conditions. These analyses showed that reasonably accurate predictions can be made using the ABAQUS connector element to model the complete transition of a circumferential surface crack to a through-wall crack under cyclic dynamic loading. The JNES primary loop recirculation piping test was analyzed in detail. This combined-component test had three crack locations and multiple applied simulated seismic block loadings. Comparisons were also made between the ABAQUS finite element (FE) analysis results and the measured displacements in the experiment. Good agreement was obtained, and it was confirmed that the simplified modeling is applicable to the seismic analysis of a cracked pipe on the basis of fracture mechanics. Pipe system leakage did occur in the JNES tests.
The analytical predictions using the CPE approach did not predict leakage, suggesting that cyclic ductile tearing with large-scale plasticity was not the crack growth mode for

  7. Seismic response of a full-scale wind turbine tower using experimental and numerical modal analysis

    NASA Astrophysics Data System (ADS)

    Kandil, Kamel Sayed Ahmad; Saudi, Ghada N.; Eltaly, Boshra Aboul-Anen; El-khier, Mostafa Mahmoud Abo

    2016-12-01

    Wind turbine technology has developed tremendously over the past years. In Egypt, the Zafarana wind farm currently generates at a capacity of 517 MW, making it one of the largest onshore wind farms in the world. It is located in an active seismic zone along the west side of the Gulf of Suez. Accordingly, seismic risk assessment is required to study the structural integrity of wind towers under expected seismic hazard events. In the context of the ongoing joint Egypt-US research project "Seismic Risk Assessment of Wind Turbine Towers in Zafarana Wind Farm Egypt" (Project ID: 4588), this paper describes the investigation of the dynamic performance of an existing Nordex N43 wind turbine tower. Both the experimental and the numerical work are illustrated, explaining the methodology adopted to investigate the dynamic behavior of the tower under seismic load. Field dynamic testing of the full-scale tower was performed using ambient vibration techniques (AVT). Both frequency-domain and time-domain methods were utilized to identify the actual dynamic properties of the tower as built on site. The natural frequencies, their corresponding mode shapes and the damping ratios of the tower were successfully identified using AVT. A vibration-based finite element model (FEM) was constructed using ANSYS V.12 software. The numerical and experimental modal analysis results were compared for validation purposes. Using different simulation considerations, the initial FEM was updated until it matched the experimental results with good agreement. Using the final updated FEM, the response of the tower under the AQABA earthquake excitation was investigated. Time history analysis was conducted to define the seismic response of the tower in terms of structural stresses and displacements. This work is considered one of the pioneering structural studies of wind turbine towers in Egypt. Identification of the actual dynamic properties of the existing tower was successfully performed

  8. Spatial Analysis of the Level of Exposure to Seismic Hazards of Health Facilities in Mexico City, Mexico

    NASA Astrophysics Data System (ADS)

    Moran, S.; Novelo-Casanova, D. A.

    2011-12-01

    Although health facilities are essential infrastructure during disasters and emergencies, they are also usually highly vulnerable installations during large and major earthquakes. Hospitals are among the most complex critical facilities in modern cities and provide the first response in emergency situations. The operability of a hospital must be maintained after a local strong earthquake in order to satisfy the need for medical care of the affected population. If a health facility is seriously damaged, it cannot fulfill its function when it is most needed; in this case, hospitals become casualties of the disaster. To identify the level of physical exposure of hospitals to seismic hazards in Mexico City, we analyzed their geographic location with respect to the seismic response of the different types of soils of the city during past earthquakes, mainly the events that occurred in September 1985 (Ms = 8.0) and April 1989 (Ms = 6.9). Seismic wave amplification in this city results from the interaction of incoming seismic waves with the soft, water-saturated clay soils on which a large part of Mexico City is built. The clay soils are remnants of the lake that existed in the Valley of Mexico and that has been drained gradually to accommodate the growing urban sprawl. Hospital facilities were converted from a simple database of names and locations into a map layer of resources. This resource layer was combined with other map layers showing areas of seismic microzonation in Mexico City, and the overlay was used to identify those hospitals that may be threatened by the occurrence of a large or major seismic event. We analyzed the public and private hospitals considered main health facilities. Our results indicate that more than 50% of the hospitals are highly exposed to seismic hazards. Moreover, in most of these health facilities we identified a lack of preventive measures and preparedness to reduce their
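
    The overlay step, classifying facility points against hazard-zone polygons, can be sketched with a plain ray-casting point-in-polygon test. The zone polygon and hospital coordinates below are made up for illustration; a GIS workflow like the one described would use real microzonation layers.

```python
# Minimal sketch of the overlay: test which facility points fall inside a
# hazard-zone polygon using ray casting (count edge crossings of a ray).

def point_in_polygon(x, y, poly):
    """Return True if (x, y) lies inside the polygon given as vertex pairs."""
    inside = False
    n = len(poly)
    for i in range(n):
        x1, y1 = poly[i]
        x2, y2 = poly[(i + 1) % n]
        if (y1 > y) != (y2 > y):                      # edge spans the ray's height
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:                           # crossing to the right
                inside = not inside
    return inside

# Hypothetical soft-soil (lake-bed) zone and hospital locations.
soft_soil_zone = [(0.0, 0.0), (4.0, 0.0), (4.0, 3.0), (0.0, 3.0)]
hospitals = {"H1": (1.0, 1.0), "H2": (5.0, 1.0), "H3": (3.5, 2.5)}

exposed = {name: point_in_polygon(x, y, soft_soil_zone)
           for name, (x, y) in hospitals.items()}
print(exposed)  # which facilities fall inside the high-amplification zone
```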

  9. Joint analysis of infrasound and seismic signals by cross wavelet transform: detection of Mt. Etna explosive activity

    NASA Astrophysics Data System (ADS)

    Cannata, A.; Montalto, P.; Patanè, D.

    2013-06-01

    The prompt detection of explosive volcanic activity is crucial, since this kind of activity can release copious amounts of volcanic ash and gases into the atmosphere, causing severe danger to aviation. In this work, we show how the joint analysis of seismic and infrasonic data by wavelet transform coherence (WTC) can be useful for detecting explosive activity, significantly enhancing recognition that is normally performed by video cameras and thermal sensors. Indeed, the efficiency of these sensors can be reduced (or inhibited) in the case of poor visibility due to clouds or gas plumes. In particular, we calculated the root mean square (RMS) of seismic and infrasonic signals recorded at Mt. Etna during 2011. This interval was characterised by several episodes of lava fountains, accompanied by lava effusion, and minor strombolian activity. WTC analysis showed significantly high values of coherence between seismic and infrasonic RMS during explosive activity, with the infrasonic and seismic series in phase with each other, hence proving to be sensitive to both weak and strong explosive activity. The WTC capability of automatically detecting explosive activity was compared with detection methods based on fixed thresholds of seismic and infrasonic RMS. Finally, we also calculated the cross correlation function between seismic and infrasonic signals, which showed that the wave types causing this seismo-acoustic relationship are mainly incident seismic and infrasonic waves, likely with a common source.
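
    The RMS step described above can be sketched on synthetic data: compute windowed RMS of two co-recorded channels and look for joint energy increases. The paper uses wavelet transform coherence; plain zero-lag correlation between the RMS series is shown here only as a minimal stand-in, and all signal parameters are invented.

```python
import numpy as np

fs = 50.0                          # sampling rate [Hz], assumed
t = np.arange(0, 120, 1 / fs)
rng = np.random.default_rng(1)

# Hypothetical "explosive activity" between 40 s and 80 s raises the energy
# of the seismic and infrasonic channels simultaneously.
burst = ((t > 40) & (t < 80)).astype(float)
seismic = (1 + 4 * burst) * rng.standard_normal(t.size)
infrasound = (1 + 4 * burst) * rng.standard_normal(t.size)

def windowed_rms(x, n):
    """RMS over consecutive non-overlapping windows of n samples."""
    m = (x.size // n) * n
    return np.sqrt(np.mean(x[:m].reshape(-1, n) ** 2, axis=1))

n = int(5 * fs)                    # 5 s windows
rms_s = windowed_rms(seismic, n)
rms_i = windowed_rms(infrasound, n)

# High correlation between the two RMS series flags joint energy increases.
r = np.corrcoef(rms_s, rms_i)[0, 1]
print(f"seismic/infrasonic RMS correlation: {r:.2f}")
```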

  10. Source Mechanism, Stress Triggering, and Hazard Analysis of Induced Seismicity in Oil/Gas Fields in Oman and Kuwait

    NASA Astrophysics Data System (ADS)

    Gu, C.; Toksoz, M. N.; Ding, M.; Al-Enezi, A.; Al-Jeri, F.; Meng, C.

    2015-12-01

    Induced seismicity has drawn renewed attention in both academia and industry in recent years, owing to increasing seismic activity in oil/gas field regions due to fluid injection/extraction and hydraulic fracturing. The source mechanisms and triggering stresses of these induced earthquakes are of great importance for understanding their causes and the physics of the seismic processes in reservoirs. Previous research on induced seismic events in conventional oil/gas fields assumed a double couple (DC) source mechanism. The induced seismicity data in this study are from both Oman and Kuwait; for the Oman field, the seismicity was monitored by both surface and borehole networks. We determined full moment tensor solutions for the induced seismicity data, based on a full-waveform inversion method (Song and Toksöz, 2011). With the full moment tensor inversion results, Coulomb stress was calculated to investigate the triggering features of the induced seismicity. Our results show a detailed evolution of 3D triggering stress in oil/gas fields from 1999 to 2007 for Oman, and from 2006 to 2015 for Kuwait. In addition, the local hazard corresponding to the induced seismicity in these oil/gas fields is assessed and compared to ground motion predictions for large (M>5.0) regional tectonic earthquakes.
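
    The Coulomb triggering calculation rests on the standard failure-stress change, ΔCFS = Δτ + μ′Δσn, where Δτ is the shear-stress change in the slip direction, Δσn the normal-stress change (unclamping positive), and μ′ the effective friction coefficient. The numbers below are illustrative, not values from the study.

```python
# Minimal sketch of a Coulomb failure stress change on a receiver fault.

def coulomb_stress_change(d_tau, d_sigma_n, mu_eff=0.4):
    """ΔCFS = Δτ + μ' Δσn; positive values bring the fault closer to failure."""
    return d_tau + mu_eff * d_sigma_n

# Example: 0.05 MPa of shear loading plus 0.02 MPa of unclamping.
dcfs = coulomb_stress_change(0.05, 0.02)
print(f"dCFS = {dcfs:.3f} MPa")  # 0.058 MPa, promotes failure
```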

  11. Seismic analysis and design of buried pipelines for fault movement

    SciTech Connect

    Wang, L.R.L.; Yeh, Y.H.

    1984-06-01

    Lifelines such as gas and oil transmission lines and water and sewer pipelines have been damaged heavily in recent earthquakes. Damage to these lifelines has caused major, catastrophic disruption of services essential to human needs. Large, abrupt differential ground movements occurring at an active fault present one of the most severe earthquake effects on a buried pipeline system. Although simplified analysis procedures have been proposed for buried pipelines crossing strike-slip fault zones that cause tensile failure of the pipeline (called tensile strike-slip faults), the results are not accurate enough because of the several assumptions involved. Furthermore, several other important failure mechanisms and parameters have not been investigated. This paper presents analysis procedures and results for a buried pipeline subjected to a tensile strike-slip fault after modifying some of the assumptions used previously. Based on the analysis results, this paper also discusses design criteria for buried pipelines subjected to various fault movements.

  12. Nonlinear transient survival level seismic finite element analysis of Magellan ground based telescope

    NASA Astrophysics Data System (ADS)

    Griebel, Matt; Buleri, Christine; Baylor, Andrew; Gunnels, Steve; Hull, Charlie; Palunas, Povilas; Phillips, Mark

    2016-07-01

    The Magellan Telescopes are a set of twin 6.5 meter ground based optical/near-IR telescopes operated by the Carnegie Institution for Science at the Las Campanas Observatory (LCO) in Chile. The primary mirrors are f/1.25 paraboloids made of borosilicate glass with a honeycomb structure. The secondary mirror provides both f/11 and f/5 focal lengths, with two Nasmyth, three auxiliary, and a Cassegrain port on the optical support structure (OSS). The telescopes have been in operation since 2000 and have experienced several small earthquakes with no damage. Measurement of the in situ response of the telescopes to seismic events showed significant dynamic amplification; however, the response of the telescopes to a survival level earthquake, including component level forces, displacements, accelerations, and stresses, was unknown. The telescopes are supported on hydrostatic bearings that can lift up under high seismic loading, causing a nonlinear response. For this reason, the response spectrum analysis typically performed for a survival level seismic assessment is not sufficient to determine the true response of the structure. Therefore, a nonlinear transient finite element analysis (FEA) of the telescope structure was performed to assess high risk areas and develop acceleration responses for future instrument design. Several configurations were considered, combining different installed components and altitude pointing directions. A description of the models, methodology, and results is presented.

  13. A status report on the development of SAC2000: A new seismic analysis code

    SciTech Connect

    Goldstein, P.; Minner, L.

    1995-08-01

    We are developing a new Seismic Analysis Code (SAC2000) that will meet the needs of the seismic research and treaty monitoring communities. Our first step in this development was to rewrite the original Seismic Analysis Code (SAC) -- a Fortran code approximately 140,000 lines long -- in the C programming language. This rewrite has resulted in a much more robust code that is faster, more efficient, and more portable than the original. We have implemented important processing capabilities such as convolution and binary operations, and we have significantly enhanced several previously existing capabilities. For example, the spectrogram command now produces a correctly registered plot of the input time series and a color image of the output spectrogram. We have also added an image plotting capability with access to 17 predefined color tables or custom color tables. A rewritten version of the readcss command can now be used to access any of the documented CSS 3.0 database data formats, a capability that is particularly important to the Air Force Technical Applications Center (AFTAC) and the monitoring community. A much less visible, but extremely important, contribution is the correction of numerous inconsistencies and errors that have accumulated because of piecemeal development and limited maintenance since SAC was first written. We have also incorporated on-line documentation and have made SAC documentation available on the Internet via the World Wide Web at http://www-ep/tvp/sac.html.
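
    The kind of time-registered spectrogram output described above can be illustrated in a few lines with scipy (not SAC itself); the chirp signal and all parameters below are synthetic stand-ins.

```python
import numpy as np
from scipy import signal

# A synthetic record whose frequency sweeps from 1 to 20 Hz over 60 s.
fs = 100.0
t = np.arange(0, 60, 1 / fs)
x = signal.chirp(t, f0=1.0, t1=60.0, f1=20.0)

freqs, times, Sxx = signal.spectrogram(x, fs=fs, nperseg=256, noverlap=128)
# Sxx[i, j] is the power at freqs[i] in the window centred at times[j];
# plotting it as an image above/below the trace gives the registered view.
peak_f = freqs[np.argmax(Sxx[:, -1])]   # dominant frequency in the last window
print(f"dominant frequency in last window: {peak_f:.1f} Hz")
```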

  14. Detection of ancient morphology and potential hydrocarbon traps using 3-D seismic data and attribute analysis

    SciTech Connect

    Heggland, R.

    1995-12-31

    This paper presents the use of seismic attributes on 3D data to reveal Tertiary and Cretaceous geological features in Norwegian block 9/2. Some of these features would hardly be possible to map using only 2D seismic data. The method, which involves precise interpretation of horizons, attribute analysis, and manipulation of colour displays, may be useful when studying morphology, faults, and hydrocarbon traps. The interval of interest in this study was from 0 to 1.5 s TWT. Horizontal displays (timeslices and attribute maps) highlighted geological features such as shallow channels, fractures, karst topography, and faults very clearly. The attributes used for mapping these features were amplitude, total reflection energy (a volume or time interval attribute), dip, and azimuth. The choice of colour scale and the manipulation of colour displays were also critical to the results. The data examples clearly demonstrate how a very detailed mapping of geological features can be achieved using 3D seismic data and attribute analysis. The results of this study were useful for understanding hydrocarbon migration paths and hydrocarbon traps.
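
    Two of the interval attributes named above can be sketched on a toy amplitude cube. The cube below is random noise and the window is a flat stand-in for an interpreted horizon; real work would extract the window along the picked surface.

```python
import numpy as np

# Toy seismic cube: (inline, crossline, time sample), random stand-in values.
rng = np.random.default_rng(2)
cube = rng.standard_normal((50, 60, 400))
dt = 0.004                                       # sample interval [s], assumed

# Window of interest around a (here flat) interpreted horizon: samples 200-250.
w = cube[:, :, 200:250]

rms_amplitude = np.sqrt(np.mean(w ** 2, axis=2))   # RMS amplitude map
total_energy = np.sum(w ** 2, axis=2) * dt         # total reflection energy map

print(rms_amplitude.shape, total_energy.shape)     # one value per trace
```

Each map has one value per trace and would be displayed with a carefully chosen colour scale, as the abstract emphasises.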

  15. Modeling of the Sedimentary Interbedded Basalt Stratigraphy for the Idaho National Laboratory Probabilistic Seismic Hazard Analysis

    SciTech Connect

    Suzette Payne

    2006-04-01

    This report summarizes how the effects of the sedimentary interbedded basalt stratigraphy were modeled in the probabilistic seismic hazard analysis (PSHA) of the Idaho National Laboratory (INL). Drill holes indicate the bedrock beneath INL facilities is composed of about 1.1 km of alternating layers of basalt rock and loosely consolidated sediments. Alternating layers of hard rock and “soft” loose sediments tend to attenuate seismic energy greater than uniform rock due to scattering and damping. The INL PSHA incorporated the effects of the sedimentary interbedded basalt stratigraphy by developing site-specific shear (S) wave velocity profiles. The profiles were used in the PSHA to model the near-surface site response by developing site-specific stochastic attenuation relationships.

  16. Modeling of the Sedimentary Interbedded Basalt Stratigraphy for the Idaho National Laboratory Probabilistic Seismic Hazard Analysis

    SciTech Connect

    Suzette Payne

    2007-08-01

    This report summarizes how the effects of the sedimentary interbedded basalt stratigraphy were modeled in the probabilistic seismic hazard analysis (PSHA) of the Idaho National Laboratory (INL). Drill holes indicate the bedrock beneath INL facilities is composed of about 1.1 km of alternating layers of basalt rock and loosely consolidated sediments. Alternating layers of hard rock and “soft” loose sediments tend to attenuate seismic energy greater than uniform rock due to scattering and damping. The INL PSHA incorporated the effects of the sedimentary interbedded basalt stratigraphy by developing site-specific shear (S) wave velocity profiles. The profiles were used in the PSHA to model the near-surface site response by developing site-specific stochastic attenuation relationships.

  17. Supporting vulnerable families who do not attend appointments: a gap analysis of the skills health professionals need.

    PubMed

    Wallbank, Sonya; Meeusen, Mirjam; Jones, Louise

    2013-01-01

    This paper offers a framework of the knowledge, skills and competencies required for professionals working with vulnerable families at risk of not attending their appointments (DNA). It also offers a gap analysis of Higher Education health professional courses which identifies where professionals' skills need to be further developed. The gap analysis demonstrates that courses appear to teach professionals how to identify and communicate with families, but not specifically in relation to families who DNA. One of the key factors which appears to be missing from courses is how to identify when vulnerability is increasing within a family. This may mean that families who initially present as stable may fail to be identified when their circumstances are changing and their vulnerability increasing. The gap analysis also shows that professionals are not routinely given the tools needed to creatively engage with families who do not attend. It appears important that professionals are taught why families may not attend appointments, so as to increase their desire to engage with families and decrease stigmatising attitudes towards families who find compliance with healthcare appointments difficult.

  18. A procedure for seismic risk reduction in Campania Region

    NASA Astrophysics Data System (ADS)

    Zuccaro, G.; Palmieri, M.; Maggiò, F.; Cicalese, S.; Grassi, V.; Rauci, M.

    2008-07-01

    The Campania Region has established and carried out a distinctive procedure in the field of seismic risk reduction. Great attention has been paid to public strategic buildings such as town halls, civil protection buildings and schools. Ordinance 3274, promulgated in 2004 by the Italian central authority, obliged the owners of strategic buildings to perform seismic analyses by 2008 in order to check the safety of the structures and their adequacy for use. Under the procedure, the Campania Region, instead of the local authorities, ensures the complete drafting of the seismic checks through financial resources of the Italian Government. A regional scientific technical committee has been constituted, composed of scientific experts and academics in seismic engineering. The committee has drawn up guidelines for the processing of the seismic analyses. At the same time, the Region has issued a public competition to select technical seismic engineering experts to carry out the seismic analyses in accordance with the guidelines. The scientific committee has the option of requiring additional documents and studies in order to approve the safety checks elaborated. The committee is supported by a technical and administrative secretariat composed of a group of experts in seismic engineering. At the moment several seismic safety checks have been completed; the results will be presented in this paper. Moreover, the policy set by the Campania Region to mitigate seismic risk was to spend most of the available financial resources on structural strengthening of public strategic buildings rather than on safety checks. A first set of buildings, whose response under seismic action was already known from data and vulnerability studies previously carried out, was selected for immediate retrofitting designs. Secondly, another set of buildings was identified for structural strengthening. These were selected by using the criteria specified in the guidelines prepared by the scientific committee and based on

  19. Analysis of uncertainties in the entire process of tsunami vulnerability assessment

    NASA Astrophysics Data System (ADS)

    Guillande, Richard; Gardi, Annalisa; Valencia, Nathalia; André, Camille

    2010-05-01

    In the framework of the European SCHEMA project (www.schemaproject.org), whose aim is the development of a methodology for vulnerability assessment for tsunami hazards in the Atlantic and Mediterranean area, we carried out an analysis of the uncertainties that intervene at different stages and several levels of the entire process, from post-disaster field measurements to hazard and damage assessment. Errors are introduced, for instance, when collecting post-disaster observations during field surveys, owing to the different measuring methods: the type of instruments used, the type of water marks taken into account, the reference level (sea level or ground), and the type of correction applied for tides. In some extreme cases (Banda Aceh, Indonesia), differences of several meters have been found between measurements of inundation heights taken by different teams at the same locations. Other uncertainties are due to limitations of the numerical codes employed for reproducing tsunami generation, propagation and run-up. A very critical point is the accuracy of the input parameters for numerical modelling, especially the resolution of the DTM or DEM employed, which can noticeably affect the extent of the predicted inundated area. Concerning the duration of the modelled phenomenon, the comparison of five different numerical tools against a common test site led us to verify that the consistency of the computations over the long term varies considerably depending on the code. This is particularly visible when observing the synthetic tide gauges, some of them showing maximum waves even 10 hours after the first one. This raises the problem of the reliability of results, for instance for emergency management in dangerous coastal strips exposed to repeated waves, where rescue teams may have to work for several hours or days. The damage assessment is carried out by means of fragility functions or matrices, which in our case have been empirically developed from data acquired where a very strong earthquake

  20. Detection, location, and analysis of earthquakes using seismic surface waves (Beno Gutenberg Medal Lecture)

    NASA Astrophysics Data System (ADS)

    Ekström, Göran

    2015-04-01

    For shallow sources, Love and Rayleigh waves are the largest seismic phases recorded at teleseismic distances. The utility of these waves for earthquake characterization was traditionally limited to magnitude estimation, since geographically variable dispersion makes it difficult to determine useful travel-time information from the waveforms. Path delays due to heterogeneity of several tens of seconds are typical for waves at 50 sec period, and these delays must be accounted for with precision and accuracy in order to extract propagation-phase and source-phase information. Advances in tomographic mapping of global surface-wave phase velocities, and continuous growth and improvements of seismographic networks around the world, now make possible new applications of surface waves for earthquake monitoring and analysis. Through continuous back propagation of the long-period seismic wave field recorded by globally distributed stations, nearly all shallow earthquakes greater than M=5 can be detected and located with a precision of 25 km. Some of the detected events do not appear in standard earthquake catalogs and correspond to non-tectonic earthquakes, including landslides, glacier calving, and volcanic events. With the improved ability to predict complex propagation effects of surface waves across a heterogeneous Earth, moment-tensor and force representations of seismic sources can be routinely determined for all earthquakes greater than M=5 by waveform fitting of surface waves. A current area of progress in the use of surface waves for earthquake studies is the determination of precise relative locations of remote seismicity by systematic cross correlation and analysis of surface waves generated by neighboring sources. Preliminary results indicate that a location precision of 5 km may be achievable in many areas of the world.
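
    The cross-correlation step mentioned above for relative event location can be sketched minimally: estimate the differential arrival time between surface waves of two neighbouring events at a common station from the lag of the cross-correlation peak. The wavelet shape, sampling rate, and 12 s delay below are synthetic stand-ins, not long-period data.

```python
import numpy as np

fs = 1.0                                   # 1 sample per second (long period)
t = np.arange(0, 512, 1 / fs)

def wavelet(t, t0):
    """A Gaussian-windowed oscillation arriving at time t0 (toy surface wave)."""
    return np.exp(-((t - t0) / 20.0) ** 2) * np.cos(2 * np.pi * 0.02 * (t - t0))

a = wavelet(t, 200.0)
b = wavelet(t, 212.0)                      # second event's wave arrives 12 s later

# Full cross-correlation; the lag of the peak gives the differential time,
# which feeds the relative-location inversion.
cc = np.correlate(b, a, mode="full")
lag = (np.argmax(cc) - (a.size - 1)) / fs
print(f"estimated differential arrival time: {lag:.0f} s")
```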

  1. Multifractal Analysis in Mining Microseismicity and its Application to Seismic Hazard Analysis in Mines

    NASA Astrophysics Data System (ADS)

    Pasten, D.; Comte, D.; Vallejos, J.

    2013-05-01

    During the last decades, several authors have shown that the spatial distribution of earthquakes follows multifractal laws; the most interesting behavior is the decrease of the fractal dimensions before the occurrence of a large earthquake, and also before its main aftershocks. A multifractal analysis was applied to over 55,920 microseismic events recorded from January 2006 to January 2009 at Creighton mine, Canada. In order to work with a catalogue complete in magnitude, we used the data associated with the linear part of the Gutenberg-Richter law, with magnitudes greater than -1.5. The multifractal analysis was performed on the microseismic data considering significant earthquakes to be those with magnitude MW ≥ 1.0. A moving window containing a constant number of events was used in order to guarantee precise estimation of the fractal dimensions. After different trials, we chose 200 events as the number of data points in each window, with two consecutive windows shifted by 20 events. The complete data set was separated into six sections, and the multifractal analysis was applied to each section of 9,320 events. The multifractal analysis of each section shows a systematic decrease of the fractal dimension (Dq) with time before the occurrence of a rockburst or natural event with magnitude MW ≥ 1.0, as observed in the seismic sequences of large earthquakes. This methodology was repeated for minimum magnitudes MW ≥ 1.5 and MW ≥ 2.0, obtaining the same results. The best result was obtained using MW ≥ 2.0, with a success rate varying between fifty and eighty percent. The results show the possibility of systematically using the determination of the Dq parameter to anticipate the next rockburst or natural event in the studied mine. This project has been financially supported by FONDECyT Grant No. 3120237 (D.P.).
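
    The quantity tracked through the moving windows, a generalized (Rényi) fractal dimension Dq, can be sketched by box counting on a 2-D point set: Dq = (1/(q-1)) · lim log Σᵢ pᵢ^q / log ε for q ≠ 1. The uniform point cloud below is a synthetic check case (expected Dq near 2), not mine data, and the box-size ladder is an arbitrary choice.

```python
import numpy as np

def dq_boxcount(points, q, epsilons):
    """Estimate D_q: fit log sum(p^q) against log(eps); slope/(q-1) is D_q."""
    logs_eps, logs_mass = [], []
    for eps in epsilons:
        # Assign each point to a box of side eps and get box occupancies.
        boxes = np.floor(points / eps).astype(int)
        _, counts = np.unique(boxes, axis=0, return_counts=True)
        p = counts / counts.sum()
        logs_eps.append(np.log(eps))
        logs_mass.append(np.log(np.sum(p ** q)))
    slope = np.polyfit(logs_eps, logs_mass, 1)[0]
    return slope / (q - 1)

rng = np.random.default_rng(3)
pts = rng.random((20000, 2))            # uniform cloud: D_q should approach 2
d2 = dq_boxcount(pts, q=2.0, epsilons=[0.2, 0.1, 0.05, 0.025])
print(f"D2 estimate: {d2:.2f}")
```

On clustered (epicentral) coordinates the same estimator returns lower Dq values; the study tracks that decrease through time.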

  2. Watershed Reliability, Resilience And Vulnerability Analysis Under Uncertainty Using Water Quality Data

    EPA Science Inventory

    A method for assessment of watershed health is developed by employing measures of reliability, resilience and vulnerability (R-R-V) using stream water quality data. Observed water quality data are usually sparse, so that a water quality time series is often reconstructed using s...

  3. Vulnerabilities in Bytecode Removed by Analysis, Nuanced Confinement and Diversification (VIBRANCE)

    DTIC Science & Technology

    2015-06-01

    The VIBRANCE tool starts with a vulnerable Java application and automatically hardens it against SQL injection, OS command injection, file path traversal, time-of-check/time-of-use races, and external control of file name or path (CWE-73).

  4. Martial Arts and Socially Vulnerable Youth. An Analysis of Flemish Initiatives

    ERIC Educational Resources Information Center

    Theeboom, Marc; De Knop, Paul; Wylleman, Paul

    2008-01-01

    Notwithstanding the lack of empirical support for its positive socio-psychological effects, numerous educators and welfare workers make use of martial arts in their work with socially vulnerable youth. Using qualitative methodology, the aims, approaches and personal experiences were analysed of teachers and co-ordinators involved in specific…

  5. New Software for Long-Term Storage and Analysis of Seismic Wave Data

    NASA Astrophysics Data System (ADS)

    Cervelli, D. P.; Cervelli, P. F.; Murray, T. L.

    2004-12-01

    Large seismic networks generate a substantial quantity of data that must be first archived, and then disseminated, visualized, and analyzed, in real-time, in the office or from afar. To achieve these goals for the Alaska Volcano Observatory we developed two software packages: Winston, a database for storing seismic wave data, and Swarm, an application for analyzing and browsing the data. We also modified an existing package, Valve, an internet web-browser based interface to various data sets developed at the Hawaiian Volcano Observatory, to communicate with Winston. These programs provide users with the tools necessary to monitor many commonly used geophysical parameters. Winston, Wave Information Storage Network, uses a vendor-neutral SQL database to store seismic wave data. Winston's primary design goal was simple: develop a more robust, scalable, long-term replacement for the Earthworm waveserver. Access to data within the Winston database is through a scalable internet based server application, an Earthworm waveserver emulator, or directly via SQL queries. Some benefits of using an SQL database are easy backups and exports, speed, and reliability. Swarm, Seismic Wave Analysis and Real-time Monitor, is a stand-alone application that was designed to replace the traditional drum helicorder and computer wave viewer with an intuitive and interactive interface for rapidly assessing volcanic hazard, browsing through past data, and analyzing waveforms. Users can easily view waves in traditional analytic ways, such as frequency spectra or spectrograms, and employ standard analytic tools like filtering. Swarm allows efficient dissemination of data and breaks cross-disciplinary barriers by creating an accessible interface to seismic data for non-seismologists. Swarm currently operates with many seismic data sources including Earthworm waveservers and SEED files. Lastly, Swarm can be a valuable education and outreach tool by using its Kiosk Mode: a full-screen mode that
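
    The storage design described for Winston can be caricatured with a toy, vendor-neutral SQL schema: wave data keyed by channel and start time, held in fixed-length binary blocks. The table, columns, and channel code below are invented for illustration and are not Winston's actual schema.

```python
import sqlite3
import struct

con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE wave_blocks (
        channel    TEXT NOT NULL,      -- e.g. 'AV.OKCF..EHZ' (hypothetical)
        start_time REAL NOT NULL,      -- epoch seconds of first sample
        samp_rate  REAL NOT NULL,
        samples    BLOB NOT NULL,      -- packed binary sample data
        PRIMARY KEY (channel, start_time)
    )
""")

# Insert one block and query it back by time range, as a waveserver would.
data = struct.pack("<4i", 10, -3, 7, 2)
con.execute("INSERT INTO wave_blocks VALUES (?, ?, ?, ?)",
            ("AV.OKCF..EHZ", 1.0e9, 100.0, data))
row = con.execute("SELECT samples FROM wave_blocks WHERE channel=? "
                  "AND start_time BETWEEN ? AND ?",
                  ("AV.OKCF..EHZ", 0.9e9, 1.1e9)).fetchone()
print(struct.unpack("<4i", row[0]))  # (10, -3, 7, 2)
```

The SQL layer is what buys the easy backups, exports, and scalability the abstract cites; clients such as Swarm would sit on top of a query interface like this.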

  6. Assessment of prey vulnerability through analysis of wolf movements and kill sites.

    PubMed

    Bergman, Eric J; Garrott, Robert A; Creel, Scott; Borkowski, John J; Jaffe, Rosemary; Watson, E G R

    2006-02-01

    Within predator-prey systems, behavior can heavily influence spatial dynamics, and accordingly, the theoretical study of how spatial dynamics relate to stability within these systems has a rich history. However, our understanding of these behaviors in large mammalian systems is poorly developed. To address the relationship between predator selection patterns, prey density, and prey vulnerability, we quantified selection patterns for two fine-scale behaviors of a recovering wolf (Canis lupus) population in Yellowstone National Park, Wyoming, USA. Wolf spatial data were collected between November and May from 1998-1999 through 2001-2002. Over four winters, 244 aerial locations, 522 ground-based telemetry locations, 1287 km of movement data from snow tracking, and the locations of 279 wolf kill sites were recorded. There was evidence that elk (Cervus elaphus) and bison (Bison bison) densities had a weak effect on the sites where wolves traveled and made kills. Wolf movements showed a strong selection for geothermal areas, meadows, and areas near various types of habitat edges. Proximity to edge and habitat class also had a strong influence on the locations where elk were most vulnerable to predation. There was little evidence that wolf kill sites differed from the places where wolves traveled, indicating that elk vulnerability influenced where wolves selected to travel. Our results indicate that elk are more vulnerable to wolves under certain conditions and that wolves are capable of selecting for these conditions. As such, vulnerability plays a central role in predator-prey behavioral games and can potentially impact the systems to which they relate.

  7. Sampling and Analysis Plan - Waste Treatment Plant Seismic Boreholes Project

    SciTech Connect

    Reidel, Steve P.

    2006-05-26

    This sampling and analysis plan (SAP) describes planned data collection activities for four entry boreholes through the sediment overlying the basalt, up to three new deep rotary boreholes through the basalt and sedimentary interbeds, and one corehole through the basalt and sedimentary interbeds at the Waste Treatment Plant (WTP) site. The SAP will be used in concert with the quality assurance plan for the project to guide the procedure development and data collection activities needed to support borehole drilling, geophysical measurements, and sampling. This SAP identifies the American Society for Testing and Materials standards, Hanford Site procedures, and other guidance to be followed for data collection activities.

  8. Determination of controlling earthquakes from probabilistic seismic hazard analysis for nuclear reactor sites

    SciTech Connect

    Boissonnade, A.; Bernreuter, D.; Chokshi, N.; Murphy, A.

    1995-04-04

    Recently, the US Nuclear Regulatory Commission published, for public comment, a revision to 10 CFR Part 100. The proposed regulation acknowledges that uncertainties are inherent in estimates of the Safe Shutdown Earthquake Ground Motion (SSE) and requires that these uncertainties be addressed through an appropriate analysis. One element of this evaluation is the assessment of the controlling earthquake through probabilistic seismic hazard analysis (PSHA) and its use in determining the SSE. This paper reviews the basis for the various key choices in characterizing the controlling earthquake.

  9. Crucial role of detailed function, task, timeline, link and human vulnerability analyses in HRA [Human Reliability Analysis (HRA)]

    SciTech Connect

    Ryan, T.G.; Haney, L.N.; Ostrom, L.T.

    1992-01-01

    This paper addresses one major cause of large uncertainties in human reliability analysis (HRA) results: the absence of detailed function, task, timeline, link and human vulnerability analyses. All too often this crucial step in the HRA process is done in a cursory fashion, using word of mouth or written procedures which may themselves incompletely or inaccurately represent the human action sequences and human error vulnerabilities being analyzed. The paper examines the potential contributions these detailed analyses can make in achieving quantitative and qualitative HRA results which are: (1) credible, that is, minimizing uncertainty; (2) auditable, that is, systematically linking quantitative results to the qualitative information from which they are derived; (3) capable of supporting root cause analyses on human reliability factors determined to be major contributors to risk; and (4) capable of repeated measures and of being combined with similar results from other analyses to examine HRA issues transcending individual systems and facilities. Based on experience analyzing test and commercial nuclear reactors and medical applications of nuclear technology, an iterative process is suggested for performing detailed function, task, timeline, link and human vulnerability analyses using documentation reviews, open-ended and structured interviews, direct observations, and group techniques. Finally, the paper concludes that detailed analyses done in this manner by knowledgeable human factors practitioners can contribute significantly to the credibility, auditability, causal factor analysis, and combining goals of HRA.

  10. Seismic analysis of nailed vertical excavation using pseudo-dynamic approach

    NASA Astrophysics Data System (ADS)

    Sarangi, Piyush; Ghosh, Priyanka

    2016-12-01

    An attempt has been made to study the behavior of nailed vertical excavations in medium dense to dense cohesionless soil under seismic conditions using a pseudo-dynamic approach. The effect of several parameters, such as the angle of internal friction of the soil (ϕ), the horizontal (k_h) and vertical (k_v) earthquake acceleration coefficients, the amplification factor (f_a), the length of the nails (L), the angle of nail inclination (α) and the vertical spacing of the nails (S_v), on the stability of nailed vertical excavations has been explored. The limit equilibrium method with a planar failure surface is used to derive the pseudo-dynamic formulation, considering axial pullout of the installed nails. A comparison of the pseudo-static and pseudo-dynamic approaches has been made in order to explore the effectiveness of the pseudo-dynamic approach over pseudo-static analysis, since most seismic stability studies of nailed vertical excavations are based on the latter. The results are expressed in terms of a global factor of safety (FOS). The seismic stability, i.e., the FOS, of nailed vertical excavations is found to decrease with increasing horizontal and vertical earthquake forces. The present values of FOS are compared with those available in the literature.
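The trend reported here, with the FOS falling as the earthquake acceleration coefficients grow, can be illustrated with a much simpler pseudo-static sketch (not the paper's pseudo-dynamic formulation): a planar sliding wedge in cohesionless soil with an assumed lumped nail pullout resistance. All numbers below are hypothetical.

```python
import math

def pseudo_static_fos(H, gamma, phi_deg, theta_deg, kh, kv, T_nails):
    """Global FOS of a vertical cut sliding on a plane at angle theta.

    H: excavation height (m); gamma: soil unit weight (kN/m^3);
    phi_deg: friction angle; kh, kv: seismic coefficients;
    T_nails: total nail pullout resistance along the plane (kN/m, assumed).
    """
    phi, theta = math.radians(phi_deg), math.radians(theta_deg)
    W = 0.5 * gamma * H**2 / math.tan(theta)  # weight of the sliding wedge
    # driving force along the plane: gravity (reduced by kv) plus inertia kh*W
    driving = W * ((1 - kv) * math.sin(theta) + kh * math.cos(theta))
    # resisting force: friction on the plane plus nail pullout
    normal = W * ((1 - kv) * math.cos(theta) - kh * math.sin(theta))
    resisting = normal * math.tan(phi) + T_nails
    return resisting / driving

print(round(pseudo_static_fos(10, 18, 30, 45, 0.0, 0.0, 400), 2))  # → 1.21
print(round(pseudo_static_fos(10, 18, 30, 45, 0.2, 0.1, 400), 2))  # → 0.94
```

As in the study, switching on the seismic coefficients lowers the factor of safety.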

  11. Independent Analysis of Seismicity and Rock fall Scenarios for the Yucca Mountain Repository

    SciTech Connect

    Apted, M.J.; Kemeny, J.M.; Martin, C.D.; James, R.J.

    2006-07-01

    Yucca Mountain is located in the somewhat seismically active Basin and Range province. Future seismic activity is identified by the US Nuclear Regulatory Commission and the US National Academy of Sciences as a key scenario for safety assessment of a proposed repository at Yucca Mountain. As part of its ongoing program of conducting independent analyses of scientific and technical issues that could be important to the licensing of the Yucca Mountain repository, EPRI has conducted an analysis of the combined scenarios of seismic activity and stability of emplacement drifts with respect to long-term repository safety. In this paper we present the results of 3D finite element simulations of both static and dynamic loading of a degraded waste package. For the static case, the expected maximum static load is determined by utilizing relationships between cave height and the bulking factor. A static load representing 30 meters of broken rock was simulated using the finite element model. For the dynamic case, block size and velocity data from the most recent Drift Degradation AMR are used. Based on this, a rock block with a volume of 3.11 m³ and an impact velocity of 4.81 m/s was simulated using the finite element model. In both cases, the results indicate that the waste package remains intact. (authors)

  12. Limitations of quantitative analysis of deep crustal seismic reflection data: Examples from GLIMPCE

    USGS Publications Warehouse

    Lee, Myung W.; Hutchinson, Deborah R.

    1992-01-01

    Amplitude preservation in seismic reflection data can be obtained by a relative true amplitude (RTA) processing technique in which the relative strength of reflection amplitudes is preserved vertically as well as horizontally, after compensating for amplitude distortion by near-surface effects and propagation effects. Quantitative analysis of relative true amplitudes of the Great Lakes International Multidisciplinary Program on Crustal Evolution seismic data is hampered by large uncertainties in estimates of the water bottom reflection coefficient and the vertical amplitude correction and by inadequate noise suppression. Processing techniques such as deconvolution, F-K filtering, and migration significantly change the overall shape of amplitude curves and hence calculation of reflection coefficients and average reflectance. Thus lithological interpretation of deep crustal seismic data based on the absolute value of estimated reflection strength alone is meaningless. The relative strength of individual events, however, is preserved on curves generated at different stages in the processing. We suggest that qualitative comparisons of relative strength, if used carefully, provide a meaningful measure of variations in reflectivity. Simple theoretical models indicate that peg-leg multiples rather than water bottom multiples are the most severe source of noise contamination. These multiples are extremely difficult to remove when the water bottom reflection coefficient is large (>0.6), a condition that exists beneath parts of Lake Superior and most of Lake Huron.
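The water bottom reflection coefficient threshold cited above (>0.6) follows from the acoustic impedance contrast at the lake floor. A minimal sketch of the normal-incidence reflection coefficient, with illustrative densities and velocities rather than GLIMPCE-derived values:

```python
def reflection_coefficient(rho1, v1, rho2, v2):
    """Normal-incidence reflection coefficient from acoustic impedances."""
    z1, z2 = rho1 * v1, rho2 * v2
    return (z2 - z1) / (z2 + z1)

# water over hard crystalline rock (illustrative values, kg/m^3 and m/s)
r_hard = reflection_coefficient(1000, 1480, 2900, 6000)
# water over soft lake sediment
r_soft = reflection_coefficient(1000, 1480, 1800, 1700)
print(round(r_hard, 2), round(r_soft, 2))  # → 0.84 0.35
```

A hard lake floor pushes R well above 0.6, the regime in which the abstract notes peg-leg multiples become extremely difficult to remove.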

  13. Near-Field Probabilistic Seismic Hazard Analysis of Metropolitan Tehran Using Region-Specific Directivity Models

    NASA Astrophysics Data System (ADS)

    Yazdani, Azad; Nicknam, Ahmad; Dadras, Ehsan Yousefi; Eftekhari, Seyed Nasrollah

    2017-01-01

    Ground motions are affected by directivity effects in near-fault regions, which result in low-frequency pulses at the beginning of the velocity time history. The directivity features of near-fault ground motions can lead to a significant increase in the risk of earthquake-induced damage to engineering structures. Ordinary probabilistic seismic hazard analysis (PSHA) does not take such effects into account; recent studies have thus proposed new frameworks to incorporate directivity effects in PSHA. The objective of this study is to develop seismic hazard maps of Tehran City according to a near-fault PSHA procedure for different return periods. To this end, the directivity models required in the modified PSHA were developed from a database of simulated ground motions. The simulated database was used in this study because there are no recorded near-fault data in the region from which to derive purely empirical pulse prediction models. The results show that directivity effects can significantly affect the estimate of regional seismic hazard.

  14. Seismic response analysis of an instrumented building structure

    USGS Publications Warehouse

    Li, H.-J.; Zhu, S.-Y.; Celebi, M.

    2003-01-01

    The Sheraton Universal Hotel, an instrumented building in North Hollywood, California, is selected as a case study in this paper. The finite element method is used to produce a linear time-invariant structural model, and the SAP2000 program is employed for time history analysis of the instrumented structure under the base excitation of strong motions recorded in the basement during the Northridge, California earthquake of 17 January 1994. The calculated structural responses are compared with the recorded data in both the time and frequency domains, and the effects of structural parameter evaluation and indeterminate factors are discussed. Some features of the structural response, such as the reason why the peak acceleration responses on the ninth floor are larger than those on the sixteenth floor, are also explained.

  15. Workflow Management of the SCEC Computational Platforms for Physics-Based Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Jordan, T. H.; Callaghan, S.; Maechling, P. J.; Juve, G.; Deelman, E.; Rynge, M.; Vahi, K.; Silva, F.

    2012-12-01

    Earthquake simulation has the potential to substantially improve seismic hazard and risk forecasting, but the practicality of using simulation results is limited by the scale and complexity of the computations. Here we will focus on the experience of the Southern California Earthquake Center (SCEC) in applying workflow management tools to facilitate physics-based seismic hazard analysis. This system-level problem can be partitioned into a series of computational pathways according to causal sequences described in terms of conditional probabilities. For example, the exceedance probabilities of shaking intensities at geographically distributed sites conditional on a particular fault rupture (a ground motion prediction model or GMPM) can be combined with the probabilities of different ruptures (an earthquake rupture forecast or ERF) to create a seismic hazard map. Deterministic simulations of ground motions from very large suites (millions) of ruptures, now feasible through high-performance computational facilities such as SCEC's CyberShake Platform, are allowing seismologists to replace empirical GMPMs with physics-based models that more accurately represent wave propagation through heterogeneous geologic structures, such as the sedimentary basins that amplify seismic shaking. One iteration of the current broadband CyberShake hazard model for the Los Angeles region, which calculates ground motions deterministically up to 0.5 Hz and stochastically up to 10 Hz, requires the execution of about 3.3 billion jobs, taking 12.8 million computer hours and producing 10 TB of simulation data. We will show how the scalability and reliability of CyberShake calculations on some of the nation's largest computers has been improved using the Pegasus Workflow Management System. We will also describe the current challenges of scaling these calculations up by an order of magnitude to create a California-wide hazard model, which will be based on the new Uniform California Earthquake
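The conditional-probability partition described above, combining an earthquake rupture forecast with per-rupture exceedance probabilities, can be sketched in miniature. The rates and probabilities below are invented placeholders, not CyberShake outputs.

```python
import math

# each rupture: annual occurrence rate (from an ERF) and probability that
# shaking at one site exceeds a chosen intensity level (from a GMPM or
# deterministic simulations) -- illustrative numbers only
ruptures = [
    {"rate": 0.01,  "p_exceed": 0.50},   # nearby moderate rupture
    {"rate": 0.002, "p_exceed": 0.90},   # rare large rupture
    {"rate": 0.10,  "p_exceed": 0.02},   # frequent distant rupture
]

# annual exceedance rate: sum over ruptures of rate * conditional probability
annual_rate = sum(r["rate"] * r["p_exceed"] for r in ruptures)

# Poisson conversion to an exceedance probability over a 50-year exposure
p_50yr = 1 - math.exp(-annual_rate * 50)
print(round(annual_rate, 4), round(p_50yr, 3))  # → 0.0088 0.356
```

Repeating this for a grid of sites and intensity levels yields the hazard map; the scale problem the abstract describes comes from evaluating `p_exceed` by simulation for millions of ruptures.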

  16. ANALYSIS OF DAMAGE TO WASTE PACKAGES CAUSED BY SEISMIC EVENTS DURING POST-CLOSURE

    SciTech Connect

    Alves, S W; Blair, S C; Carlson, S R; Gerhard, M; Buscheck, T A

    2008-05-27

    This paper presents methodology and results of an analysis of damage due to seismic ground motion for waste packages emplaced in a nuclear waste repository at Yucca Mountain, Nevada. A series of three-dimensional rigid body kinematic simulations of waste packages, pallets, and drip shields subjected to seismic ground motions was performed. The simulations included strings of several waste packages and were used to characterize the number, location, and velocity of impacts that occur during seismic ground motion. Impacts were categorized as either waste package-to-waste package (WP-WP) or waste package-to-pallet (WP-P). In addition, a series of simulations was performed for WP-WP and WP-P impacts using a detailed representation of a single waste package. The detailed simulations were used to determine the amount of damage from individual impacts, and to form a damage catalog, indexed according to the type, angle, location and force/velocity of the impact. Finally, the results from the two analyses were combined to estimate the total damage to a waste package that may occur during an episode of seismic ground motion. This study addressed two waste package types, four levels of peak ground velocity (PGV), and 17 ground motions at each PGV. Selected aspects of waste package degradation, such as effective wall thickness and condition of the internals, were also considered. As expected, increasing the PGV level of the vibratory ground motion increases the damage to the waste packages. Results show that most of the damage is caused by WP-P impacts. TAD-bearing waste packages with intact internals are highly resistant to damage, even at a PGV of 4.07 m/s, which is the highest level analyzed.

  17. The ZH ratio Analysis of Global Seismic Data

    NASA Astrophysics Data System (ADS)

    Yano, T.; Shikato, S.; Rivera, L.; Tanimoto, T.

    2007-12-01

    The ZH ratio, the ratio of the vertical to the horizontal component of the fundamental Rayleigh wave as a function of frequency, is an alternative approach to phase/group velocity analysis for constructing the S-wave velocity structure. In this study, teleseismic Rayleigh wave data in the frequency range between 0.004 Hz and 0.04 Hz are used to investigate the interior structure. We have analyzed most of the GEOSCOPE network data and some IRIS GSN stations using a technique developed by Tanimoto and Rivera (2007). Stable estimates of the ZH ratios were obtained in this frequency range for most stations. We have performed the inversion of the measured ZH ratios for the structure of the crust and mantle using a nonlinear iterative scheme. The depth sensitivity kernels for the inversion are calculated numerically. Depth sensitivity of the lowest frequency extends to depths beyond 500 km, but the sensitivity of the overall data for the frequency band extends down to about 300 km. We found that an appropriate selection of the initial model, particularly the depth of the Mohorovicic discontinuity, is important for this inversion. The inversion result depends on the initial model and turned out to be non-unique. We have constructed the initial model from CRUST 2.0. Inversion with equal weighting for each data point tends to reduce the variance of a certain frequency range only. Therefore, we have developed a scheme that increases the weighting of data points that do not fit well after the fifth iteration. This occurs more often in the low frequency range, 0.004-0.007 Hz. After fitting the lower frequency region, a low velocity zone around a depth of 100 km is observed under some stations such as KIP (Kipapa, Hawaii) and ATD (Arta Cave, Djibouti). We have also carried out an analysis of the resolving power of the data by examining the eigenvalues and eigenvectors of the least-squares problem. Unfortunately, the normal matrix usually has 1-2 very large eigenvalues, followed by much smaller eigenvalues. The third

  18. Statistical Analysis of Seismic Events Occurring on Piton de la Fournaise Volcano, La Réunion : Bringing Out Eruptive Precursors

    NASA Astrophysics Data System (ADS)

    Durand, V.; Le Bouteiller, P.; Mangeney, A.; Ferrazzini, V.; Kowalski, P.; Lauret, F.; Brunet, C.

    2015-12-01

    On Piton de la Fournaise volcano, La Réunion island, continuous seismic recordings allow the extraction of signals associated with rockfalls occurring inside the Dolomieu crater. Using the OVPF catalog, we have investigated these seismic signals in order to find how their characteristics relate to the physical characteristics of rockfalls. Here, we analyze such seismic signals from a statistical viewpoint. Data are first taken from an 8-month period that includes an eruption, from January to August 2014. For all the seismic signals associated with rockfalls in this period, 14 seismic and physical attributes are retrieved, allowing us to perform the statistical method known as Principal Component Analysis (PCA). It is applied three times in this study: firstly with all 14 attributes, to highlight the main features that outline the data, which we find to be duration and seismic energy, as well as waveform and frequency content. The second PCA, based on 6 attributes, leads to the definition of several physical types of rockfalls via the k-means clustering method. Lastly, PCA and k-means clustering are applied to 6 different, easily computed seismic attributes, and reveal a specific behavior of one cluster of events just before the June 20th eruption. Based on this finding, 15 easily retrievable numerical attributes are defined from the specific cluster identified in the 2014 study and tested on 2 other datasets: from January to July 2015, they detect the approach of the February 4th, May 17th and July 31st eruptions; from August to December 2010, our attributes show precursory variations a few days before the October 14th and December 9th eruptions. We highlight the increase in a specific type of seismic event shortly before an eruption; we believe these events have a volcano-tectonic source but are hard to distinguish from rockfalls in the seismic signals.
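A minimal stand-in for the PCA plus k-means pipeline described above, using synthetic event attributes rather than the OVPF data (the real study used 14 measured seismic and physical attributes; here two synthetic clusters stand in for physical rockfall types):

```python
import numpy as np

rng = np.random.default_rng(0)
# toy attribute matrix: 60 events x 4 attributes (e.g. duration, energy, ...)
X = np.vstack([rng.normal(0, 1, (30, 4)), rng.normal(4, 1, (30, 4))])

# PCA: standardize, eigendecompose the covariance, project on top components
Xs = (X - X.mean(0)) / X.std(0)
eigval, eigvec = np.linalg.eigh(np.cov(Xs.T))
order = np.argsort(eigval)[::-1]
scores = Xs @ eigvec[:, order[:2]]          # first two principal components

# naive k-means (k=2) on the PCA scores
centers = scores[[0, -1]]
for _ in range(20):
    labels = np.argmin(((scores[:, None] - centers[None]) ** 2).sum(-1), axis=1)
    centers = np.array([scores[labels == k].mean(0) for k in range(2)])

print(len(set(labels.tolist())))  # → 2
```

On real data the interesting step is the last one: tracking how event counts per cluster evolve in time ahead of an eruption.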

  19. Seismic clusters analysis in North-Eastern Italy by the nearest-neighbor approach

    NASA Astrophysics Data System (ADS)

    Peresan, Antonella; Gentili, Stefania

    2016-04-01

    The main features of earthquake clusters in the Friuli Venezia Giulia (FVG) region (North-Eastern Italy) are explored, with the aim of gaining new insight into local-scale patterns of seismicity in the area. The study is based on a systematic analysis of robustly and uniformly detected seismic clusters of small-to-medium magnitude events, as opposed to the selected clusters analyzed in earlier studies. To characterize the features of seismicity in FVG, we take advantage of updated information from local OGS bulletins, compiled at the National Institute of Oceanography and Experimental Geophysics, Centre of Seismological Research, since 1977. A preliminary reappraisal of the earthquake bulletins is carried out in order to identify possible missing events and to remove spurious records (e.g., duplicates and explosions). The area of sufficient completeness is outlined; for this purpose, different techniques are applied, including a comparative analysis with global ISC data, which are available in the region for large and moderate-size earthquakes. Various techniques are considered to estimate the average parameters that characterize earthquake occurrence in the region, including the b-value and the fractal dimension of the epicenter distribution. Specifically, besides the classical Gutenberg-Richter Law, the Unified Scaling Law for Earthquakes, USLE, is applied. Using the updated and revised OGS data, a new formal method for the detection of earthquake clusters, based on nearest-neighbor distances of events in the space-time-energy domain, is applied. The bimodality of the distribution of earthquake nearest-neighbor distances is used to decompose the seismic catalog into sequences of individual clusters and background seismicity. Accordingly, the method allows for a data-driven identification of main shocks (the first event with the largest magnitude in the cluster), foreshocks and aftershocks. Average robust estimates of the USLE parameters (particularly, b
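The nearest-neighbor distance used for cluster detection here is a space-time-energy proximity of the kind introduced in Zaliapin-style analyses; a toy version, with an assumed b-value and fractal dimension and a four-event hypothetical catalog, might look like:

```python
import math

# toy catalog: (time in years, x, y in km, magnitude)
catalog = [
    (0.00,  0.0,  0.0, 4.5),   # parent event
    (0.01,  1.0,  0.5, 2.1),   # close in space-time: likely aftershock
    (0.02,  0.8,  0.2, 1.8),
    (2.50, 80.0, 60.0, 2.3),   # far away in space-time: background event
]

B, DF = 1.0, 1.6  # assumed b-value and fractal dimension of epicenters

def proximity(parent, child):
    """Space-time-energy nearest-neighbor distance eta = t * r^DF * 10^(-B*m)."""
    t = child[0] - parent[0]
    r = math.hypot(child[1] - parent[1], child[2] - parent[2])
    return t * (r ** DF) * 10 ** (-B * parent[3]) if t > 0 else math.inf

etas = []
for j in range(1, len(catalog)):
    eta = min(proximity(catalog[i], catalog[j]) for i in range(j))
    etas.append(eta)
    print(f"event {j}: eta = {eta:.2e}")
```

A threshold in the bimodal distribution of eta then separates clustered events (small eta) from background seismicity (large eta).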

  20. Insights into bedrock surface morphology using low-cost passive seismic surveys and integrated geostatistical analysis.

    PubMed

    Trevisani, S; Boaga, J; Agostini, L; Galgaro, A

    2017-02-01

    The HVSR (Horizontal to Vertical Spectral Ratio) technique is very popular in the context of seismic microzonation and for the mapping of shallow seismic reflectors, such as the sediment/bedrock transition surface. This easy-to-deploy single station passive seismic technique permits the collection of a considerable amount of HVSR data in a cost-effective way. It is not surprising that some recent studies have adopted single station micro-tremor analyses in order to retrieve information on geological structures in 1D, 2D or even 3D reconstructions. However, the interpolation approaches followed in these studies for extending the punctual HVSR data spatially are not supported by a detailed spatial statistical analysis. Conversely, in order to exploit the informative content and quantify the related uncertainty of HVSR data it is necessary to utilize a deep spatial statistical analysis and objective interpolation approaches. Moreover, the interpolation approach should make it possible to use expert knowledge and auxiliary information. Accordingly, we present an integrated geostatistical approach applied to HVSR data, collected for retrieving information on the morphology of a buried bedrock surface. The geostatistical study is conducted on an experimental dataset of 116 HVSR data collected in a small thermal basin located in the Venetian Plain (Caldiero Basin, N-E Italy). The explorative geostatistical analysis of the data coupled with the use of interpolation kriging techniques permit the extraction of relevant information on the resonance properties of the subsoil. The utilized approach, based on kriging with external drift (or its extension, i.e. regression kriging), permits the researcher to take into account auxiliary information, evaluate the related prediction uncertainty, and highlight abrupt variations in subsoil resonance frequencies. The results of the analysis are discussed, also with reflections pertaining to the geo-engineering and geo
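One common way to turn a measured HVSR resonance frequency into the depth information that is then interpolated spatially is the quarter-wavelength relation h = Vs / (4·f0). The sketch below assumes a uniform sediment shear-wave velocity, which is a strong simplification of what such studies calibrate.

```python
def bedrock_depth(f0_hz, vs_m_s=400.0):
    """Sediment thickness from the quarter-wavelength relation h = Vs / (4 f0).

    vs_m_s is an assumed average shear-wave velocity of the sediment cover.
    """
    return vs_m_s / (4.0 * f0_hz)

for f0 in (1.0, 2.5, 8.0):
    print(f"f0 = {f0:4.1f} Hz -> depth ~ {bedrock_depth(f0):5.1f} m")
```

Low resonance frequencies map to thick cover, high frequencies to shallow bedrock; kriging with external drift can then use such converted depths, or auxiliary borehole data, as the drift variable.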

  1. Seismic body wave separation in volcano-tectonic activity inferred by the Convolutive Independent Component Analysis

    NASA Astrophysics Data System (ADS)

    Capuano, Paolo; De Lauro, Enza; De Martino, Salvatore; Falanga, Mariarosaria; Petrosino, Simona

    2015-04-01

    One of the main challenges in the volcano-seismological literature is locating and characterizing the source of volcano-tectonic seismic activity. This requires identifying at least the onsets of the main phases, i.e., the body waves. Many efforts have been made to achieve a clear separation of P and S phases, both from a theoretical point of view and by developing numerical algorithms suitable for specific cases (see, e.g., Küperkoch et al., 2012). Recently, a robust automatic procedure has been implemented for extracting the prominent seismic waveforms from continuously recorded signals, thus allowing the main phases to be picked. The intuitive notion of maximum non-gaussianity is achieved by adopting techniques that involve higher-order statistics in the frequency domain, i.e., Convolutive Independent Component Analysis (CICA). This technique is successful in the blind source separation of convolutive mixtures. In the seismological framework, seismic signals are thought of as the convolution of a source function with path, site and instrument responses. In addition, time-delayed versions of the same source exist, due to multipath propagation typically caused by reverberations from some obstacle. In this work, we focus on the volcano-tectonic (VT) activity at Campi Flegrei caldera (Italy) during the 2006 ground uplift (Ciaramella et al., 2011). The activity comprised approximately 300 low-magnitude VT earthquakes (Md < 2; for the definition of duration magnitude, see Petrosino et al., 2008). Most of them were concentrated in distinct seismic sequences with hypocenters clustered mainly beneath the Solfatara-Accademia area, at depths ranging between 1 and 4 km b.s.l. The obtained results show a clear separation of P and S phases: the technique not only allows identification of the S-P time delay, giving the timing of both phases, but also provides the independent waveforms of the P and S phases. This is an enormous

  2. Vulnerability analysis and passenger source prediction in urban rail transit networks.

    PubMed

    Wang, Junjie; Li, Yishuai; Liu, Jingyu; He, Kun; Wang, Pu

    2013-01-01

    Based on large-scale human mobility data collected in San Francisco and Boston, the morning peak urban rail transit (URT) ODs (origin-destination matrices) were estimated and the most vulnerable URT segments, those capable of causing the largest service interruptions, were identified. In both URT networks, a few highly vulnerable segments were observed. For this small group of vital segments, the impact of failure must be carefully evaluated. A bipartite URT usage network was developed and used to determine the inherent connections between urban rail transits and their passengers' travel demands. Although passengers' origins and destinations were easy to locate for a large number of URT segments, a few show very complicated spatial distributions. Based on the bipartite URT usage network, a new layer of understanding of a URT segment's vulnerability can be achieved by taking into account the difficulty of addressing the failure of a given segment. As a proof of concept, the possible transfer of passenger flow to the road network is predicted for failures of two representative URT segments in San Francisco.
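The segment-vulnerability idea, ranking segments by the travel demand they carry, can be sketched with a toy OD set. The trips and segment IDs below are hypothetical, not the San Francisco or Boston data.

```python
from collections import Counter

# toy OD trips: (origin, destination, URT segments the trip traverses)
trips = [
    ("A", "C", ["s1", "s2"]),
    ("A", "D", ["s1", "s2", "s3"]),
    ("B", "D", ["s2", "s3"]),
    ("E", "F", ["s4"]),
]

# segment load = number of trips interrupted if that segment fails
load = Counter(seg for _, _, segs in trips for seg in segs)
most_vulnerable, n_affected = load.most_common(1)[0]
print(most_vulnerable, n_affected)  # → s2 3
```

The bipartite usage network in the paper adds a second layer: linking each segment to the spatial distribution of the origins and destinations it serves, which determines how hard a failure is to absorb.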

  4. Seismic hazard assessment based on the Unified Scaling Law for Earthquakes: the Greater Caucasus

    NASA Astrophysics Data System (ADS)

    Nekrasova, A.; Kossobokov, V. G.

    2015-12-01

    Losses from natural disasters continue to increase, mainly due to poor understanding, by the majority of the scientific community, decision makers and the public, of the three components of Risk, i.e., Hazard, Exposure, and Vulnerability. Contemporary Science has not coped with the challenging changes in Exposure and Vulnerability inflicted by a growing and increasingly concentrated population, which results in a steady increase of losses from natural hazards. Scientists owe Society for this lack of knowledge, education, and communication. In fact, Contemporary Science can do a better job of disclosing natural hazards, assessing risks, and delivering such knowledge in advance of catastrophic events. We continue applying the general concept of seismic risk analysis in a number of seismic regions worldwide by constructing regional seismic hazard maps based on the Unified Scaling Law for Earthquakes (USLE), i.e., log N(M,L) = A - B·(M-6) + C·log L, where N(M,L) is the expected annual number of earthquakes of a certain magnitude M within a seismically prone area of linear dimension L. The parameters A, B, and C of the USLE are used to estimate, first, the expected maximum magnitude in a time interval at each seismically prone cell of a uniform grid covering the region of interest, and then the corresponding expected ground-shaking parameters, including macroseismic intensity. After rigorous testing against the available seismic evidence from the past (e.g., historically reported macroseismic intensity), such a seismic hazard map is used to generate maps of specific earthquake risks (e.g., those based on the density of exposed population). The methodology of seismic hazard and risk assessment based on the USLE is illustrated by its application to the Greater Caucasus seismic region.
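The USLE recurrence relation quoted above can be evaluated directly once its parameters are fitted. The coefficients below are illustrative placeholders, not the fitted Greater Caucasus values.

```python
import math

def usle_rate(M, L_km, A=-2.0, B=0.9, C=1.2):
    """Expected annual number of earthquakes of magnitude >= M in an area of
    linear dimension L_km, per log N(M,L) = A - B*(M-6) + C*log10(L).
    A, B, C are assumed illustrative coefficients, not fitted values."""
    return 10 ** (A - B * (M - 6.0) + C * math.log10(L_km))

# rate falls steeply with magnitude and grows with the size of the cell
print(f"{usle_rate(5.0, 100):.2f}")   # moderate events in a 100-km cell
print(f"{usle_rate(7.0, 100):.4f}")   # large events are far rarer
```

Inverting the relation for a target annual probability gives the expected maximum magnitude per grid cell, the first step of the mapping procedure described above.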

  5. Seismic Studies

    SciTech Connect

    R. Quittmeyer

    2006-09-25

    This technical work plan (TWP) describes the efforts to develop and confirm seismic ground motion inputs used for preclosure design and probabilistic safety analyses and to assess the postclosure performance of a repository at Yucca Mountain, Nevada. As part of the effort to develop seismic inputs, the TWP covers testing and analyses that provide the technical basis for inputs to the seismic ground-motion site-response model. The TWP also addresses preparation of a seismic methodology report for submission to the U.S. Nuclear Regulatory Commission (NRC). The activities discussed in this TWP are planned for fiscal years (FY) 2006 through 2008. Some of the work enhances the technical basis for previously developed seismic inputs and reduces uncertainties and conservatism used in previous analyses and modeling. These activities support the defense of a license application. Other activities provide new results that will support development of the preclosure safety case; these results directly support and will be included in the license application. Table 1 indicates which activities support the license application and which support licensing defense. The activities are listed in Section 1.2; the methods and approaches used to implement them are discussed in more detail in Section 2.2. Technical and performance objectives of this work scope are: (1) For annual ground motion exceedance probabilities appropriate for preclosure design analyses, provide site-specific seismic design acceleration response spectra for a range of damping values; strain-compatible soil properties; peak motions, strains, and curvatures as a function of depth; and time histories (acceleration, velocity, and displacement). Provide seismic design inputs for the waste emplacement level and for surface sites.
Results should be consistent with the probabilistic seismic hazard analysis (PSHA) for Yucca Mountain and reflect, as appropriate, available knowledge on the limits to extreme ground motion at

  6. OVERVIEW ON BNL ASSESSMENT OF SEISMIC ANALYSIS METHODS FOR DEEPLY EMBEDDED NPP STRUCTURES.

    SciTech Connect

    XU,J.; COSTANTINO, C.; HOFMAYER, C.; GRAVES, H.

    2007-04-01

    A study was performed by Brookhaven National Laboratory (BNL) under the sponsorship of the U.S. Nuclear Regulatory Commission (USNRC) to determine the applicability of established soil-structure interaction (SSI) analysis methods and computer programs to deeply embedded and/or buried (DEB) nuclear power plant (NPP) structures. This paper provides an overview of the BNL study, including a description and discussion of analyses performed to assess the relative performance of various SSI analysis methods typically applied to NPP structures, as well as the importance of interface modeling for DEB structures. There are four main elements in the BNL study: (1) review and evaluation of existing seismic design practice; (2) assessment of simplified vs. detailed methods for SSI in-structure response spectrum analysis of DEB structures; (3) assessment of methods for computing seismically induced earth pressures on DEB structures; and (4) development of criteria for benchmark problems that could be used for validating computer programs that compute seismic responses of DEB NPP structures. The BNL study concluded that the equivalent linear SSI methods, including both simplified and detailed approaches, can be extended to DEB structures and produce acceptable SSI response calculations, provided that the SSI response induced by the ground motion is well within the linear regime or the non-linear effect is not anticipated to control the SSI response parameters. The BNL study also revealed that the response calculation is sensitive to the modeling assumptions made for the soil/structure interface and to the application of a particular material model for the soil.

  7. Attenuation of seismic waves obtained by coda waves analysis in the West Bohemia earthquake swarm region

    NASA Astrophysics Data System (ADS)

    Bachura, Martin; Fischer, Tomas

    2014-05-01

    with depth, where 1/Qc seems to be frequency independent in the depth range of the upper lithosphere. Lateral changes of 1/Qc were also reported: it decreases in the south-west direction from the Novy Kostel focal zone, where the attenuation is highest. Results from more advanced methods that allow separation of scattering and intrinsic loss show that intrinsic loss is the dominant factor in the attenuation of seismic waves in the region. Determination of the attenuation due to scattering remains ambiguous because of the small hypocentral distances available for the analysis, at which the effects of scattering in the frequency range from 1 to 24 Hz are not significant.
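Coda-Qc estimates like those discussed above are commonly obtained by fitting the single-backscattering decay model A(t) = S * t^-1 * exp(-pi*f*t/Qc) to the band-passed coda envelope. A synthetic, noise-free fit with illustrative parameters:

```python
import math

f = 6.0          # center frequency of the band-passed coda (Hz), assumed
Qc_true = 250.0  # synthetic coda quality factor used to generate the data

# synthetic coda envelope: A(t) = S * t^-1 * exp(-pi*f*t/Qc)
times = [t / 2 for t in range(20, 80)]            # lapse times 10-39.5 s
amps = [100.0 / t * math.exp(-math.pi * f * t / Qc_true) for t in times]

# linearize: ln(A*t) = ln(S) - (pi*f/Qc) * t, then least-squares slope
y = [math.log(a * t) for a, t in zip(amps, times)]
n = len(times)
tbar, ybar = sum(times) / n, sum(y) / n
slope = sum((t - tbar) * (yy - ybar) for t, yy in zip(times, y)) / \
        sum((t - tbar) ** 2 for t in times)
Qc_est = -math.pi * f / slope
print(round(Qc_est))  # → 250
```

On real records the fit is repeated per frequency band and lapse-time window, which is how the frequency and depth dependence of 1/Qc reported above is mapped out.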

  8. Dynamics of the Askja caldera landslide, July 2014, from seismic signal analysis

    NASA Astrophysics Data System (ADS)

    Schöpa, Anne; Burtin, Arnaud; Hovius, Niels; Green, Robert G.

    2016-04-01

A voluminous landslide occurred at the Askja caldera in the Icelandic highlands on July 21st, 2014. The next day, flood marks of at least ten tsunami waves that had reached the northern shore of the caldera lake could be mapped out. The highest flood marks were found up to 60 m above the lake level, close to famous tourist spots, underlining the high hazard potential of the area. Since the landslide happened at night, no direct observations were made of the mass movement or of the subsequent tsunami waves in the caldera lake. We present the analysis of seismic data from a network of 58 seismic stations that recorded the event. The seismic data give valuable information on the triggering, initiation, timing, and propagation of the landslide, with additional details on precursory signals before, and oscillation waves in the caldera lake after, the main landslide. From the set of seismic waveforms, characteristic features were extracted that could be used for early warning purposes. The seismic data reveal that the main slope failure along the southeastern caldera wall was a large, single event starting at 23:24 UTC. The main part of the energy was released in the first two minutes, followed by smaller events, before the background noise level was re-established some 40 minutes after the main failure. Subsequent mass movements, much lower in amplitude, occurred during the following hours. About 20 minutes before the main failure, the background noise level started to rise. Ground velocities were up to three times higher than the background level, with dominant frequencies between 2 and 4 Hz. The increase in background noise level is visible at stations up to 30 km away from the landslide area. This velocity increase is followed by a prominent velocity drop five minutes before the main failure. The spatial distribution of the velocity decrease, centred on the detachment area of the landslide, has an elliptical outline with a long axis oriented NE-SW. 
This
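A background-noise increase of the kind described is commonly flagged with an STA/LTA characteristic function, one candidate for the early-warning features mentioned. Below is a generic sketch on synthetic data; the window lengths and threshold are hypothetical, not those used in the study:

```python
import numpy as np

def sta_lta(x, nsta, nlta):
    """Short-term-average / long-term-average ratio of signal energy.
    Both windows end at the same sample; ratios well above 1 flag an
    emergent increase in ground-motion amplitude."""
    sq = np.asarray(x, dtype=float) ** 2
    c = np.concatenate(([0.0], np.cumsum(sq)))
    k = np.arange(nlta, len(sq) + 1)        # end indices where both windows fit
    sta = (c[k] - c[k - nsta]) / nsta
    lta = (c[k] - c[k - nlta]) / nlta
    return sta / lta

# synthetic record: quiet noise, then a 5x amplitude increase at sample 3000
rng = np.random.default_rng(42)
x = rng.standard_normal(6000)
x[3000:] *= 5.0
ratio = sta_lta(x, nsta=50, nlta=500)
onset = int(np.argmax(ratio > 4.0)) + 500   # convert back to a sample index
```

Real detectors add bandpass filtering and envelope smoothing, but the trigger logic is the same: declare an onset where the ratio first crosses a threshold.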

  9. Integrating hematite (U-Th)/He dating, microtextural analysis, and thermomechanical modeling to date seismic slip

    NASA Astrophysics Data System (ADS)

    McDermott, R.; Ault, A. K.; Evans, J. P.; Reiners, P. W.; Shuster, D. L.

    2015-12-01

Linking petrologic and geochronologic evidence for seismicity in the rock record is challenging, yet critical for understanding slip mechanics in natural faults, structural histories, and modern seismic hazards. We couple hematite (U-Th)/He (HeHe) dating with microtextural analysis and thermomechanical modeling to decipher this record from locally iridescent, hematite-coated fault surfaces in the seismogenic Wasatch fault zone (WFZ), Utah. A prior study of one fault surface linked textural evidence for elevated temperatures with a pattern of HeHe dates to hypothesize that this surface preserves evidence of multiple seismic slip events. New scanning electron microscopy (SEM) and HeHe data from a larger sample suite test this hypothesis. The SEM images reveal the presence of <500 nm polygonal hematite crystals in some iridescent regions, suggesting co- to post-seismic hematite annealing and recrystallization at temperatures >800 °C. Fault surface samples yield dates of 3.8 ± 0.03 to 1.5 ± 0.1 Ma, with younger dates in iridescent regions. These results are younger than the 88.5 ± 15.0 Ma and 10.8 ± 0.8 Ma dates from veins associated with initial hematite mineralization, and younger than new apatite (U-Th)/He dates of 4.0 ± 0.6 to 5.4 ± 1.1 Ma that constrain the footwall thermal history. Reproducible but statistically different HeHe dates from samples on the same fault surface are consistent with prior observations. Collectively, these observations suggest that hematite He dates record rapid cooling from localized shear heating at asperities to temperatures hot enough to reset the hematite He system. Our models incorporate rate-dependent friction and half-space cooling to constrain shear zone temperature evolution. Results reveal that temperatures >800 °C are sufficient to reset hematite up to 200 μm from the fault surface and that HeHe dates may represent patches of rate-strengthening friction during seismic slip. Ongoing work utilizes SEM to target aliquots with textural evidence for

  10. San Juan single-well seismic data analysis and modeling study

    SciTech Connect

    Daley, Tom; Wu, C.; Harris, J.M.; Daley, T.M.; Majer, E.L.

    2004-02-26

The authors analyze single-well seismic data from the San Juan basin in northwest New Mexico. The consistently observable events are tube-waves: direct, reflected and multiple tube-waves can be explained by the formation properties and survey geometry, except for an anomalous zone with low velocity, high amplitude and horizontal polarization. To aid the data analysis, forward modeling is performed using a variable-grid finite-difference parallel code. The numerical results confirm the identified events in the field observations.
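The forward-modeling step can be illustrated with a much-reduced analogue of such a scheme: a 1-D constant-velocity acoustic finite-difference stencil, second order in time and space. This is a toy sketch with hypothetical parameters, not the variable-grid parallel code used by the authors:

```python
import numpy as np

def fd_wave_1d(nx=301, nt=200, dx=5.0, dt=0.001, c=2000.0):
    """Second-order finite-difference solution of the 1-D acoustic wave
    equation u_tt = c^2 u_xx with fixed (reflecting) end points.
    Stability requires the Courant number c*dt/dx <= 1 (here 0.4)."""
    r2 = (c * dt / dx) ** 2                 # squared Courant number
    u_prev = np.zeros(nx)
    u = np.zeros(nx)
    u[nx // 2] = 1.0                        # impulsive source at the centre
    for _ in range(nt):
        lap = np.zeros(nx)
        lap[1:-1] = u[2:] - 2.0 * u[1:-1] + u[:-2]
        u_prev, u = u, 2.0 * u - u_prev + r2 * lap
    return u

wavefield = fd_wave_1d()
```

Production codes extend this same stencil to 2-D/3-D, variable velocity and variable grid spacing, and add absorbing boundaries; the update loop above is the numerical core.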

  11. FINITE ELEMENT ANALYSIS OF JNES/NUPEC SEISMIC SHEAR WALL CYCLIC AND SHAKING TABLE TEST DATA.

    SciTech Connect

    XU,J.; NIE, J.; HOFMAYER, C.; ALI, S.

    2007-04-12

This paper describes a finite element analysis to predict the JNES/NUPEC cyclic and shaking table RC shear wall test data, performed as part of a collaborative agreement between the U.S. NRC and JNES to study seismic issues important to the safe operation of commercial nuclear power plant (NPP) structures, systems and components (SSC). The analyses described in this paper were performed using ANACAP reinforced concrete models. The paper describes the ANACAP analysis models and discusses the comparisons of the analyses with the test data. The ANACAP capability for modeling the nonlinear cyclic characteristics of reinforced concrete shear wall structures was confirmed by the close agreement between the ANACAP analysis results and the JNES/NUPEC cyclic test data. Reasonable agreement between the analysis results and the test data was demonstrated for the hysteresis loops and the shear force orbits, in terms of both the overall shape and the cycle-to-cycle comparisons. An ANACAP simulation of the JNES/NUPEC shaking table test was also performed, which demonstrated that the ANACAP dynamic analysis with the concrete material model is able to capture the progressive degrading behavior of the shear wall indicated by the test data. The ANACAP analysis also predicted the incipient failure of the shear wall reasonably close to the actual failure declared for the test specimen. In summary, the analyses of the JNES/NUPEC cyclic and shaking table RC shear wall tests presented in this paper demonstrate the state-of-the-art analysis capability for determining the seismic capacity of RC shear wall structures.

  12. Topological performance measures as surrogates for physical flow models for risk and vulnerability analysis for electric power systems.

    PubMed

    LaRocca, Sarah; Johansson, Jonas; Hassel, Henrik; Guikema, Seth

    2015-04-01

    Critical infrastructure systems must be both robust and resilient in order to ensure the functioning of society. To improve the performance of such systems, we often use risk and vulnerability analysis to find and address system weaknesses. A critical component of such analyses is the ability to accurately determine the negative consequences of various types of failures in the system. Numerous mathematical and simulation models exist that can be used to this end. However, there are relatively few studies comparing the implications of using different modeling approaches in the context of comprehensive risk analysis of critical infrastructures. In this article, we suggest a classification of these models, which span from simple topologically-oriented models to advanced physical-flow-based models. Here, we focus on electric power systems and present a study aimed at understanding the tradeoffs between simplicity and fidelity in models used in the context of risk analysis. Specifically, the purpose of this article is to compare performance estimates achieved with a spectrum of approaches typically used for risk and vulnerability analysis of electric power systems and evaluate if more simplified topological measures can be combined using statistical methods to be used as a surrogate for physical flow models. The results of our work provide guidance as to appropriate models or combinations of models to use when analyzing large-scale critical infrastructure systems, where simulation times quickly become insurmountable when using more advanced models, severely limiting the extent of analyses that can be performed.
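The simplest end of the model spectrum described here, a purely topological performance measure, can be illustrated by computing the fraction of nodes left in the largest connected component after component failures. This is a generic sketch on a toy network, not the authors' models or data:

```python
def largest_component_fraction(nodes, edges, failed=()):
    """Size of the largest connected component after removing failed nodes,
    as a fraction of the original node count -- a simple topological
    performance measure for vulnerability screening."""
    alive = set(nodes) - set(failed)
    adj = {n: set() for n in alive}
    for u, v in edges:
        if u in alive and v in alive:
            adj[u].add(v)
            adj[v].add(u)
    seen, best = set(), 0
    for start in alive:                 # depth-first search per component
        if start in seen:
            continue
        stack, comp = [start], 0
        while stack:
            n = stack.pop()
            if n in seen:
                continue
            seen.add(n)
            comp += 1
            stack.extend(adj[n] - seen)
        best = max(best, comp)
    return best / len(nodes)

# toy 5-bus network: removing node 2 splits the system into two islands
nodes = [0, 1, 2, 3, 4]
edges = [(0, 1), (1, 2), (2, 3), (3, 4)]
print(largest_component_fraction(nodes, edges, failed=[2]))  # -> 0.4
```

A physical-flow model would instead re-dispatch power and check line limits after each failure; the article's question is when cheap measures like this one (possibly combined statistically) approximate that costlier answer.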

  13. Performance Analysis of Tandem-L Mission for Modeling Volcanic and Seismic Deformation Sources

    NASA Astrophysics Data System (ADS)

    Ansari, Homa; Goel, Kanika; Parizzi, Alessandro; Sudhaus, Henriette; Adam, Nico; Eineder, Michael

    2015-04-01

Although a great number of publications have focused on the application of InSAR to deformation source modeling, as well as on the development of different algorithms in this regard, little investigation has been dedicated to the sensitivity analysis of InSAR in deformation source modeling. Our purpose is to address this issue by analyzing the reliability of InSAR in modeling the deformation sources of landslides and of seismic and volcanic activity, with special focus on L-band SAR measurements. The sensitivity analysis considers three commonly used geophysical models for subsidence, seismic and volcanic activity: the Gaussian subsidence bowl, the Okada source and the Mogi point source, respectively. In each case, the InSAR sensitivity is analytically formulated and its performance is investigated using simulated SAR data. The investigations are carried out using stochastic error propagation approaches to infer the precision of the models' parameters as well as their mutual covariance. The limiting factors in SAR interferometry are categorized into two groups and investigated separately in the sensitivity analysis: the first deals with the geometrical limits imposed by the side-looking geometry of SAR measurements, and the second focuses on the InSAR stochastic characteristics in the L band.
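For context, the Mogi point source mentioned here has a closed-form surface displacement that is simple to evaluate: u_r = (1-nu)*dV*r/(pi*R^3) and u_z = (1-nu)*dV*d/(pi*R^3) with R^2 = r^2 + d^2, the standard elastic half-space approximation. The sketch below uses hypothetical numbers, not values from the paper:

```python
import math

def mogi_displacement(r, depth, dV, nu=0.25):
    """Surface displacement from a Mogi point pressure source in an
    elastic half-space. r: horizontal distance from the source axis (m);
    depth: source depth (m); dV: volume change (m^3); nu: Poisson ratio.
    Returns (u_r, u_z) in metres."""
    R3 = (r * r + depth * depth) ** 1.5
    c = (1.0 - nu) * dV / math.pi
    return c * r / R3, c * depth / R3

# hypothetical example: 1e6 m^3 inflation at 4 km depth, observed 2 km off-axis
ur, uz = mogi_displacement(2000.0, 4000.0, 1e6)
```

A sensitivity analysis of the kind described differentiates such forward models with respect to the source parameters and propagates the InSAR measurement covariance through those derivatives.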

  14. Multi Canister Overpack (MCO) Handling Machine Independent Review of Seismic Structural Analysis

    SciTech Connect

    SWENSON, C.E.

    2000-09-22

The following separate reports and correspondence pertain to the independent review of the seismic analysis. The original analysis was performed by GEC-Alsthom Engineering Systems Limited (GEC-ESL) under subcontract to Foster-Wheeler Environmental Corporation (FWEC), the prime integration contractor to the Spent Nuclear Fuel Project for the Multi-Canister Overpack (MCO) Handling Machine (MHM). The original analysis was performed to the Design Basis Earthquake (DBE) response spectra using 5% damping, as required in specification HNF-S-0468, for the 90% Design Report in June 1997. The independent review was performed by Fluor-Daniel (Irvine) under a task separate from their scope as Architect-Engineer of the Canister Storage Building (CSB) in 1997. The comments were issued in April 1998. Later in 1997, the response spectra of the CSB were revised according to a new soil-structure interaction analysis; the response spectra for the MHM were revised accordingly, using 7% damping in accordance with American Society of Mechanical Engineers (ASME) NOG-1, ''Rules for Construction of Overhead and Gantry Cranes (Top Running Bridge, Multiple Girder).'' The analysis was re-performed to check critical areas, but because manufacturing was underway, designs were not altered unless necessary. FWEC responded to SNF Project correspondence on the review comments in the two separate letters enclosed. The dispositions were reviewed and accepted. Attached are supplier source surveillance reports on the procedures and process used by the engineering group performing the analysis and structural design. All calculation and analysis results are contained in the MHM Final Design Report, which is part of Vendor Information File 50100. Subsequent to the MHM supplier engineering analysis, there were separate analyses for nuclear safety accident concerns that used the electronic input data files provided by FWEC/GEC-ESL and are contained in document SNF-6248

  15. InSAR Analysis of Induced Seismicity: Examples From Southern Colorado

    NASA Astrophysics Data System (ADS)

    Barnhart, W. D.

    2015-12-01

We present interferometric synthetic aperture radar (InSAR) analysis of human-induced ground deformation in the Raton Basin of southern Colorado and northern New Mexico, including displacements from a wastewater injection-induced earthquake. Geodetic observations of both seismic and aseismic surface displacements provide an additional tool to further constrain spatially and temporally variable deformation within these basins. Using Envisat observations, we image co-seismic surface displacements of the 2011 Trinidad earthquake and find that the earthquake slipped within the crystalline basement underlying the basin sedimentary rocks, in the vicinity of high-volume wastewater injection wells. The spatial and temporal separation between the earthquake and the wastewater wells suggests a pore pressure migration triggering mechanism. The finite slip distributions further highlight the location and orientation of previously unmapped seismogenic faults. Lastly, the precise earthquake location afforded by InSAR observations provides a well-located earthquake source that can be used to calibrate other regional earthquake locations. Additionally, we derive InSAR time series observations from ALOS imagery acquired from 2007-2011. These results highlight ongoing regions of surface subsidence within the basin, presumably caused by extraction of coal-bed methane and water that is later reinjected. While it is not clear whether there is a causative relationship between regions of co-located surface subsidence and recorded earthquakes, the time series permits us to exclude several other hypotheses for the causes of increased seismicity in the Raton Basin, including volcanic activity related to the Rio Grande Rift. Furthermore, the InSAR time series analysis provides a calibration source for hydrological models that assess subsurface stress changes from the removal and injection of fluids. 
Forthcoming work will provide a detailed time series of surface deformation occurring

  16. CyberShake: Broadband Physics-Based Probabilistic Seismic Hazard Analysis in Southern California

    NASA Astrophysics Data System (ADS)

    Callaghan, S.; Maechling, P. J.; Milner, K.; Graves, R. W.; Donovan, J.; Wang, F.; Jordan, T. H.

    2012-12-01

Researchers at the Southern California Earthquake Center (SCEC) have developed and used the CyberShake computational platform to perform probabilistic seismic hazard analysis (PSHA) in the Los Angeles region (Graves et al., 2010), using deterministic wave propagation simulations at frequencies up to 0.5 Hz, combined with stochastic methods, to produce broadband seismograms up to 10 Hz. CyberShake uses seismic reciprocity to calculate synthetic seismograms for a suite of more than 600,000 rupture realizations. From this set of seismograms we compute intensity measures, which are then combined into a PSHA hazard curve for the site of interest at various periods. With the CyberShake computational platform, we have computed broadband hazard curves for locations around Southern California, including precariously balanced rock sites and locations of Southern California Seismic Network stations. Additionally, for each location we calculated hazard curves with two different community velocity models, Community Velocity Model - Harvard (CVM-H) v11.2 and Community Velocity Model - SCEC (CVM-S) v11.2. At lower frequencies, hazard levels computed with CVM-H for sites within the deep LA basin are lower than those computed with CVM-S, whereas sites within the Ventura basin show the opposite trend. We interpret these results to be related to the underlying nature of the velocity models, which we are continuing to investigate. At higher frequencies, the CyberShake results tend to be lower than hazard levels computed with traditional ground motion prediction equations (GMPEs), possibly due to the long tails of the GMPEs; we will report on these results. Additionally, we will describe ways these results are being used by the SCEC community, such as in earthquake early warning, precarious rock analysis, and directivity-basin coupling.
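The step from simulated seismograms to a hazard curve can be illustrated with a toy calculation: for each intensity-measure (IM) threshold, sum the annual rates of all rupture realizations whose simulated IM exceeds it. This is a deliberately simplified, generic sketch with made-up numbers, not the CyberShake computation itself:

```python
import math

def hazard_curve(im_values, rates, thresholds):
    """Empirical hazard curve: annual exceedance rate at each IM threshold,
    summing the annual occurrence rates of all rupture realizations whose
    simulated intensity measure exceeds the threshold (one IM per rupture)."""
    return [sum(r for im, r in zip(im_values, rates) if im > x)
            for x in thresholds]

# hypothetical example: 4 rupture realizations with spectral accelerations (g)
ims = [0.05, 0.12, 0.30, 0.55]
rates = [1e-2, 5e-3, 1e-3, 2e-4]            # annual occurrence rates
lam = hazard_curve(ims, rates, [0.1, 0.2, 0.5])
p50 = [1 - math.exp(-l * 50) for l in lam]  # Poisson exceedance prob. in 50 yr
```

The full platform uses many seismograms per rupture and interpolates exceedance probabilities, but the aggregation into a rate-versus-IM curve follows this pattern.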

  17. Ice shelf structure derived from dispersion curve analysis of ambient seismic noise, Ross Ice Shelf, Antarctica

    NASA Astrophysics Data System (ADS)

    Diez, A.; Bromirski, P. D.; Gerstoft, P.; Stephen, R. A.; Anthony, R. E.; Aster, R. C.; Cai, C.; Nyblade, A.; Wiens, D. A.

    2016-05-01

An L-configured, three-component, short-period seismic array was deployed on the Ross Ice Shelf, Antarctica, during November 2014. Polarization analysis of ambient noise data from these stations shows linearly polarized waves for frequency bands between 0.2 and 2 Hz. A spectral peak at about 1.6 Hz is interpreted as the resonance frequency of the water column and is used to estimate the water layer thickness below the ice shelf. The frequency band from 4 to 18 Hz is dominated by Rayleigh and Love waves propagating from the north that, based on daily temporal variations, we conclude were generated by field camp activity. Frequency-slowness plots were calculated using beamforming. The resulting Love and Rayleigh wave dispersion curves were inverted for the shear wave velocity profile within the firn and ice to ~150 m depth. The derived density profile allows estimation of the pore close-off depth and the firn-air content thickness. Separate inversions of the Rayleigh and Love wave dispersion curves give different shear wave velocity profiles within the firn. We attribute this difference to an effective anisotropy due to fine layering. The layered structure of firn, ice, water and the seafloor results in a characteristic dispersion curve below 7 Hz. Forward modelling the observed Rayleigh wave dispersion curves using representative firn, ice, water and sediment structures indicates that Rayleigh waves are observed when wavelengths are long enough to span the distance from the ice shelf surface to the seafloor. The forward modelling shows that analysis of seismic data from an ice shelf provides the possibility of resolving ice shelf thickness, water column thickness and the physical properties of the ice shelf and underlying seafloor using passive-source seismic data.
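The water-column estimate from a spectral peak follows from a one-line resonance relation. Assuming a quarter-wavelength resonance, H = c / (4 * f0), and a nominal sound speed of 1500 m/s (both assumptions of this sketch, not stated in the abstract):

```python
def water_column_thickness(f0, c=1500.0):
    """Water layer thickness from the fundamental resonance frequency f0 (Hz),
    assuming a quarter-wavelength acoustic resonance: H = c / (4 * f0)."""
    return c / (4.0 * f0)

# a 1.6 Hz spectral peak with a nominal 1500 m/s sound speed in seawater
H = water_column_thickness(1.6)
```

The boundary conditions at the ice-water and water-seafloor interfaces determine whether a quarter- or half-wavelength relation applies, so the factor of 4 here is part of the stated assumption.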

  18. Analysis and modeling of high-resolution multicomponent seismic reflection data

    NASA Astrophysics Data System (ADS)

    Guy, Erich D.

The facts that seismic body-wave types are sensitive to different physical properties, that seismic sources radiate polarized waves, and that seismic receivers are sensitive to the polarization of scattered body-waves and coherent noise mean that it is important to consider recording and analyzing different wave-types and data components prior to high-resolution reflection surveys. In this dissertation, important aspects of elastic-wave propagation relevant to high-resolution multicomponent surveying have been analyzed experimentally and numerically, and methodologies have been tested and developed that will improve near-surface imaging and characterization. Factors affecting the ability of common-mode P- and S-wave reflection surveys to map features in the near-surface are described and illustrated through analyses of experimental field data and modeling. It is demonstrated, through comparisons of known subsurface conditions and processed stacked sections, that combined P- and S-wave common-mode reflection information can allow a geologic sequence to be imaged more effectively than by using solely P- or S-wave reflection information. Near-surface mode-converted seismic reflection imaging potential was tested experimentally and evaluated through modeling. Modeling results demonstrate that the potential advantages of near-surface mode-conversion imaging can be realized in theory. Analyses of the acquired multicomponent data, however, demonstrate that mode-conversion imaging could not be accomplished in the field study area, due to the low amplitudes of events and the presence of noise in the field data. Analysis methods are presented that can be used for assessing converted-wave imaging potential in future reflection studies. Factors affecting the ability of SH-wave reflection measurements to effectively image near-surface interfaces and discontinuities are described. 
An SH-wave reflection data analysis workflow is presented that provides a methodology for delineating

  19. Best estimate method versus evaluation method: a comparison of two techniques in evaluating seismic analysis and design. Technical report

    SciTech Connect

    Bumpus, S.E.; Johnson, J.J.; Smith, P.D.

    1980-07-01

The concept of how two techniques, the Best Estimate Method and the Evaluation Method, may be applied to the traditional seismic analysis and design of a nuclear power plant is introduced. Only the four links of the seismic analysis and design methodology chain (SMC)--seismic input, soil-structure interaction, major structural response, and subsystem response--are considered. The objective is to evaluate the compounding of conservatisms in the seismic analysis and design of nuclear power plants, to provide guidance for judgments in the SMC, and to concentrate the evaluation on that part of the seismic analysis and design which is familiar to the engineering community. An example applies the effects of three-dimensional excitations to the model of a nuclear power plant structure. The example demonstrates how conservatisms accrue by coupling two links in the SMC and comparing those results to the effects of one link alone. The utility of employing the Best Estimate Method vs. the Evaluation Method is also demonstrated.

  20. Large-scale experiments for the vulnerability analysis of buildings impacted and intruded by fluviatile torrential hazard processes

    NASA Astrophysics Data System (ADS)

    Sturm, Michael; Gems, Bernhard; Fuchs, Sven; Mazzorana, Bruno; Papathoma-Köhle, Maria; Aufleger, Markus

    2016-04-01

In European mountain regions, losses due to torrential hazards are still considerably high, despite the ongoing debate on an overall increasing or decreasing trend. Recent events in Austria revealed that, for technical and economic reasons, complete protection of settlements in the alpine environment against torrential hazards is not feasible. On the side of the hazard process, events with unpredictable intensities may represent overload scenarios for existing protection structures in the torrent catchments. They bear a particular risk of significant losses in the living space. Although the importance of vulnerability is widely recognised, there is still a research gap concerning its assessment. Currently, potential losses at buildings due to torrential hazards, and their comparison with reinstatement costs, are determined by the use of empirical functions. Hence, relations between process intensities and the extent of losses, gathered from the analysis of historic hazard events and object-specific restoration values, are used. This approach does not represent a physics-based and integral concept, since relevant and often crucial processes, such as the intrusion of the fluid-sediment mixture into elements at risk, are not considered. Based on these findings, our work is targeted at extending the findings and models of present risk research towards an integral, more physics-based vulnerability analysis concept. Fluviatile torrential hazard processes and their impacts on the building envelope are modelled experimentally. Material intrusion processes are thereby explicitly considered. Dynamic impacts are measured quantitatively and spatially distributed by the use of a large set of force transducers. The experimental tests are accomplished with artificial, vertical and skewed plates, including openings for material intrusion. 
Further, the impacts on specific buildings within the test site of the work, the fan apex of the Schnannerbach

  1. Seismic attribute analysis to enhance detection of thin gold-bearing reefs: South Deep gold mine, Witwatersrand basin, South Africa

    NASA Astrophysics Data System (ADS)

    Manzi, M. S. D.; Hein, K. A. A.; Durrheim, R.; King, N.

    2013-11-01

The gold-bearing Upper Elsburg Reef clastic wedge (UER) in the South Deep gold mine in the Witwatersrand basin (South Africa) hosts the highly auriferous basal conglomerate known as the Elsburg Conglomerate (EC) reef. The reef is less than 20 m thick and, together with the quartzite and conglomerate beds in the UER (1-120 m thick), is below the seismic tuning thickness (a quarter of the dominant wavelength), making these units extremely difficult to identify on migrated seismic sections using traditional amplitude interpretation. In order to enhance the detection of the EC reef and its subcrop position against the overlying Ventersdorp Contact Reef (VCR), complex-trace (instantaneous) seismic attributes and volume attribute analysis were applied to prestack time migrated (PSTM) seismic sections. In particular, the instantaneous phase and paraphase allowed clear identification of the continuity of the EC reef, and of the overlapping and interfering wavelets produced by the convergence of the VCR and the EC reef. In addition, these attributes increased confidence in the interpretation of the EC reef, in particular its offsets (faults) and its depth. A high correlation between the seismically determined depth of the EC reef and borehole intersections was observed, with several depth discrepancies below the vertical seismic resolution limit (~ 25 m). This information can now be incorporated into the current mine geological model, thus improving the resource evaluation of the Upper Elsburg Reef in the South Deep gold mine.
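Complex-trace (instantaneous) attributes of the kind applied here derive from the analytic signal of each trace. A generic sketch using a synthetic Ricker wavelet and an FFT-based Hilbert transform (not the software used in the study):

```python
import numpy as np

def instantaneous_attributes(trace):
    """Complex-trace attributes via the analytic signal (FFT-based Hilbert
    transform): returns the envelope (reflection strength) and the
    instantaneous phase of a real-valued trace."""
    n = len(trace)
    spec = np.fft.fft(trace)
    h = np.zeros(n)                 # weights that zero out negative frequencies
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    analytic = np.fft.ifft(spec * h)
    return np.abs(analytic), np.angle(analytic)

# synthetic 30 Hz Ricker wavelet sampled at 1 ms
t = (np.arange(100) - 50) * 0.001
w = (1.0 - 2.0 * (np.pi * 30.0 * t) ** 2) * np.exp(-(np.pi * 30.0 * t) ** 2)
env, phase = instantaneous_attributes(w)
```

Instantaneous phase is amplitude-independent, which is why it can track the continuity of reflectors whose amplitudes fall below tuning, as described for the EC reef.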

  2. Structure of Suasselkä Postglacial Fault in northern Finland obtained by analysis of ambient seismic noise

    NASA Astrophysics Data System (ADS)

    Afonin, Nikita; Kozlovskaya, Elena

    2016-04-01

Understanding the inner structure of seismogenic faults and their ability to reactivate is particularly important in investigating the continental intraplate seismicity regime. In our study we address this problem using analysis of ambient seismic noise recorded by the temporary DAFNE array in the northern Fennoscandian Shield. The main purpose of the DAFNE/FINLAND passive seismic array experiment was to characterize the present-day seismicity of the Suasselkä post-glacial fault (SPGF), which was proposed as one potential target for the DAFNE (Drilling Active Faults in Northern Europe) project. The DAFNE/FINLAND array covered an area of about 20 to 100 km and consisted of 8 short-period and 4 broad-band 3-component autonomous seismic stations installed in the close vicinity of the fault area. The array recorded continuous seismic data from September 2011 to May 2013. Recordings of the array were analyzed in order to identify and locate natural earthquakes from the fault area and to discriminate them from blasts in the Kittilä Gold Mine. As a result, we found several dozen natural seismic events originating from the fault area, which proves that the fault is still seismically active. In order to study the inner structure of the SPGF we use cross-correlation of ambient seismic noise recorded by the array. Analysis of the azimuthal distribution of noise sources demonstrated that, during the time interval under consideration, the distribution of noise sources is close to uniform. The continuous data were processed in several steps, including single-station data analysis, instrument response removal and time-domain stacking. The data were used to estimate empirical Green's functions between pairs of stations in the frequency band of 0.1-1 Hz and to calculate corresponding surface wave dispersion curves. After that, S-wave velocity models were obtained as a result of dispersion curve inversion using the Geopsy software. The results suggest that the area of
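The empirical Green's function estimation rests on cross-correlating simultaneous noise records from station pairs: the correlation peak sits at the inter-station travel time. A minimal synthetic sketch, with a pure time shift standing in for propagation (not the authors' processing chain):

```python
import numpy as np

def noise_crosscorrelation(a, b):
    """Full time-domain cross-correlation of two noise records; stacks of
    such correlations over long time windows approximate the empirical
    Green's function between the two stations."""
    n = len(a)
    c = np.correlate(a, b, mode="full")     # lags -(n-1) .. (n-1)
    lags = np.arange(-(n - 1), n)
    return lags, c

# synthetic: station B records the same noise as A, delayed by 25 samples
rng = np.random.default_rng(0)
noise = rng.standard_normal(2000)
delay = 25
a = noise[:-delay]
b = noise[delay:]
lags, c = noise_crosscorrelation(a, b)
best = int(lags[np.argmax(c)])
```

Real workflows add spectral whitening and temporal normalization before correlating, and stack daily correlations for months; the lag of the stacked peak then yields the surface wave travel time used for dispersion analysis.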

  3. Geodynamic Evolution of Northeastern Tunisia During the Maastrichtian-Paleocene Time: Insights from Integrated Seismic Stratigraphic Analysis

    NASA Astrophysics Data System (ADS)

    Abidi, Oussama; Inoubli, Mohamed Hédi; Sebei, Kawthar; Amiri, Adnen; Boussiga, Haifa; Nasr, Imen Hamdi; Salem, Abdelhamid Ben; Elabed, Mahmoud

    2016-12-01

The Maastrichtian-Paleocene El Haria formation was studied and defined in Tunisia on the basis of outcrops and borehole data; few studies have addressed its three-dimensional extent. In this paper, the El Haria formation is reviewed in the context of a tectono-stratigraphic interval using an integrated seismic stratigraphic analysis based on borehole lithology logs, electrical well logging, well shots, vertical seismic profiles and post-stack surface data. The seismic analysis benefits from appropriate calibration with borehole data, conventional interpretation, velocity mapping, seismic attributes and post-stack model-based inversion. The applied methodology proved to be powerful for characterizing the marly Maastrichtian-Paleocene interval of the El Haria formation. Migrated seismic sections together with borehole measurements are used to detail the three-dimensional changes in thickness, facies and depositional environment in the Cap Bon and Gulf of Hammamet regions during Maastrichtian-Paleocene time. Furthermore, dating based on microfossil content reveals multiple local internal hiatuses within the El Haria formation which are related to the geodynamic evolution of the depositional floor since the Campanian stage. The interpreted seismic sections display concordances, unconformities, pinchouts, sedimentary gaps, incised valleys and syn-sedimentary normal faulting. Based on the seismic reflection geometry and terminations, seven sequences are delineated. These sequences are related to base-level changes resulting from the combination of depositional floor paleo-topography, tectonic forces, subsidence and the developed accommodation space. These factors controlled the occurrence of the various parts of the Maastrichtian-Paleocene interval. 
Detailed examinations of these deposits together with the analysis of the structural deformation at different time periods allowed us to obtain a better understanding of the sediment architecture in depth and the delineation of

  4. Time-lapse seismic waveform modelling and attribute analysis using hydromechanical models for a deep reservoir undergoing depletion

    NASA Astrophysics Data System (ADS)

    He, Y.-X.; Angus, D. A.; Blanchard, T. D.; Wang, G.-L.; Yuan, S.-Y.; Garcia, A.

    2016-04-01

Extraction of fluids from subsurface reservoirs induces changes in pore pressure, leading not only to geomechanical changes, but also to perturbations in seismic velocities and hence observable seismic attributes. Time-lapse seismic analysis can be used to estimate changes in subsurface hydromechanical properties and thus act as a monitoring tool for geological reservoirs. The ability to observe and quantify changes in fluid, stress and strain using seismic techniques has important implications for monitoring risk, not only for petroleum applications but also for geological storage of CO2 and nuclear waste scenarios. In this paper, we integrate hydromechanical simulation results with rock physics models and full-waveform seismic modelling to assess time-lapse seismic attribute resolution for dynamic reservoir characterization and hydromechanical model calibration. The time-lapse seismic simulations use a dynamic elastic reservoir model based on a North Sea deep reservoir undergoing large pressure changes. The time-lapse seismic traveltime shifts and time strains calculated from the modelled and processed synthetic data sets (i.e. pre-stack and post-stack data) are in reasonable agreement with the true earth models, indicating the feasibility of using a 1-D strain rock physics transform and the time-lapse seismic processing methodology. Estimated vertical traveltime shifts for the overburden and the majority of the reservoir are within ±1 ms of the true earth model values, indicating that the time-lapse technique is sufficiently accurate for predicting overburden velocity changes and hence geomechanical effects. Characterization of deeper structure below the overburden becomes less accurate, where more advanced time-lapse seismic processing and migration is needed to handle the complex geometry and strong lateral induced velocity changes. 
Nevertheless, both migrated full-offset pre-stack and near-offset post-stack data image the general features of both the overburden and

  5. Blind Source Separation of Seismic Events with Independent Component Analysis: CTBT related exercise

    NASA Astrophysics Data System (ADS)

    Rozhkov, Mikhail; Kitov, Ivan

    2015-04-01

Blind Source Separation (BSS) methods used in signal recovery applications are attractive because they require minimal a priori information about the signals they deal with. Homomorphic deconvolution and cepstrum estimation are probably the only methods used to any extent in CTBT applications that can be attributed to this branch of technology. However, Expert Technical Analysis (ETA) conducted at the CTBTO to improve the estimated values of the standard signal and event parameters according to the Protocol to the CTBT may face problems that cannot be resolved with certified CTBTO applications and may demand specific techniques not presently in use. The problem considered within the ETA framework is the unambiguous separation of signals with close arrival times. Here, we examine two scenarios of interest: (1) separation of two almost co-located explosions conducted within fractions of a second of each other, and (2) extraction of explosion signals merged with wavetrains from a strong earthquake. Resolving case 1 matters for correct explosion yield estimation. Case 2 is a well-known scenario for conducting clandestine nuclear tests. While the first case can to some degree be approached with cepstral methods, the second can hardly be resolved with the conventional methods implemented at the International Data Centre, especially if the signals have close slowness and azimuth. Independent Component Analysis (in its FastICA implementation), a blind source separation method that assumes non-Gaussianity of the underlying processes in the signal mixture, is applied here to resolve the problems mentioned above. We have tested this technique with synthetic waveforms, seismic data from DPRK explosions and mining blasts conducted within the East European platform, as well as signals from strong teleseismic events (the Sumatra, April 2012, Mw=8.6, and Tohoku, March 2011, Mw=9.0 earthquakes). The data was recorded by seismic arrays of the
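For readers unfamiliar with the technique, the FastICA idea the abstract relies on can be sketched in a few lines of NumPy. Everything below (the two sources, the mixing matrix, the iteration count) is an invented toy example, not CTBT data:

```python
import numpy as np

# Toy FastICA sketch: recover two non-Gaussian sources from linear mixtures.
# Sources, mixing matrix and iteration count are invented for illustration.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 2000)
s1 = np.sin(2 * np.pi * 5 * t)            # narrowband source
s2 = 2 * ((3 * t) % 1) - 1                # sawtooth source
S = np.c_[s1, s2]
A = np.array([[1.0, 0.6], [0.4, 1.0]])    # "unknown" mixing matrix
X = S @ A.T                               # observed two-channel mixture

# Center and whiten the observations.
Xc = X - X.mean(axis=0)
d, E = np.linalg.eigh(np.cov(Xc, rowvar=False))
Z = Xc @ E @ np.diag(d ** -0.5) @ E.T

# Symmetric FastICA iteration with the tanh contrast function.
W = rng.standard_normal((2, 2))
for _ in range(200):
    G = np.tanh(Z @ W.T)
    W_new = (G.T @ Z) / len(Z) - np.diag((1 - G ** 2).mean(axis=0)) @ W
    dk, Ek = np.linalg.eigh(W_new @ W_new.T)      # symmetric decorrelation:
    W = Ek @ np.diag(dk ** -0.5) @ Ek.T @ W_new   # W <- (W W^T)^(-1/2) W

Y = Z @ W.T   # recovered sources, up to permutation, sign and scale
# Correlation of each true source with its best-matching recovered component:
C = np.corrcoef(np.c_[S, Y], rowvar=False)[:2, 2:]
```

Up to permutation, sign and scale, the columns of `Y` reproduce the original sources; in the seismic setting the "channels" would be array recordings containing overlapping wavetrains.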

  6. Analysis of worldwide earthquake mortality using multivariate demographic and seismic data.

    PubMed

    Gutiérrez, E; Taucer, F; De Groeve, T; Al-Khudhairy, D H A; Zaldivar, J M

    2005-06-15

    In this paper, mortality in the immediate aftermath of an earthquake is studied on a worldwide scale using multivariate analysis. A statistical method is presented that analyzes reported earthquake fatalities as a function of a heterogeneous set of parameters selected on the basis of their presumed influence on earthquake mortality. The ensemble was compiled from demographic, seismic, and reported fatality data culled from available records of past earthquakes organized in a geographic information system. The authors consider the statistical relation between earthquake mortality and the available data ensemble, analyze the validity of the results in view of the parametric uncertainties, and propose a multivariate mortality analysis prediction method. The analysis reveals that, although the highest mortality rates are expected in poorly developed rural areas, high fatality counts can result from a wide range of mortality ratios that depend on the effective population size.
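The flavor of multivariate analysis described above can be illustrated with an ordinary least-squares fit on synthetic data; the covariates, coefficients and noise level here are invented, not the paper's ensemble:

```python
import numpy as np

# Hypothetical covariates for 500 synthetic "earthquakes": magnitude,
# log of exposed population, and a development index. The coefficients
# relating them to log-mortality are invented for illustration.
rng = np.random.default_rng(1)
n = 500
mag = rng.uniform(5.0, 8.5, n)
log_pop = rng.uniform(3.0, 7.0, n)
dev_idx = rng.uniform(0.0, 1.0, n)
beta_true = np.array([1.2, 0.9, -2.0])    # assumed effects on log-mortality
intercept = -8.0
log_mort = intercept + np.c_[mag, log_pop, dev_idx] @ beta_true \
           + rng.normal(0, 0.3, n)        # observational noise

# Ordinary least squares with an intercept column.
X = np.c_[np.ones(n), mag, log_pop, dev_idx]
coef, *_ = np.linalg.lstsq(X, log_mort, rcond=None)
```

With enough events the fit recovers the assumed coefficients; a real analysis would add the parametric-uncertainty checks the abstract describes.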

  7. Numerical analysis of seismic events distributions on the planetary scale and celestial bodies astrometrical parameters

    NASA Astrophysics Data System (ADS)

    Bulatova, Dr.

    2012-04-01

Modern research in the Earth sciences is developing from descriptions of individual natural phenomena towards systematic, complex research in interdisciplinary areas. For studies of this kind, in the form of numerical analysis of three-dimensional (3D) systems, the author proposes a space-time technology (STT), based on a Ptolemaic geocentric system, consisting of two modules, each with its own coordinate system: (1) a 3D model of the Earth, whose coordinates are provided by databases of the Earth's events (here, seismic), and (2) a compact model of the relative motion of celestial bodies in space-time as seen from Earth, known as the "Method of a moving source" (MDS), which was developed (Bulatova, 1998-2000) for 3D space. Module (2) was developed as a continuation of the geocentric Ptolemaic system of the world, built on the astronomical parameters of celestial bodies. Based on aggregated data from the space and Earth sciences, their systematization, and joint analysis, this is an attempt to establish a cause-effect relationship between the positions of celestial bodies (Moon, Sun) and the Earth's seismic events.

  8. Investigation of the nonlinear seismic behavior of knee braced frames using the incremental dynamic analysis method

    NASA Astrophysics Data System (ADS)

    Sheidaii, Mohammad Reza; TahamouliRoudsari, Mehrzad; Gordini, Mehrdad

    2016-06-01

In knee braced frames, the braces are attached to a knee element rather than to the intersection of beams and columns. This bracing system is widely used and preferred over other common systems for reasons such as providing lateral stiffness together with adequate ductility, and concentrating damage in secondary elements, which makes repairing and replacing these elements after an earthquake convenient. The lateral stiffness of this system is supplied by the bracing member, and the ductility of the frame is supplied through the bending or shear yielding of the knee member. In this paper, the nonlinear seismic behavior of knee braced frame systems has been investigated using incremental dynamic analysis (IDA), and the effects of the number of stories in a building and of the length and moment of inertia of the knee member on the seismic behavior, elastic stiffness, ductility and probability of failure of these systems have been determined. In the incremental dynamic analysis, after plotting the IDA diagrams of the accelerograms, the collapse diagrams in the limit states are determined. These diagrams show that, for a constant knee length, reducing the moment of inertia raises the probability of collapse in the limit states, and that, for a constant knee moment of inertia, increasing the length likewise increases the probability of collapse.

  9. Analysis of the 2012-2013 Torreperogil-Sabiote seismic swarm

    NASA Astrophysics Data System (ADS)

    Hamdache, M.; Peláez, J. A.; Henares, J.; Damerdji, Y.; Sawires, R.

    2016-10-01

This study analyses the temporal clustering, spatial clustering, and statistics of the 2012-2013 Torreperogil-Sabiote (southern Spain) seismic swarm. During the swarm, more than 2200 events were located, mostly at depths of 2-5 km, with event magnitudes up to mbLg 3.9 (Mw 3.7). On the basis of the daily activity rate, three main temporal phases are identified and analysed. The analysis combines different seismological relationships to improve our understanding of the physical processes related to the swarm's occurrence. Each temporal phase is characterized by its cumulative seismic moment. Using several different approaches, we estimate a catalog completeness magnitude of mc ≅ 1.5. The maximum likelihood b-value estimates for the three swarm phases are 1.11 ± 0.09, 1.04 ± 0.04, and 0.90 ± 0.04, respectively. To test the hypothesis that a decrease in b-value is a precursor to a large event, we study temporal variations in b-value using overlapping moving windows. A relationship can be inferred between changes in b-value and the style of the rupture regime: b-values are indicators of the stress regime and influence the size of ruptures. The fractal dimension D2 is used to perform the spatial analysis. Cumulative gamma and beta functions are used to analyse the behaviour of inter-event distances during the earthquake sequence.
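Maximum-likelihood b-value estimates of the kind quoted above are conventionally computed with the Aki (1965) estimator. A sketch on synthetic Gutenberg-Richter magnitudes (the b-value and completeness magnitude are chosen to mirror the text, not taken from the actual catalog):

```python
import numpy as np

# Aki (1965) maximum-likelihood b-value: b = log10(e) / (mean(M) - Mc).
# Above the completeness magnitude Mc, Gutenberg-Richter magnitudes are
# exponentially distributed with scale log10(e)/b, which we use to simulate.
rng = np.random.default_rng(42)
b_true, mc, n = 1.0, 1.5, 5000
mags = mc + rng.exponential(scale=np.log10(np.e) / b_true, size=n)

b_hat = np.log10(np.e) / (mags.mean() - mc)
b_err = b_hat / np.sqrt(n)   # Aki's standard error of the ML estimate
```

In practice Mc itself must first be estimated (as the abstract does, by several approaches), and a binning correction of Mc - dM/2 is often applied for binned magnitudes.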

  10. Seismic facies analysis of lacustrine system: Paleocene upper Fort Union Formation, Wind River basin, Wyoming

    SciTech Connect

    Liro, L.M.; Pardus, Y.C.

    1989-03-01

The authors interpreted seismic reflection data, supported by well control, to reconstruct the stratigraphic development of Paleocene Lake Waltman in the Wind River basin of Wyoming. After dividing the upper Fort Union into eight seismic sequences, the authors mapped seismic attributes (amplitude, continuity, and frequency) within each sequence. Interpretation of the variation in seismic attributes allowed them to detail delta development and encroachment into Lake Waltman during deposition of the upper Fort Union Formation. These deltas are interpreted as high-energy, well-differentiated lobate forms with distinct clinoform morphology on seismic data. Prograding delta-front facies are easily identified on seismic data as higher amplitude, continuous events within the clinoforms. Seismic data clearly demonstrate the time-transgressive nature of this facies. Downdip of these clinoforms, homogeneous shales, as evidenced by low-amplitude, generally continuous seismic events, accumulated in an interpreted quiet, areally extensive lacustrine setting. Seismic definition of the lateral extent of this lacustrine facies is excellent, allowing them to effectively delineate changes in the lake morphology during deposition of the upper Fort Union Formation. Encasing the upper Fort Union lacustrine deposits are fluvial-alluvial deposits, interpreted from discontinuous, variable-amplitude seismic facies. The authors highlight the correlation of seismic facies data and interpretation with well log data in the Frenchie Draw field to emphasize the accuracy of depositional environment prediction from seismic data.

  11. Observed Seismic Vulnerability of Italian Buildings

    SciTech Connect

    Rota, Maria; Magenes, Guido; Penna, Andrea; Strobbia, Claudio L.

    2008-07-08

A very large database of post-earthquake building inspections, carried out after the main Italian events of the last 30 years, has been processed in order to derive fragility curves for 23 building typologies, mostly referring to masonry structures. The records (more than 91000) of this very complete and homogeneous dataset have been converted into a single damage scale with five levels of damage, plus the case of no damage. For each affected municipality, a value of PGA and of Housner Intensity (I{sub H}) has been evaluated using attenuation laws. Damage probability matrices have then been extracted. These experimental data have been fitted with lognormal fragility curves using an advanced nonlinear regression algorithm that also takes into account the relative reliability of each point via the bootstrap technique. The significant concentration of experimental data at low levels of ground motion, combined with the selected analytical expression, determines the peculiar shape of some of the curves, with a very steep initial branch followed by an almost horizontal branch for increasing values of ground motion. Explanations and possible solutions are discussed.
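A toy version of fitting a lognormal fragility curve to damage-probability points. The data points here are synthetic, and the brute-force grid search stands in for the advanced nonlinear regression (with bootstrap weighting) that the abstract describes:

```python
import numpy as np
from math import erf, sqrt

def phi(x):
    """Standard normal CDF (scalar)."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def fragility(pga, theta, beta):
    """Lognormal fragility curve: P(damage state reached | PGA)."""
    return np.array([phi(np.log(a / theta) / beta) for a in pga])

# Hypothetical damage-probability points (invented, not the Italian data):
pga = np.array([0.05, 0.10, 0.20, 0.30, 0.50, 0.80])   # PGA levels, g
theta_true, beta_true = 0.30, 0.60                      # median, log-std
p_obs = fragility(pga, theta_true, beta_true)

# Brute-force least-squares fit over a (theta, beta) grid.
thetas = np.linspace(0.05, 1.00, 191)
betas = np.linspace(0.10, 1.50, 141)
best_theta, best_beta, best_sse = None, None, np.inf
for th in thetas:
    for be in betas:
        sse = ((fragility(pga, th, be) - p_obs) ** 2).sum()
        if sse < best_sse:
            best_theta, best_beta, best_sse = th, be, sse
```

With real damage probability matrices the points are scattered and unevenly weighted, which is where the nonlinear regression and bootstrap reliability weighting come in.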

  12. Time Lapse Storey Building Early Monitoring Based on Rapid Seismic Response Analysis in Indonesia

    NASA Astrophysics Data System (ADS)

    Julius, A. M.

    2015-12-01

Within the last decade, advances in the acquisition, processing and transmission of data from seismic monitoring have contributed to growth in the number of structures instrumented with such systems. An equally important factor in this growth is the demand by stakeholders for rapid answers to important questions about the functionality or state of "health" of structures during and immediately after a seismic event. Consequently, this study aims to monitor a storey building through its seismic response, i.e. earthquake and tremor analysis, over short time lapses using accelerograph data. The study used a storey building (X) in Jakarta that experienced the effects of the Kebumen earthquake of January 25th, 2014, the Pandeglang earthquake of July 9th, 2014, and the Lebak earthquake of November 8th, 2014. The tremors used in this study are those recorded after these three earthquakes. Data processing was used to determine peak ground acceleration (PGA), peak ground velocity (PGV), peak ground displacement (PGD), spectral acceleration (SA), spectral velocity (SV), spectral displacement (SD), the A/V ratio, the acceleration amplification and the effective duration (te). The natural frequency (f0) and the peak of the H/V ratio were then determined using the H/V ratio method. Processing of the earthquake data shows that the values of peak ground motion, response spectra, A/V ratio and acceleration amplification increase with height, while the effective duration decreases. Processing of the tremor data one month after each earthquake shows that the natural frequency of the building remains constant. The increase of peak ground motion, response spectra, A/V ratio and acceleration amplification, and the decrease of effective duration, with increasing floor level show that the building construction amplifies the shaking and is strongly influenced by local site effects. The constant value of the building's natural frequency shows that the building is still performing well. This
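The peak ground motion parameters listed above can be illustrated on a synthetic accelerogram. The damped-sine record and the crude cumulative-sum integration (no baseline correction, which real processing requires) are simplifying assumptions:

```python
import numpy as np

# Peak ground motion parameters from a synthetic accelerogram: PGA, PGV,
# PGD and the A/V ratio. The record below is an invented damped sine.
dt = 0.01                                   # sample interval, s
t = np.arange(0.0, 10.0, dt)
acc = 0.3 * 9.81 * np.sin(2 * np.pi * 1.5 * t) * np.exp(-0.3 * t)  # m/s^2

vel = np.cumsum(acc) * dt                   # crude integration, m/s
disp = np.cumsum(vel) * dt                  # crude double integration, m

pga = np.abs(acc).max()
pgv = np.abs(vel).max()
pgd = np.abs(disp).max()
av_ratio = pga / pgv                        # A/V ratio, 1/s
```

Spectral quantities (SA, SV, SD) would additionally require solving a damped single-degree-of-freedom oscillator for each period, which is omitted here.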

  13. A Comparative Study on Seismic Analysis of Bangladesh National Building Code (BNBC) with Other Building Codes

    NASA Astrophysics Data System (ADS)

    Bari, Md. S.; Das, T.

    2013-09-01

The tectonic framework of Bangladesh and adjoining areas indicates that Bangladesh lies well within an active seismic zone. The after-effects of an earthquake are more severe in a less developed and densely populated country like Bangladesh than in developed countries. The Bangladesh National Building Code (BNBC) was first established in 1993 to provide guidelines for the design and construction of new structures subjected to earthquake ground motions in order to minimize the risk to life for all structures. A revision of BNBC 1993 is under way to bring it up to date with other international building codes. This paper compares the various provisions for seismic analysis given in the building codes of different countries, giving an idea of where Bangladesh stands with respect to safety against earthquakes. First, the various seismic parameters in BNBC 2010 (draft) have been studied and compared with those of BNBC 1993. Then, both the 1993 and 2010 editions of the BNBC have been compared graphically with the building codes of other countries, namely the National Building Code of India 2005 (NBC-India 2005) and the American Society of Civil Engineers standard ASCE 7-05. The base shear/weight ratios have been plotted against the height of the building. The investigation reveals that BNBC 1993 has the lowest base shear among all the codes. Factored base shear values of BNBC 2010 (draft) are significantly higher than those of BNBC 1993 for low-rise buildings (≤20 m). Despite the revision, BNBC 2010 (draft) still suggests lower base shear values than the Indian and American codes. The increased factor of safety against earthquakes provided by the higher base shear values of the proposed BNBC 2010 code is therefore a welcome development.
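The base shear/weight ratio being compared can be sketched with a generic equivalent-static formula; the coefficient form and all numbers below are illustrative stand-ins, not the actual BNBC, NBC-India 2005 or ASCE 7-05 provisions:

```python
# Generic equivalent-static base shear: V = (Z * I * C / R) * W, so the
# plotted quantity V/W reduces to the code coefficients. All parameter
# values below are invented for illustration.

def base_shear_ratio(Z, I, C, R):
    """Base shear / seismic weight ratio for a generic static-force code."""
    return Z * I * C / R

# The same hypothetical building evaluated under two "codes" with
# different zone coefficients:
ratio_code_a = base_shear_ratio(Z=0.15, I=1.0, C=2.5, R=5.0)   # older code
ratio_code_b = base_shear_ratio(Z=0.28, I=1.0, C=2.5, R=5.0)   # revised code
```

Plotting such ratios against building height (with C made period-dependent, and hence height-dependent) reproduces the kind of comparison curves the paper describes.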

  14. Broadband analysis of landslides seismic signal : example of the Oso-Steelhead landslide and other recent events

    NASA Astrophysics Data System (ADS)

    Hibert, C.; Stark, C. P.; Ekstrom, G.

    2014-12-01

Landslide failures on the scale of mountains are spectacular, dangerous, and spontaneous, making direct observations hard to obtain. Measurement of their dynamic properties during runout is a high research priority, but a logistical and technical challenge. Seismology has begun to help in several important ways. Taking advantage of broadband seismic stations, recent advances now allow: (i) the seismic detection and location of large landslides in near-real-time, even for events in very remote areas that may have remained undetected, such as the 2014 Mt La Perouse supraglacial failure in Alaska; (ii) inversion of long-period waves generated by large landslides to yield an estimate of the forces imparted by the bulk accelerating mass; (iii) inference of the landslide mass, its center-of-mass velocity over time, and its trajectory. Key questions persist, such as: What can the short-period seismic data tell us about the high-frequency impacts taking place within the granular flow and along its boundaries with the underlying bedrock? And how does this seismicity relate to the bulk acceleration of the landslide and the long-period seismicity generated by it? Our recent work on the joint analysis of short- and long-period seismic signals generated by past and recent events, such as the Bingham Canyon Mine and Oso-Steelhead landslides, provides new insights into these issues. Qualitative comparison between short-period signal features and kinematic parameters inferred from long-period surface wave inversion helps to refine interpretation of the source dynamics and to understand the different mechanisms generating the short-period wave radiation. Our new results also suggest that quantitative relationships can be derived from this joint analysis, in particular between the short-period seismic signal envelope and the inferred momentum of the center of mass. In the future, these quantitative relationships may help to constrain and calibrate parameters used in

  15. The Effect of Analysis Methods on the Response of Steel-Braced Frame Buildings for Seismic Retrofitting

    NASA Astrophysics Data System (ADS)

    Ghodrati Amiri, G.; Hamidi Jamnani, H.; Mohebi, B.

In this study, steel-braced frame buildings designed according to the 2800 Standard of Iran (3rd revision) are evaluated by the four main types of structural analysis (linear static, linear dynamic, nonlinear static and nonlinear dynamic) with regard to the Seismic Rehabilitation Code for Existing Buildings in Iran (based on FEMA 273). The discrepancies among the results derived from these four types of analysis, and the seismic performance of the buildings in both linear and nonlinear treatments, are analyzed. First, a Probabilistic Seismic Hazard Analysis (PSHA) for 2 hazard levels was carried out for the center of Tehran; then three 3D models of common buildings (5, 10 and 15-story) were selected and designed for earthquake loading according to the 2800 Standard. These three models were then analyzed and checked against the Seismic Rehabilitation Code for Existing Buildings. The rehabilitation goal selected for this research is Fair (controlling Life Safety at Hazard Level 1 plus Collapse Prevention at Hazard Level 2). According to the results of this research, the accuracy of linear analysis for evaluating bracing elements is very low; for evaluating columns, the results of linear static analysis are much more acceptable than those of linear dynamic and nonlinear static analysis. Also, the accuracy of nonlinear static analysis decreases as the number of stories increases.

  16. A Bayesian method to mine spatial data sets to evaluate the vulnerability of human beings to catastrophic risk.

    PubMed

    Li, Lianfa; Wang, Jinfeng; Leung, Hareton; Zhao, Sisi

    2012-06-01

Vulnerability of human beings exposed to a catastrophic disaster is affected by multiple factors that include hazard intensity, environment, and individual characteristics. The traditional approach to vulnerability assessment, based on the aggregate-area method and unsupervised learning, cannot incorporate spatial information; thus, vulnerability can only be roughly assessed. In this article, we propose Bayesian network (BN) and spatial analysis techniques to mine spatial data sets to evaluate the vulnerability of human beings. In our approach, spatial analysis is leveraged to preprocess the data; for example, kernel density analysis (KDA) and accumulative road cost surface modeling (ARCSM) are employed to quantify the influence of geofeatures on vulnerability and relate such influence to spatial distance. The knowledge- and data-based BN provides a consistent platform to integrate a variety of factors, including those extracted by KDA and ARCSM, to model vulnerability uncertainty. We also consider the model's uncertainty, using the Bayesian model average and Occam's window to average the multiple models obtained by our approach for robust prediction of risk and vulnerability. We compare our approach with other probabilistic models in a case study of seismic risk and conclude that our approach is a good means of mining spatial data sets to evaluate vulnerability.
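Of the spatial preprocessing steps named above, kernel density analysis is the easiest to sketch. This 1-D Gaussian KDE with invented sample points is a minimal illustration of the idea, not the authors' KDA/ARCSM/BN pipeline:

```python
import numpy as np

# Minimal 1-D Gaussian kernel density analysis (KDA) sketch: estimate how
# "exposure density" varies along a line of invented dwelling locations.
rng = np.random.default_rng(7)
pts = rng.normal(loc=5.0, scale=1.5, size=400)   # hypothetical positions, km

def gaussian_kde(x, data, h):
    """Gaussian kernel density estimate at points x with bandwidth h."""
    u = (x[:, None] - data[None, :]) / h
    return np.exp(-0.5 * u ** 2).sum(axis=1) / (len(data) * h * np.sqrt(2 * np.pi))

grid = np.linspace(0.0, 10.0, 1001)
dens = gaussian_kde(grid, pts, h=0.5)
# The density should integrate to ~1 over a grid wide enough to cover the data:
mass = dens.sum() * (grid[1] - grid[0])
```

In the 2-D vulnerability setting, the same kernel sum runs over map coordinates, and the resulting density surface becomes one input node of the Bayesian network.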

  17. Analysis of the seismic wavefield in the Moesian Platform (Bucharest area)

    NASA Astrophysics Data System (ADS)

Manea, Elena-Florinela; Hobiger, Manuel-Thomas; Michel, Clotaire; Fäh, Donat; Cioflan, Carmen-Ortanza

    2016-04-01

Bucharest is located in the center of the Moesian platform, in a large and deep sedimentary basin (450 km long, 300 km wide and in some places up to 20 km deep). During large earthquakes generated by the Vrancea seismic zone, located approximately 140 km to the north, the ground motion recorded in the Bucharest area is characterized by predominant long periods and large amplification. This phenomenon has been explained by the influence of both the source mechanism (azimuth and type of incident waves) and the mechanical properties of the local structure (geological layering and geometry). The main goal of our study is to better characterize and understand the seismic wave field produced by earthquakes in the area of Bucharest. We want to identify the contribution to the ground motion of different seismic surface waves, such as those produced at the edges of the large sedimentary basin or multipath interference waves (Airy phases of Love and Rayleigh waves). The data from a 35 km diameter array (URS experiment), installed by the National Institute for Earth Physics during 10 months in 2003 and 2004 in the urban area of Bucharest and adjacent zones, were used. To characterize the wave field recorded by the URS array, the MUSIQUE technique was used. This technique combines the classical MUSIC and quaternion-MUSIC algorithms and analyzes the three-component signals of all sensors of a seismic array together, in order to derive the Love and Rayleigh wave dispersion curves as well as the Rayleigh wave ellipticity curve. The analysis includes 20 regional earthquakes with Mw > 3 and 5 teleseismic events with Mw > 7 that have enough energy at low frequency (0.1-1 Hz), i.e. in the resolution range of the array. For all events, the greatest energy comes from the backazimuth of the source and the wave field is dominated by Love waves.
The results of the array analyses clearly indicate a significant scattering corresponding to 2D or 3D effects in the

  18. Bayesian uncertainty analysis for advanced seismic imaging - Application to the Mentelle Basin, Australia

    NASA Astrophysics Data System (ADS)

    Michelioudakis, Dimitrios G.; Hobbs, Richard W.; Caiado, Camila C. S.

    2016-04-01

multivariate posterior distribution. The novelty of our approach, and the major difference compared to the traditional semblance-spectrum velocity analysis procedure, is the calculation of the uncertainty of the output model. As the model is able to estimate the credibility intervals of the corresponding interval velocities, we can produce the most probable PSDM images in an iterative manner. The depths extracted using our statistical algorithm are in very good agreement with the key horizons retrieved from the drilled core DSDP-258, showing that the Bayesian model is able to control the depth migration of the seismic data and to estimate the uncertainty at the drilling targets.

  19. Preliminary Analysis of the CASES GPS Receiver Performance during Simulated Seismic Displacements

    NASA Astrophysics Data System (ADS)

    De la Rosa-Perkins, A.; Reynolds, A.; Crowley, G.; Azeem, I.

    2014-12-01

We explore the ability of a new GPS software receiver, called CASES (Connected Autonomous Space Environment Sensor), to measure seismic displacements in real time. Improvements in GPS technology over the last 20 years allow for precise measurement of ground motion during seismic events. For example, GPS data have been used to calculate displacement histories at an earthquake's epicenter and fault slip estimates with great accuracy. This is supported by the ability to measure displacements directly using GPS, bypassing the double integration that accelerometers require, and by higher clipping limits than seismometers. The CASES receiver, developed by ASTRA in collaboration with Cornell University and the University of Texas at Austin, represents a new geodetic-quality software-based GPS receiver that measures ionospheric space weather in addition to the usual navigation solution. To demonstrate, in a controlled environment, the ability of the CASES receiver to measure seismic displacements, we simulated ground motions similar to those generated during earthquakes, using a shake box instrumented with an accelerometer and a GPS antenna. The accelerometer measured the box's actual displacement. The box moved on a manually controlled axis that underwent varied one-dimensional motions (from mm to cm) at different frequencies and amplitudes. The CASES receiver was configured to optimize the accuracy of the position solution. We quantified the CASES GPS receiver performance by comparing the GPS solutions against the accelerometer data using various statistical analysis methods. The results of these tests will be presented. The CASES receiver is designed with multiple methods of accessing the data in real time, including internet connection, Bluetooth, cell-phone modem and Iridium modem. 
Because the CASES receiver measures ionospheric space weather in addition to the usual navigation solution, CASES provides not only the seismic signal, but also the ionospheric space weather
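The point above about bypassing double integration can be made concrete: a small constant accelerometer bias grows quadratically when double-integrated to displacement, whereas GPS observes displacement directly. All numbers below are invented:

```python
import numpy as np

# A 5 cm, 0.2 Hz "ground motion" and its exact acceleration, plus a tiny
# constant sensor bias. Double integration turns the bias into a drift
# that grows like 0.5 * bias * t^2 and soon dwarfs the true signal.
dt = 0.01
t = np.arange(0.0, 60.0, dt)
disp_true = 0.05 * np.sin(2 * np.pi * 0.2 * t)              # m
acc_true = -0.05 * (2 * np.pi * 0.2) ** 2 * np.sin(2 * np.pi * 0.2 * t)

bias = 1e-4                                                 # m/s^2 sensor bias
acc_meas = acc_true + bias
vel = np.cumsum(acc_meas) * dt
disp_rec = np.cumsum(vel) * dt                              # double integration

drift = disp_rec - disp_true
# After 60 s the bias alone contributes roughly 0.5 * bias * t^2 = 0.18 m,
# versus a 0.05 m true displacement amplitude.
```

A direct displacement sensor (the GPS solution in this experiment) has no such accumulating term, which is one reason GPS complements accelerometers for large seismic displacements.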

  20. Thermal Analysis of the Vulnerability of the Spacesuit Battery Design to Short-Circuit Conditions (Presentation)

    SciTech Connect

    Kim, G. H.; Chaney, L.; Smith, K.; Pesaran, A.; Darcy, E.

    2010-04-22

    NREL researchers created a mathematical model of a full 16p-5s spacesuit battery for NASA that captures electrical/thermal behavior during shorts to assess the vulnerability of the battery to pack-internal (cell-external) shorts. They found that relocating the short from battery pack-external (experimental validation) to pack-internal (modeling study) causes substantial additional heating of cells, which can lead to cell thermal runaway. All three layers of the bank-to-bank separator must fail for the pack-internal short scenario to occur. This finding emphasizes the imperative of battery pack assembly cleanliness. The design is tolerant to pack-internal shorts when stored at 0% state of charge.

  1. Cyber Security for the Spaceport Command and Control System: Vulnerability Management and Compliance Analysis

    NASA Technical Reports Server (NTRS)

    Gunawan, Ryan A.

    2016-01-01

With the rapid development of the Internet, the number of malicious threats to organizations is continually increasing. In June of 2015, the United States Office of Personnel Management (OPM) had a data breach resulting in the compromise of millions of government employee records. The National Aeronautics and Space Administration (NASA) is not exempt from these attacks. Cyber security is becoming a critical facet of the discussion of moving projects forward. The Spaceport Command and Control System (SCCS) project at the Kennedy Space Center (KSC) aims to develop the launch control system for the next-generation launch vehicle in the coming decades. There are many ways to increase the security of the network it uses, from vulnerability management to ensuring that operating system images are compliant with securely configured baselines recommended by the United States Government.

  2. Vulnerability analysis of a pressurized aluminum composite vessel against hypervelocity impacts

    NASA Astrophysics Data System (ADS)

    Hereil, Pierre-Louis; Plassard, Fabien; Mespoulet, Jérôme

    2015-09-01

The vulnerability of high-pressure vessels subjected to high-velocity impacts of space debris is analyzed through the response of pressurized vessels to the hypervelocity impact of an aluminum sphere. The investigated tanks are CFRP (carbon fiber reinforced plastic) overwrapped aluminum vessels. The explored internal nitrogen pressures range from 1 bar to 300 bar, and impact velocities are around 4400 m/s. Data obtained from X-ray radiographs and particle velocity measurements show the evolution of the debris cloud and the shock wave propagation in the pressurized nitrogen. Observation of the recovered vessels reveals the damage pattern and its evolution as a function of the internal pressure. It is shown that the rupture mode is not a bursting mode but rather catastrophic damage of the external carbon composite part of the vessel.

  3. RESILIENCE IN VULNERABLE POPULATIONS WITH TYPE 2 DIABETES MELLITUS AND HYPERTENSION: A SYSTEMATIC REVIEW AND META-ANALYSIS

    PubMed Central

    Pesantes, M. Amalia; Lazo-Porras, María; Abu Dabrh, Abd Moain; Avila-Ramirez, Jaime R.; Caycho, Maria; Villamonte, Georgina Y.; Sanchez-Perez, Grecia P.; Málaga, Germán; Bernabé-Ortiz, Antonio; Miranda, J. Jaime

    2015-01-01

    Background Patients with chronic conditions and limited access to healthcare experience stressful challenges due to the burden of managing both their conditions and their daily life demands. Resilience provides a mechanism of adapting to stressful experiences. We conducted a systematic review and meta-analysis to synthesize the evidence about interventions to enhance resiliency in managing hypertension or type-2 diabetes in vulnerable populations, and to assess the efficacy of these interventions on clinical outcomes. Methods We searched multiple databases from early inception through February 2015 including randomized controlled trials that enrolled patients with type-2 diabetes or hypertension. All interventions that targeted resilience in vulnerable populations were included. Data were synthesized to describe the characteristics and efficacy of resilience interventions. We pooled the total effects by calculating standardized mean difference using the random-effects model. Results The final search yielded seventeen studies. All studies were conducted in the United States and generally targeted minority participants. Resiliency interventions used diverse strategies; discussion groups or workshops were the most common approach. Conclusions Interventions aimed at enhancing the resiliency of patients from vulnerable groups are diverse. Outcomes were not fully conclusive. There was some evidence that resilience interventions had a positive effect on HbA1c levels, but not blood pressure. The incorporation of resiliency-oriented interventions into the arsenal of prevention and management of chronic conditions appears to be an opportunity that remains to be better investigated and exploited, and there is need to pursue further understanding of the core components of any intervention that claims to enhance resilience. PMID:26239007
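The random-effects pooling of standardized mean differences mentioned in the Methods is conventionally done with the DerSimonian-Laird estimator. A sketch on invented per-study effect sizes (not the review's data):

```python
import numpy as np

# DerSimonian-Laird random-effects pooling of standardized mean differences.
# The per-study SMDs and variances below are invented for illustration.
smd = np.array([0.60, 0.10, 0.80, -0.10, 0.40])
var = np.array([0.02, 0.03, 0.05, 0.04, 0.06])

w_fixed = 1.0 / var
mean_fixed = (w_fixed * smd).sum() / w_fixed.sum()
q = (w_fixed * (smd - mean_fixed) ** 2).sum()        # Cochran's Q
df = len(smd) - 1
c = w_fixed.sum() - (w_fixed ** 2).sum() / w_fixed.sum()
tau2 = max(0.0, (q - df) / c)                        # between-study variance

w_re = 1.0 / (var + tau2)                            # random-effects weights
pooled = (w_re * smd).sum() / w_re.sum()
se = np.sqrt(1.0 / w_re.sum())
ci = (pooled - 1.96 * se, pooled + 1.96 * se)        # 95% confidence interval
```

When the between-study variance estimate is zero, the random-effects result collapses to the fixed-effect pooled estimate.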

  4. Analysis of post-blasting source mechanisms of mining-induced seismic events in Rudna copper mine, Poland.

    NASA Astrophysics Data System (ADS)

    Caputa, Alicja; Rudzinski, Lukasz; Talaga, Adam

    2016-04-01

Copper ore exploitation in the Lower Silesian Copper District, Poland (LSCD), is connected with many specific hazards. The most hazardous are induced seismicity and the rockbursts that follow strong mining seismic events. One of the most effective methods of reducing seismic activity is blasting in potentially hazardous mining panels: small to moderate tremors are provoked and stress accumulation is substantially reduced. This work presents an analysis of post-blasting events using full moment tensor (MT) inversion at the Rudna mine, Poland, based on a signal dataset recorded by an underground seismic network. We show that focal mechanisms for events that occurred after blasts exhibit common features in the MT solution. The strong isotropic and small double-couple (DC) components of the MT indicate that these events were provoked by the detonations. On the other hand, the post-blasting MT is considerably different from the MT obtained for common strong mining events. We believe that seismological analysis of provoked and unprovoked events can be a very useful tool for confirming the effectiveness of blasting for seismic hazard reduction in mining areas.
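The isotropic-versus-double-couple reasoning above rests on decomposing the moment tensor. A minimal sketch with an invented tensor and one simple isotropic-percentage convention (several conventions are in use):

```python
import numpy as np

# Split a (hypothetical) full moment tensor into isotropic and deviatoric
# parts, the quantities used above to separate blast-like (strongly
# isotropic) sources from ordinary shear (double-couple) events.
M = np.array([[2.0, 0.3, 0.1],
              [0.3, 1.8, 0.2],
              [0.1, 0.2, 2.2]])            # invented MT, arbitrary units

m_iso = np.trace(M) / 3.0
M_iso = m_iso * np.eye(3)
M_dev = M - M_iso                          # trace-free deviatoric part

# One simple isotropic-fraction convention based on the largest deviatoric
# eigenvalue (a blast-like source gives a value near 1):
eig_dev = np.linalg.eigvalsh(M_dev)
p_iso = abs(m_iso) / (abs(m_iso) + np.abs(eig_dev).max())
```

A full analysis would further split the deviatoric part into double-couple and CLVD components from the ordered eigenvalues.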

  5. Nonlinear seismic analysis of a reactor structure impact between core components

    NASA Technical Reports Server (NTRS)

    Hill, R. G.

    1975-01-01

The seismic analysis of the FFTF-PIOTA (Fast Flux Test Facility-Postirradiation Open Test Assembly), subjected to a horizontal DBE (Design Base Earthquake), is presented. The PIOTA is the first in a set of open test assemblies to be designed for the FFTF. Employing the direct method of transient analysis, the governing differential equations describing the motion of the system are set up directly and are implicitly integrated numerically in time. A simple lumped-mass beam model of the FFTF which includes small clearances between core components is used as a "driver" for a fine mesh model of the PIOTA. The nonlinear forces due to the impact of the core components and their effect on the PIOTA are computed.
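The direct transient approach described (equations of motion integrated implicitly in time, with an impact nonlinearity from small clearances) can be illustrated on a single-degree-of-freedom toy model. The sketch below uses Newmark average-acceleration integration with the one-sided gap-spring force lagged one predictor step so each step stays linear; all parameters are illustrative, not the FFTF-PIOTA model:

```python
import numpy as np

def newmark_sdof_gap(m, c, k, k_gap, gap, ag, dt, beta=0.25, gamma=0.5):
    """Base-excited SDOF oscillator with a one-sided contact spring.

    m, c, k  : mass, damping, structural stiffness
    k_gap    : contact-spring stiffness engaging once displacement exceeds `gap`
    ag       : base acceleration time series; dt : time step
    Solves m*a + c*v + k*u + f_gap(u) = -m*ag with Newmark kinematics.
    """
    n = len(ag)
    u, v, a = np.zeros(n), np.zeros(n), np.zeros(n)
    a[0] = (-m * ag[0] - c * v[0] - k * u[0]) / m
    lhs = m + gamma * dt * c + beta * dt * dt * k
    for i in range(n - 1):
        # Newmark predictors
        u_star = u[i] + dt * v[i] + (0.5 - beta) * dt * dt * a[i]
        v_star = v[i] + (1.0 - gamma) * dt * a[i]
        # Contact force evaluated at the predictor (lagged, keeps step linear)
        f_gap = k_gap * (u_star - gap) if u_star > gap else 0.0
        a[i + 1] = (-m * ag[i + 1] - f_gap - c * v_star - k * u_star) / lhs
        u[i + 1] = u_star + beta * dt * dt * a[i + 1]
        v[i + 1] = v_star + gamma * dt * a[i + 1]
    return u, v, a
```

Closing the gap stiffens the response and caps the excursion, which is the qualitative effect of core-component impact on the driver model.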

  6. Nonlinear study of seismicity in the Mexican subduction zone by means of visual recurrence analysis

    NASA Astrophysics Data System (ADS)

    Ramirez Rojas, A.; Moreno-Torres, R. L.

    2012-12-01

The subduction in the Mexican South Pacific coast might be approximated as a subhorizontal slab bounded at the edges by the steep subduction geometry of the Cocos plate beneath the Caribbean plate to the east and of the Rivera plate beneath North America to the west. Singh et al. (1983) reported a study that takes into account the geometry of the subducted Rivera and Cocos plates beneath the North American lithosphere, defining four regions according to their geometry: Jalisco, Michoacán, Guerrero and Oaxaca. In this work we study the seismicity that occurred in Mexico, for each region, by means of visual recurrence analysis (VRA). Our analysis shows important differences between the regions that could be associated with the nonlinear dynamical properties of each region. Singh, S.K., M. Rodriguez, and L. Esteva (1983), Statistics of small earthquakes and frequency of occurrence of large earthquakes along the Mexican subduction zone, Bull. Seismol. Soc. Am. 73, 6A, 1779-1796.

  7. Nonlinear analysis of the dynamics in the Mexican Pacific seismic region by using visual recurrence plots.

    NASA Astrophysics Data System (ADS)

    Ramírez-Rojas, Alejandro; Moreno-Torres, Lucía; Flores-Márquez, Elsa

    2013-04-01

The subduction in the Mexican South Pacific coast might be approximated as a subhorizontal slab bounded at the edges by the steep subduction geometry of the Cocos plate beneath the Caribbean plate to the east and of the Rivera plate beneath North America to the west. A previously reported study takes into account the geometry of the subducted Rivera and Cocos plates beneath the North American lithosphere, defining four regions according to their geometry: Jalisco, Michoacán, Guerrero and Oaxaca. By means of visual recurrence analysis (VRA), in this work we study some dynamical features of the seismicity that occurred in each region. Our analysis shows interesting differences among the recurrence plots of the regions, indicating a possible correlation between the subduction geometry and the nonlinear dynamical properties of each region.
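Visual recurrence analysis rests on the recurrence plot: after time-delay embedding of a scalar series, a binary matrix marks which pairs of reconstructed states fall closer than a threshold. A minimal sketch follows; the embedding dimension, delay, and 10%-of-maximum-distance threshold are generic illustrative choices, not those of these studies:

```python
import numpy as np

def recurrence_matrix(x, dim=3, delay=1, eps=None):
    """Binary recurrence plot R[i, j] = 1 iff embedded states i, j are within eps."""
    x = np.asarray(x, dtype=float)
    n = len(x) - (dim - 1) * delay
    # Time-delay embedding: rows are reconstructed state vectors
    emb = np.column_stack([x[i * delay:i * delay + n] for i in range(dim)])
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    if eps is None:                       # common heuristic: 10% of max distance
        eps = 0.1 * d.max()
    return (d <= eps).astype(int)

def recurrence_rate(r):
    """Fraction of recurrent points, a basic quantitative RP measure."""
    return r.mean()
```

Periodic dynamics produce diagonal line structures in the plot, while irregular seismicity produces sparser, more broken textures; such differences between regional plots are what the abstracts compare.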

  8. Facility Environmental Vulnerability Assessment

    SciTech Connect

    Van Hoesen, S.D.

    2001-07-09

    From mid-April through the end of June 2001, a Facility Environmental Vulnerability Assessment (FEVA) was performed at Oak Ridge National Laboratory (ORNL). The primary goal of this FEVA was to establish an environmental vulnerability baseline at ORNL that could be used to support the Laboratory planning process and place environmental vulnerabilities in perspective. The information developed during the FEVA was intended to provide the basis for management to initiate immediate, near-term, and long-term actions to respond to the identified vulnerabilities. It was expected that further evaluation of the vulnerabilities identified during the FEVA could be carried out to support a more quantitative characterization of the sources, evaluation of contaminant pathways, and definition of risks. The FEVA was modeled after the Battelle-supported response to the problems identified at the High Flux Beam Reactor at Brookhaven National Laboratory. This FEVA report satisfies Corrective Action 3A1 contained in the Corrective Action Plan in Response to Independent Review of the High Flux Isotope Reactor Tritium Leak at the Oak Ridge National Laboratory, submitted to the Department of Energy (DOE) ORNL Site Office Manager on April 16, 2001. This assessment successfully achieved its primary goal as defined by Laboratory management. The assessment team was able to develop information about sources and pathway analyses although the following factors impacted the team's ability to provide additional quantitative information: the complexity and scope of the facilities, infrastructure, and programs; the significantly degraded physical condition of the facilities and infrastructure; the large number of known environmental vulnerabilities; the scope of legacy contamination issues [not currently addressed in the Environmental Management (EM) Program]; the lack of facility process and environmental pathway analysis performed by the accountable line management or facility owner; and poor

  9. Seismic assessment of buried pipelines

    SciTech Connect

    Al-Chaar, G.; Brady, P.; Fernandez, G.

    1995-12-31

A structure and its lifelines are closely linked because the disruption of lifeline systems will obstruct emergency service functions that are vitally needed after an earthquake. As an example of the criticality of these systems, the Association of Bay Area Governments (ABAG) recorded thousands of leaks in pipelines that resulted in more than twenty million gallons of hazardous materials being released in several recorded earthquakes. The cost of cleaning the spills from these materials was very high. This information supports the development of seismic protection of lifeline systems. The US Army Corps of Engineers Construction Engineering Research Laboratories (USACERL) has, among its missions, the responsibility to develop seismic vulnerability assessment procedures for military installations. Within this mission, a preliminary research program to assess the seismic vulnerability of buried pipeline systems on military installations was initiated. Phase 1 of this research project resulted in two major studies. In the first, evaluating current procedures to seismically design or evaluate existing lifeline systems, the authors found several significant aspects that deserve special consideration and need to be addressed in future research. The second was focused on identifying parameters related to buried pipeline system vulnerability and developing a generalized analytical method to relate these parameters to the seismic vulnerability assessment of existing pipeline systems.

  10. Seismic anisotropy of northeastern Algeria from shear-wave splitting analysis

    NASA Astrophysics Data System (ADS)

    Radi, Zohir; Yelles-Chaouche, Abdelkrim; Bokelmann, Götz

    2015-11-01

There are few studies of internal deformation under northern Africa; here we present such a study. We analyze teleseismic shear-wave splitting for northeast Algeria to improve our knowledge of lithospheric and asthenospheric deformation mechanisms in this region. We study waveform data generated by tens of teleseismic events recorded at five recently installed broadband (BB) stations in Algeria. These stations cover an area 2° across, extending from the Tellian geological units in the North to the Saharan Atlas units in the South. Analysis of SKS-wave splitting reveals significant spatial variations in fast polarization orientation over a scale length of at most 100 km. The seismic anisotropy shows three clear spatial patterns. A general ENE-WSW orientation is observed under the stations in the north. This polarization orientation follows the direction of the Tell Atlas mountain chain, which is perpendicular to the convergence direction between Africa and Eurasia. Delay times vary significantly across the region, between 0.6 and 2.0 s. At several stations there is an indication of a WNW-ESE polarization orientation, which is apparently related to a later geodynamic evolutionary phase in this region. A third pattern of seismic anisotropy emerges in the South, with an orientation of roughly N-S. We discuss these observations in light of geodynamic models and present-day geodetic motion.
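One common way to measure the fast polarization orientation and delay time reported here is a grid search in which the horizontal components are rotated into trial fast/slow frames and the lag maximizing their cross-correlation is sought (the rotation-correlation splitting method). A minimal sketch, with our own synthetic-data conventions rather than the authors' processing chain:

```python
import numpy as np

def splitting_rotation_correlation(north, east, dt, max_lag_s=2.0):
    """Rotation-correlation shear-wave splitting estimate.

    For each trial fast azimuth phi (degrees from north), rotate the horizontals
    into candidate fast/slow components and find the positive lag maximizing
    their normalized cross-correlation. Returns (phi_deg, delay_s, corr).
    """
    max_lag = int(max_lag_s / dt)
    best = (-1.0, None, None)                       # (corr, phi_deg, delay_s)
    for phi in range(0, 180, 2):
        a = np.radians(phi)
        fast = north * np.cos(a) + east * np.sin(a)
        slow = -north * np.sin(a) + east * np.cos(a)
        for lag in range(1, max_lag):
            f, s = fast[:-lag], slow[lag:]
            denom = np.linalg.norm(f) * np.linalg.norm(s)
            if denom == 0.0:
                continue
            corr = abs(np.dot(f, s)) / denom
            if corr > best[0]:
                best = (corr, phi, lag * dt)
    return best[1], best[2], best[0]
```

Production SKS analyses typically use the Silver-Chan minimum-transverse-energy criterion and error surfaces instead; the sketch just illustrates the two parameters (fast direction, delay time) the abstract maps.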

  11. Seismic joint analysis for non-destructive testing of asphalt and concrete slabs

    USGS Publications Warehouse

    Ryden, N.; Park, C.B.

    2005-01-01

A seismic approach is used to estimate the thickness and elastic stiffness constants of asphalt or concrete slabs. The overall concept of the approach utilizes the robustness of the multichannel seismic method. A multichannel-equivalent data set is compiled from multiple time series recorded from multiple hammer impacts at progressively different offsets from a fixed receiver. This multichannel simulation with one receiver (MSOR) replaces true multichannel recording in a cost-effective and convenient manner. A recorded data set is first processed to evaluate the shear wave velocity through a wave field transformation, normally used in the multichannel analysis of surface waves (MASW) method, followed by a Lamb-wave inversion. Then, the same data set is used to evaluate the compression wave velocity from a combined processing of first-arrival picking and a linear regression. Finally, the amplitude spectra of the time series are used to evaluate the thickness by following the concepts utilized in the Impact Echo (IE) method. Due to the powerful signal extraction capabilities ensured by the multichannel processing schemes used, the entire procedure for all three evaluations can be fully automated and results can be obtained directly in the field. A field data set is used to demonstrate the proposed approach.
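Two of the steps above are easy to sketch: the compression-wave velocity as the inverse slope of a line fit through first-arrival picks versus offset, and the Impact-Echo thickness from the dominant spectral peak via h ≈ β·Vp/(2·f_peak), with β ≈ 0.96 the usual plate shape factor. Illustrative only; the field procedure involves more careful picking and windowing:

```python
import numpy as np

def p_velocity_from_first_arrivals(offsets_m, picks_s):
    """Least-squares line through first-arrival picks: slope is 1/Vp."""
    slope, intercept = np.polyfit(offsets_m, picks_s, 1)
    return 1.0 / slope

def thickness_impact_echo(signal, dt, vp, beta=0.96):
    """Impact-Echo thickness h = beta * Vp / (2 * f_peak) from the spectral peak."""
    spec = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), dt)
    f_peak = freqs[np.argmax(spec[1:]) + 1]       # skip the DC bin
    return beta * vp / (2.0 * f_peak)
```

Because both evaluations reduce to a regression and an FFT peak pick, they are straightforward to automate in the field, as the abstract notes.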

  12. Analysis of the seismicity activity of the volcano Ceboruco, Nayarit, Mexico

    NASA Astrophysics Data System (ADS)

    Rodriguez-Ayala, N. A.; Nunez-Cornu, F. J.; Escudero, C. R.; Zamora-Camacho, A.; Gomez, A.

    2014-12-01

Ceboruco is an active stratovolcano located in the state of Nayarit, Mexico (104°30'31.25"W, 21°7'28.35"N, 2,280 m a.s.l.), forming part of the Trans-Mexican Volcanic Belt. Nelson (1986) reports that it has been active during the last 1,000 years, averaging eruptions every 125 years or so, having last erupted in 1870; it currently shows fumarolic activity. In the past 20 years there has been an increase in the population and socio-economic activities around the volcano (Suárez Plascencia, 2013), which is why the study of Ceboruco has become a necessity in several respects. Recent investigations of its seismicity (Rodríguez Uribe et al., 2013) have classified Ceboruco earthquakes into four families considering their waveform and spectral features. The analysis presented here covers 57 days of seismicity from March to October 2012; in this period we located 97 events with clear P- and S-wave arrivals registered on at least three stations, with three components each, of the temporary seismic network on Ceboruco volcano.

  13. Damage detection and quantification in a structural model under seismic excitation using time-frequency analysis

    NASA Astrophysics Data System (ADS)

    Chan, Chun-Kai; Loh, Chin-Hsiung; Wu, Tzu-Hsiu

    2015-04-01

In civil engineering, health monitoring and damage detection are typically carried out using a large number of sensors, and most methods require global measurements to extract the properties of the structure. However, some sensors, like LVDTs, cannot be used due to in situ limitations, so the global deformation remains unknown. An experiment is used to demonstrate the proposed algorithms: a one-story, two-bay reinforced concrete frame under weak and strong seismic excitation. In this paper, signal processing techniques and nonlinear identification are applied to the measured seismic response of reinforced concrete structures subjected to different levels of earthquake excitation. Both modal-based and signal-based system identification and feature extraction techniques are used to study the nonlinear inelastic response of the RC frame, using either input and output response data or output-only measurements. The signal-based damage identification method includes an enhanced time-frequency analysis of the acceleration responses and the estimation of permanent deformation directly from acceleration response data. Finally, local deformation measurements from dense optical tracking are also used to quantify the damage of the RC frame structure.

  14. Seismic Hazard Analysis of Aizawl, India with a Focus on Water System Fragilities

    NASA Astrophysics Data System (ADS)

    Belair, G. M.; Tran, A. J.; Dreger, D. S.; Rodgers, J. E.

    2015-12-01

    GeoHazards International (GHI) has partnered with the University of California, Berkeley in a joint Civil Engineering and Earth Science summer internship program to investigate geologic hazards. This year the focus was on Aizawl, the capital of India's Mizoram state, situated on a ridge in the Burma Ranges. Nearby sources have the potential for large (M > 7) earthquakes that would be devastating to the approximately 300,000 people living in the city. Earthquake induced landslides also threaten the population as well as the city's lifelines. Fieldwork conducted in June 2015 identified hazards to vital water system components. The focus of this abstract is a review of the seismic hazards that affect Aizawl, with special attention paid to water system locations. To motivate action to reduce risk, GHI created an earthquake scenario describing effects of a M7 right-lateral strike-slip intraplate earthquake occurring 30 km below the city. We extended this analysis by exploring additional mapped faults as well as hypothetical blind reverse faults in terms of PGA, PGV, and PSA. Ground motions with hanging wall and directivity effects were also examined. Several attenuation relationships were used in order to assess the uncertainty in the ground motion parameters. Results were used to determine the likely seismic performance of water system components, and will be applied in future PSHA studies.

  15. Passive-performance, analysis, and upgrades of a 1-ton seismic attenuation system

    NASA Astrophysics Data System (ADS)

    Bergmann, G.; Mow-Lowry, C. M.; Adya, V. B.; Bertolini, A.; Hanke, M. M.; Kirchhoff, R.; Köhlenbeck, S. M.; Kühn, G.; Oppermann, P.; Wanner, A.; Westphal, T.; Wöhler, J.; Wu, D. S.; Lück, H.; Strain, K. A.; Danzmann, K.

    2017-03-01

    The 10 m prototype facility at the Albert–Einstein-institute (AEI) in Hanover, Germany, employs three large seismic attenuation systems to reduce mechanical motion. The AEI seismic-attenuation-system (AEI-SAS) uses mechanical anti-springs in order to achieve resonance frequencies below 0.5 Hz. This system provides passive isolation from ground motion by a factor of about 400 in the horizontal direction at 4 Hz and in the vertical direction at 9 Hz. The presented isolation performance is measured under vacuum conditions using a combination of commercial and custom-made inertial sensors. Detailed analysis of this performance led to the design and implementation of tuned dampers to mitigate the effect of the unavoidable higher order modes of the system. These dampers reduce RMS motion substantially in the frequency range between 10 and 100 Hz in 6 degrees of freedom. The results presented here demonstrate that the AEI-SAS provides substantial passive isolation at all the fundamental mirror-suspension resonances.
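The quoted isolation factors follow from the transmissibility of a soft spring-mass stage: above √2 times the resonance frequency, ground motion is attenuated roughly as 1/f². A generic single-stage sketch is below; the AEI-SAS is a far more complex multi-stage system, and the resonance frequency and damping values in the usage example are merely illustrative:

```python
import numpy as np

def transmissibility(f, f0, zeta):
    """Ground-to-payload motion transmissibility of a damped spring-mass isolator.

    |T(f)| = sqrt((1 + (2*zeta*r)^2) / ((1 - r^2)^2 + (2*zeta*r)^2)), r = f/f0.
    Below f0 motion passes through; above sqrt(2)*f0 the stage isolates.
    """
    r = np.asarray(f, dtype=float) / f0
    num = 1.0 + (2.0 * zeta * r) ** 2
    den = (1.0 - r ** 2) ** 2 + (2.0 * zeta * r) ** 2
    return np.sqrt(num / den)
```

This is why pushing the resonance below 0.5 Hz with mechanical anti-springs matters: the lower f0 sits, the larger the attenuation available at the few-Hz frequencies quoted in the abstract.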

  16. Flood Vulnerability Analysis of the part of Karad Region, Satara District, Maharashtra using Remote Sensing and Geographic Information System technique

    NASA Astrophysics Data System (ADS)

    Warghat, Sumedh R.; Das, Sandipan; Doad, Atul; Mali, Sagar; Moon, Vishal S.

    2012-07-01

Karad City is situated at the confluence of the rivers Krishna and Koyna and is a severely flood-prone area. Floodwaters enter the city through the roads and disrupt infrastructure throughout the city. Furthermore, due to negligence of the authorities and unplanned growth of the city, the natural flow of water has been constrained by unnecessary embankments in the river Koyna. For this reason the Koyna now flows in a narrow channel, which easily overflows even during minor flooding. Flood vulnerability analysis has been carried out for the Karad region of Satara District, Maharashtra, using remote sensing (RS) and geographic information system (GIS) techniques. The aim of this study is to identify flood vulnerability zones using GIS and RS techniques, and an attempt has been made to demonstrate the application of remote sensing and GIS to map flood-vulnerable areas utilizing ArcMap and Erdas software. The analysis was carried out with the following objectives: identify the flood-prone areas in the Koyna and Krishna river basins; calculate surface runoff and delineate flood-sensitive areas; delineate a classified hazard map; evaluate the flood-affected area; and prepare the flood vulnerability map utilizing remote sensing and GIS techniques (C.J. Kumanan; S.M. Ramasamy). The study is based on GIS, and spatial techniques are used for the analysis and understanding of the flood problem in Karad Tahsil. The flood-affected areas of different magnitudes have been identified and mapped using ArcGIS software. The analysis is useful for the local planning authority for identifying risk areas and taking proper decisions at the right moment. In the analysis, the causative factors for flooding in the watershed taken into account are annual rainfall, size of watershed, basin slope, drainage density of natural channels, and land use (Dinand Alkema; Farah Aziz.). This study of

  17. Vulnerability assessment at a national level in Georgia

    NASA Astrophysics Data System (ADS)

    Tsereteli, N.; Arabidze, V.; Varazanashvili, O.; Gugeshashvili, T.

    2012-04-01

The risk always exists when cities are built in natural hazard-prone zones. Population growth in cities and urbanization in such zones lead to infrastructure expansion. The goal of society is to construct infrastructure resistant to natural hazards and to minimize the expected losses. This is a complicated task, as there is always a knowledge deficiency regarding the real seismic hazard and vulnerability. Assessment of vulnerability is vital in risk analysis, and vulnerability is defined in many different ways. The work presented here mostly deals with the assessment of infrastructure and population vulnerability at the national level in Georgia. This work was initiated by the NATO SfP project "Seismic Hazard and Risk Assessment for Southern Caucasus - Eastern Turkey Energy Corridors" and the two work packages WP4 (seismic risk) and WP5 (city scenarios) of the risk module of the EMME (Earthquake Model of the Middle East Region) project. The first step was the creation of databases (inventories) of elements at risk in GIS. The elements at risk were buildings, population, and pipelines. The inventories were compiled in GIS for the following categories: building material, number of stories, number of entrances, condition of building, and construction period; for pipelines: pipe type (continuous or segmented), material, and pipe diameter. Estimating the initial cost of buildings is very important for the assessment of economic losses; for this purpose an algorithm for this estimation was prepared, taking into account the inventory obtained. Build quality, reliability and durability are of special importance to the corresponding state agencies and include different aesthetic, engineering, practical, social, technological and economical aspects. 
The necessity that all of these aspects satisfy existing normative requirements becomes evident as the buildings and structures come into exploitation

  18. Spatial analysis and modeling to assess and map current vulnerability to extreme weather events in the Grijalva - Usumacinta watershed, México

    NASA Astrophysics Data System (ADS)

    López L, D.

    2009-11-01

    One of the major concerns over a potential change in climate is that it will cause an increase in extreme weather events. In Mexico, the exposure factors as well as the vulnerability to the extreme weather events have increased during the last three or four decades. In this study spatial analysis and modeling were used to assess and map settlement and crop systems vulnerability to extreme weather events in the Grijalva - Usumacinta watershed. Sensitivity and coping adaptive capacity maps were constructed using decision models; these maps were then combined to produce vulnerability maps. The most vulnerable area in terms of both settlement and crop systems is the highlands, where the sensitivity is high and the adaptive capacity is low. In lowlands, despite the very high sensitivity, the higher adaptive capacity produces only moderate vulnerability. I conclude that spatial analysis and modeling are powerful tools to assess and map vulnerability. These preliminary results can guide the formulation of adaptation policies to an increasing risk of extreme weather events.

  19. Co-seismic landslide topographic analysis based on multi-temporal DEM-A case study of the Wenchuan earthquake.

    PubMed

    Ren, Zhikun; Zhang, Zhuqi; Dai, Fuchu; Yin, Jinhui; Zhang, Huiping

    2013-01-01

Hillslope instability has been thought to be one of the most important factors for landslide susceptibility. In this study, we apply geomorphic analysis using multi-temporal DEM data and shaking intensity analysis to evaluate the topographic characteristics of the landslide areas. There are many geomorphologic analysis methods, such as roughness and slope aspect analysis, which are as useful as slope analysis. The analyses indicate that most of the co-seismic landslides occurred in regions with roughness >1.2, hillslope >30°, and slope aspect between 90° and 270°. The intersection of the regions from the above three methods is more accurate than results derived by applying any single topographic analysis method. The ground motion data indicate that the co-seismic landslides mainly occurred on the hanging wall side of the Longmen Shan Thrust Belt, within the up-down and horizontal peak ground acceleration (PGA) contours of 150 gal and 200 gal, respectively. The comparisons of pre- and post-earthquake DEM data indicate that areas of medium roughness and slope increased, while the roughest and steepest regions decreased after the Wenchuan earthquake. Slope aspect, however, barely changed. Our results indicate that co-seismic landslides mainly occurred in specific regions of high roughness and on southward-facing, steeply sloping areas under strong ground motion. Co-seismic landslides significantly modified the local topography, especially hillslope and roughness. The roughest relief and steepest slopes were significantly smoothed; the areas of medium relief and slope became rougher and steeper, respectively.
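The intersection of the three single-criterion maps described above is a simple boolean AND over co-registered raster layers. A sketch with the thresholds quoted in the abstract, applied to synthetic grids (the real analysis of course runs over full DEM-derived rasters):

```python
import numpy as np

def landslide_prone_mask(roughness, slope_deg, aspect_deg):
    """Boolean intersection of the three single-criterion masks:
    roughness > 1.2, slope > 30 degrees, aspect between 90 and 270 degrees
    (broadly south-facing). Inputs are co-registered 2-D arrays."""
    return ((roughness > 1.2)
            & (slope_deg > 30.0)
            & (aspect_deg >= 90.0) & (aspect_deg <= 270.0))
```

Cells must pass all three tests at once, which is why the intersection map is tighter (and, per the study, more accurate) than any single-criterion map.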

  20. Investigation of Nonlinear Site Response and Seismic Compression from Case History Analysis and Laboratory Testing

    NASA Astrophysics Data System (ADS)

    Yee, Eric

In this thesis I address a series of issues related to ground failure and ground motions during earthquakes. A major component is the evaluation of cyclic volumetric strain behavior of unsaturated soils, more commonly known as seismic compression, from advanced laboratory testing. Another major component is the application of nonlinear and equivalent-linear ground response analyses to large-strain problems involving highly nonlinear dynamic soil behavior. These two components are merged in the analysis of a truly unique and crucial field case history of nonlinear site response and seismic compression. My first topic concerns dynamic soil testing for relatively small-strain dynamic soil properties such as threshold strains, γtv. Such testing is often conducted using specialized devices such as dual-specimen simple shear, as devices configured for large-strain testing produce noisy signals in the small-strain range. Working with a simple shear device originally developed for large-strain testing, I extend its low-strain capabilities by characterizing noisy signals and utilizing several statistical methods to extract meaningful responses in the small-strain range. I utilize linear regression of a transformed variable to estimate the cyclic shear strain from a noisy signal and the confidence interval on its amplitude. I utilize kernel regression with the Nadaraya-Watson estimator and a Gaussian kernel to evaluate the vertical strain response. A practical utilization of these techniques is illustrated by evaluating threshold shear strains for volume change with a procedure that takes into account uncertainties in the measured shear and vertical strains. My second topic concerns the seismic compression characteristics of non-plastic and low-plasticity silty sands with varying fines content (10 ≤ FC ≤ 60%). Simple shear testing was performed on various sand-fines mixtures at a range of modified Proctor relative compaction levels (RC) and degrees-of-saturation (S
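The Nadaraya-Watson estimator mentioned above is a kernel-weighted local mean: with a Gaussian kernel K, the estimate is ŷ(x) = Σᵢ K((x − xᵢ)/h)·yᵢ / Σᵢ K((x − xᵢ)/h). A minimal sketch follows; the bandwidth choice is the user's, and the thesis' uncertainty analysis is not reproduced here:

```python
import numpy as np

def nadaraya_watson(x_query, x, y, bandwidth):
    """Nadaraya-Watson kernel regression with a Gaussian kernel.

    Returns the locally weighted mean of y at each query point, a standard way
    to pull a smooth response out of a noisy strain record.
    """
    xq = np.atleast_1d(np.asarray(x_query, dtype=float))
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    w = np.exp(-0.5 * ((xq[:, None] - x[None, :]) / bandwidth) ** 2)
    return (w @ y) / w.sum(axis=1)
```

Small bandwidths track the data closely but pass noise through; large bandwidths smooth heavily. Picking h is the usual bias-variance trade-off of kernel smoothing.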

  1. Seismic issues at the Paducah Gaseous Diffusion Plant

    SciTech Connect

Fricke, K.E.

    1989-11-01

A seismic expert workshop was held at the Paducah Gaseous Diffusion Plant (PGDP) on March 13--15, 1989. The PGDP is operated by Martin Marietta Energy Systems, Inc. for the United States Department of Energy (DOE). During the last twenty years the design criteria for natural phenomenon hazards have steadily become more demanding at all of the DOE Oak Ridge Operations (ORO) sites. The purpose of the workshop was to review the seismic vulnerability issues of the PGDP facilities. Participants in the workshop included recognized experts in the fields of seismic engineering, seismology and geosciences, and probabilistic analysis, along with engineers and other personnel from Energy Systems. A complete list of the workshop participants is included in the front of this report. 29 refs.

  2. Seismic Analysis of Magmatism in the Galapagos Archipelago and East Africa

    NASA Astrophysics Data System (ADS)

    Tepp, Gabrielle

    Magmatism and deformation are consequences of fundamental processes shaping Earth's ˜150 km-thick continental and <125 km-thick oceanic plates. Earthquake seismology encompasses many methods to detect compositional and thermal boundaries from Earth's surface to the dynamic mantle driving plate tectonics. This work uses three different seismic methods to probe magma migration and storage and tectonism in two intraplate hotspot provinces: the Galapagos and East Africa. First, seismic body-wave tomography is used to image magma within oceanic crust of the largest Galapagos volcano, Sierra Negra. A laterally large, low-velocity region with many smaller, high-magnitude velocity anomalies is imaged at 8--15.5 km depths. No sharp seismic velocity increase is imaged within the resolvable depths, indicating that the thickened crust is at least 16 km deep. The second study involves a spectral analysis of earthquakes induced by the intrusion of thin sheets of magma rising beneath the Afar rift, East Africa. Earthquakes have varying spectral content, some with unusually large amplitude low-frequency content and enhanced surface waves. The analysis showed no clear boundaries between spectral types, suggesting that they are all primarily the result of brittle failure. Deep dike segments (tops > 3 km) induce only high-frequency volcano-tectonic earthquakes, while shallower dike segments induce the full range of spectral types. This suggests that low-frequency content is a result of shallow hypocenters, with path and site effects, surface ruptures, and dike fluid interactions all possible secondary causes. In the final study, shear-wave splitting analysis of teleseismic body-wave phases is conducted to evaluate strain and crack fabrics at the base of the continental plate as a consequence of magmatism, mantle flow, and plate stretching in the Western rift, East Africa. On average, fast directions are northeast, consistent with geodynamic models of mantle flow from the African

  3. HANFORD DST THERMAL & SEISMIC PROJECT ANSYS BENCHMARK ANALYSIS OF SEISMIC INDUCED FLUID STRUCTURE INTERACTION IN A HANFORD DOUBLE SHELL PRIMARY TANK

    SciTech Connect

    MACKEY, T.C.

    2006-03-14

M&D Professional Services, Inc. (M&D) is under subcontract to Pacific Northwest National Laboratory (PNNL) to perform seismic analysis of the Hanford Site Double-Shell Tanks (DSTs) in support of a project entitled ''Double-Shell Tank (DST) Integrity Project--DST Thermal and Seismic Analyses''. The overall scope of the project is to complete an up-to-date comprehensive analysis of record of the DST System at Hanford in support of Tri-Party Agreement Milestone M-48-14. The work described herein was performed in support of the seismic analysis of the DSTs. The thermal and operating loads analysis of the DSTs is documented in Rinker et al. (2004). The overall seismic analysis of the DSTs is being performed with the general-purpose finite element code ANSYS. The overall model used for the seismic analysis of the DSTs includes the DST structure, the contained waste, and the surrounding soil. The seismic analysis of the DSTs must address the fluid-structure interaction behavior and sloshing response of the primary tank and contained liquid. ANSYS has demonstrated capabilities for structural analysis, but the capabilities and limitations of ANSYS for performing fluid-structure interaction analysis are less well understood. The purpose of this study is to demonstrate the capabilities and investigate the limitations of ANSYS for performing a fluid-structure interaction analysis of the primary tank and contained waste. To this end, the ANSYS solutions are benchmarked against theoretical solutions appearing in BNL 1995, when such theoretical solutions exist. When theoretical solutions were not available, comparisons were made to theoretical solutions of similar problems and to the results from Dytran simulations. The capabilities and limitations of the finite element code Dytran for performing a fluid-structure interaction analysis of the primary tank and contained waste were explored in a parallel investigation (Abatt 2006). In conjunction with the results of the global ANSYS analysis

  4. Physically based probabilistic seismic hazard analysis using broadband ground motion simulation: a case study for the Prince Islands Fault, Marmara Sea

    NASA Astrophysics Data System (ADS)

    Mert, Aydin; Fahjan, Yasin M.; Hutchings, Lawrence J.; Pınar, Ali

    2016-08-01

    The main motivation for this study was the impending occurrence of a catastrophic earthquake along the Prince Island Fault (PIF) in the Marmara Sea and the disaster risk around the Marmara region, especially in Istanbul. This study provides the results of a physically based probabilistic seismic hazard analysis (PSHA) methodology, using broadband strong ground motion simulations, for sites within the Marmara region, Turkey, that may be vulnerable to possible large earthquakes throughout the PIF segments in the Marmara Sea. The methodology is called physically based because it depends on the physical processes of earthquake rupture and wave propagation to simulate earthquake ground motion time histories. We included the effects of all considerable-magnitude earthquakes. To generate the high-frequency (0.5-20 Hz) part of the broadband earthquake simulation, real, small-magnitude earthquakes recorded by a local seismic array were used as empirical Green's functions. For the frequencies below 0.5 Hz, the simulations were obtained by using synthetic Green's functions, which are synthetic seismograms calculated by an explicit 2D /3D elastic finite difference wave propagation routine. By using a range of rupture scenarios for all considerable-magnitude earthquakes throughout the PIF segments, we produced a hazard calculation for frequencies of 0.1-20 Hz. The physically based PSHA used here followed the same procedure as conventional PSHA, except that conventional PSHA utilizes point sources or a series of point sources to represent earthquakes, and this approach utilizes the full rupture of earthquakes along faults. Furthermore, conventional PSHA predicts ground motion parameters by using empirical attenuation relationships, whereas this approach calculates synthetic seismograms for all magnitudes of earthquakes to obtain ground motion parameters. PSHA results were produced for 2, 10, and 50 % hazards for all sites studied in the Marmara region.
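Whichever way the ground motions are produced (empirical attenuation relationships or simulated seismograms), the hazard integral itself reduces to summing, over rupture scenarios, each scenario's annual rate times the probability that its ground motion exceeds a given level. A toy sketch with a lognormal ground-motion distribution and invented scenario numbers (not the PIF source model):

```python
import math
import numpy as np

def lognormal_sf(x, median, sigma_ln):
    """P(PGA > x) for a lognormally distributed ground motion."""
    z = (math.log(x) - math.log(median)) / sigma_ln
    return 0.5 * math.erfc(z / math.sqrt(2.0))

def hazard_curve(pga_grid, scenarios):
    """Annual exceedance rate at each PGA level.

    `scenarios` is a list of (annual_rate, median_pga, sigma_ln) tuples standing
    in for the rupture set; rates and medians here are placeholders.
    """
    return np.array([sum(rate * lognormal_sf(x, med, sig)
                         for rate, med, sig in scenarios)
                     for x in pga_grid])
```

The exceedance rate λ converts to the probabilities quoted in such studies via P = 1 − exp(−λT) over an exposure time T (e.g. 50 years).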

  5. Advanced analysis of complex seismic waveforms to characterize the subsurface Earth structure

    NASA Astrophysics Data System (ADS)

    Jia, Tianxia

    2011-12-01

    This thesis includes three major parts: (1) body wave analysis of mantle structure under the Calabria slab; (2) Spatial Average Coherency (SPAC) analysis of microtremor to characterize the subsurface structure in urban areas; and (3) surface wave dispersion inversion for shear wave velocity structure. Although these three projects apply different techniques and investigate different parts of the Earth, their aim is the same: to better understand and characterize the subsurface Earth structure by analyzing complex seismic waveforms recorded at the Earth's surface. My first project is body wave analysis of mantle structure under the Calabria slab. Its aim is to better understand the subduction structure of the Calabria slab by analyzing seismograms generated by natural earthquakes. The rollback and subduction of the Calabrian Arc beneath the southern Tyrrhenian Sea is a case study of slab morphology and slab-mantle interactions at short spatial scale. I analyzed seismograms recorded during the PASSCAL CAT/SCAN experiment that traverse the Calabrian slab and upper mantle wedge under the southern Tyrrhenian Sea, in terms of body wave dispersion, scattering, and attenuation. Compressional body waves exhibit dispersion correlating with slab paths, with high-frequency arrivals delayed relative to low-frequency arrivals. Body wave scattering and attenuation are also spatially correlated with slab paths. I used this correlation to estimate the positions of the slab boundaries, and further suggested that the observed spatial variation in near-slab attenuation could be ascribed to mantle flow patterns around the slab. My second project is Spatial Average Coherency (SPAC) analysis of microtremors for subsurface structure characterization. Shear-wave velocity (Vs) information in soil and rock has been recognized as a critical parameter for site-specific ground motion prediction studies, which are highly necessary for urban areas located
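    The SPAC method mentioned above relates the azimuthally averaged coherency between two stations a distance r apart to the phase velocity c(f) through rho(f) = J0(2*pi*f*r / c(f)). A minimal single-frequency inversion is sketched below; restricting the solution to the first branch of the Bessel function is our simplifying assumption (real dispersion analysis must handle branch ambiguity and fit many station pairs and frequencies jointly):

```python
import numpy as np
from scipy.special import j0
from scipy.optimize import brentq

FIRST_J0_ZERO = 2.404825  # first root of the Bessel function J0

def spac_phase_velocity(rho_obs, f, r):
    """Invert one azimuthally averaged SPAC coherency value for phase
    velocity (m/s) at frequency f (Hz) and interstation distance r (m).

    Solves J0(x) = rho_obs on the first branch x in (0, 2.4048), then
    c = 2*pi*f*r / x. Valid only for rho_obs in (0, 1).
    """
    x = brentq(lambda x: j0(x) - rho_obs, 1e-6, FIRST_J0_ZERO)
    return 2 * np.pi * f * r / x
```

    Repeating this over a band of frequencies yields the dispersion curve c(f), which is then inverted for the Vs profile.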

  6. Seismic texture and amplitude analysis of large scale fluid escape pipes using time lapses seismic surveys: examples from the Loyal Field (Scotland, UK)

    NASA Astrophysics Data System (ADS)

    Maestrelli, Daniele; Jihad, Ali; Iacopini, David; Bond, Clare

    2016-04-01

    Fluid escape pipes are key features of primary interest for the analysis of vertical fluid flow and secondary hydrocarbon migration in sedimentary basins. Identified worldwide (Løset et al., 2009), they have acquired increasing importance as they represent critical pathways for the supply of methane and potential structures for leakage into storage reservoirs (Cartwright & Santamarina, 2015). Therefore, understanding their genesis, internal characteristics and seismic expression is of great significance for the exploration industry. Here we propose a detailed characterization of the internal seismic texture of seal bypass systems (e.g. fluid escape pipes) from a 4D seismic survey (released by BP) recently acquired in the Loyal Field. The seal bypass structures are large-scale fluid escape pipes affecting the Upper Paleogene/Neogene stratigraphic succession in the Loyal Field, Scotland (UK). The Loyal Field is located on the edge of the Faroe-Shetland Channel slope, about 130 km west of Shetland (Quadrants 204/205 of the UKCS), and has recently been re-appraised and redeveloped by a consortium led by BP. Detailed 3D mapping of the full and partial stack surveys (processed using amplitude-preservation workflows) shows a complex system of fluid pipe structures rooted in the pre-Lista formation and developed across the Paleogene and Neogene units. Geometrical analysis shows that the pipes have diameters of 100-300 m and lengths of 500 m to 2 km. Most pipes seem to terminate abruptly at discrete subsurface horizons or in diffuse terminations, suggesting multiple overpressure events and lateral fluid migration (through Darcy flow) across the overburden units. 
The internal texture analysis of the large pipes (across both the root and main conduit zones), using near-, medium- and far-offset stack datasets (processed through an amplitude-preserved PSTM workflow), shows a tendency of up-bending of reflections (rather than pull-up artefacts

  7. The offshore Yangsan fault activity in the Quaternary, SE Korea: Analysis of high-resolution seismic profiles

    NASA Astrophysics Data System (ADS)

    Kim, Han-Joon; Moon, Seonghoon; Jou, Hyeong-Tae; Lee, Gwang Hoon; Yoo, Dong Geun; Lee, Sang Hoon; Kim, Kwang Hee

    2016-12-01

    The NNE-trending dextral Yangsan fault is a > 190-km-long structure in the Korean Peninsula traced to the southeastern coast. The scarcity of Quaternary deposits onland precludes any detailed investigation of the Quaternary activity and structure of the Yangsan fault using seismic reflection profiling. We acquired offshore high-resolution seismic profiles to investigate the extension of the Yangsan fault and constrain its Quaternary activity using stratigraphic markers. The seismic profiles reveal a NNE-trending fault system consisting of a main fault and an array of subsidiary faults that displaced Quaternary sequences. Stratigraphic analysis of seismic profiles indicates that the offshore faults were activated repeatedly in the Quaternary. The up-to-the-east sense of throw on the main fault and plan-view pattern of the fault system are explained by dextral strike-slip faulting. The main fault, when projected toward the Korean Peninsula along its strike, aligns well with the Yangsan fault. We suggest that the offshore fault system is a continuation of the Yangsan fault and has spatial correlation with weak but ongoing seismicity.

  8. ELIMINATING CONSERVATISM IN THE PIPING SYSTEM ANALYSIS PROCESS THROUGH APPLICATION OF A SUITE OF LOCALLY APPROPRIATE SEISMIC INPUT MOTIONS

    SciTech Connect

    Anthony L. Crawford; Robert E. Spears, Ph.D.; Mark J. Russell

    2009-07-01

    Seismic analysis is of great importance in the evaluation of nuclear systems due to the heavy influence such loading has on their designs. Current Department of Energy seismic analysis techniques for a nuclear safety-related piping system typically apply a single conservative seismic input to the entire system [1]. A significant portion of this conservatism comes from the need to address the overlapping uncertainties in the seismic input and in the building response that transmits that input motion to the piping system. The approach presented in this paper addresses these two sources of uncertainty through the application of a suite of 32 input motions whose collective performance addresses the total uncertainty while each individual motion represents a single variation of it. It represents an extension of the soil-structure interaction analysis methodology of SEI/ASCE 43-05 [2] from the structure to individual piping components. Because this approach is computationally intensive, automation and other measures have been developed to make such an analysis efficient. These measures are detailed in this paper.
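    As an illustration of the suite-based idea (not the paper's implementation, which analyzes full piping models), one can compute the peak response of a simple damped oscillator for each input motion in the suite and then collapse the suite into an enveloping statistic. The Newmark average-acceleration integrator below is a standard scheme; the 80th-percentile rule in `suite_demand` is a placeholder of ours, since the actual acceptance criteria come from SEI/ASCE 43-05:

```python
import numpy as np

def sdof_peak(ag, dt, f_n, zeta=0.05):
    """Peak relative displacement of a damped single-degree-of-freedom
    oscillator (natural frequency f_n Hz, damping ratio zeta) driven by
    ground acceleration record ag (m/s^2) sampled at dt (s).
    Newmark average-acceleration (beta=1/4, gamma=1/2) incremental form."""
    wn = 2 * np.pi * f_n
    m, c, k = 1.0, 2 * zeta * wn, wn ** 2
    kh = k + 2 * c / dt + 4 * m / dt ** 2  # effective stiffness
    u, v = 0.0, 0.0
    p_old = -m * ag[0]
    a = p_old / m  # initial acceleration from equilibrium at rest
    peak = 0.0
    for agi in ag[1:]:
        p = -m * agi
        dp = (p - p_old) + (4 * m / dt + 2 * c) * v + 2 * m * a
        du = dp / kh
        dv = 2 * du / dt - 2 * v
        da = 4 * (du - dt * v) / dt ** 2 - 2 * a
        u, v, a, p_old = u + du, v + dv, a + da, p
        peak = max(peak, abs(u))
    return peak

def suite_demand(suite, dt, f_n, q=0.8):
    """Collapse peak responses over a suite of motions into one demand
    value (hypothetical empirical-quantile rule)."""
    return float(np.quantile([sdof_peak(ag, dt, f_n) for ag in suite], q))
```

    Automating such a loop over 32 motions and every piping component is the computational burden the paper's efficiency measures address.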

  9. HANFORD DOUBLE SHELL TANK (DST) THERMAL & SEISMIC PROJECT INCREASED LIQUID LEVEL ANALYSIS FOR 241-AP TANK FARMS

    SciTech Connect

    MACKEY TC; DEIBLER JE; JOHNSON KI; PILLI SP; KARRI NK; RINKER MW; ABATT FG; CARPENTER BG

    2007-02-16

    The overall scope of the project is to complete an up-to-date comprehensive analysis of record of the DST System at Hanford. The "Double-Shell Tank (DST) Integrity Project - DST Thermal and Seismic Project" is in support of Tri-Party Agreement Milestone M-48-14.

  10. A Gis Model Application Supporting The Analysis of The Seismic Hazard For The Urban Area of Catania (italy)

    NASA Astrophysics Data System (ADS)

    Grasso, S.; Maugeri, M.

    After the Summit held in Washington on August 20-22, 2001 to plan the first World Conference on the Mitigation of Natural Hazards, a Group for the analysis of natural hazards within the Mediterranean area was formed. The Group has so far identified the following hazards: (1) seismic hazard (including the hazard to historical buildings); (2) hazard linked to the quantity and quality of water; (3) landslide hazard; (4) volcanic hazard. The analysis of such hazards implies the creation and management of data banks, which can only be used if the data are properly geo-referenced to allow their combined use. The results obtained must therefore be represented on geo-referenced maps. The present study is part of a research programme, namely "Detailed Scenarios and Actions for Seismic Prevention of Damage in the Urban Area of Catania", financed by the National Department for Civil Protection and the National Research Council-National Group for the Defence Against Earthquakes (CNR-GNDT). The south-eastern area of Sicily, called the "Iblea" seismic area, is today considered one of the most intense seismic zones in Italy, based on its past and current seismic history and on the typology of its civil buildings. Safety against earthquake hazards has two aspects: structural safety against potentially destructive dynamic forces, and site safety related to geotechnical phenomena such as amplification, landsliding and soil liquefaction. The correct evaluation of seismic hazard is thus highly affected by risk factors due to the geological nature and geotechnical properties of soils. The effect of local geotechnical conditions on the damage suffered by buildings under seismic conditions has been widely recognized, as demonstrated by the Manual for Zonation on Seismic Geotechnical Hazards edited by the International Society for Soil Mechanics and Geotechnical Engineering (TC4, 1999). The evaluation of local amplification effects may be carried out by means of either

  11. Network topology, Transport dynamics, and Vulnerability Analysis in River Deltas: A Graph-Theoretic Approach

    NASA Astrophysics Data System (ADS)

    Tejedor, A.; Foufoula-Georgiou, E.; Longjas, A.; Zaliapin, I. V.

    2014-12-01

    River deltas are intricate landscapes with complex channel networks that self-organize to deliver water, sediment, and nutrients from the apex to the delta top and eventually to the coastal zone. The natural balance of material and energy fluxes which maintains a stable hydrologic, geomorphologic, and ecological state of a river delta, is often disrupted by external factors causing topological and dynamical changes in the delta structure and function. A formal quantitative framework for studying river delta topology and transport dynamics and their response to change is lacking. Here we present such a framework based on spectral graph theory and demonstrate its value in quantifying the complexity of the delta network topology, computing its steady state fluxes, and identifying upstream (contributing) and downstream (nourishment) areas from any point in the network. We use this framework to construct vulnerability maps that quantify the relative change of sediment and water delivery to the shoreline outlets in response to possible perturbations in hundreds of upstream links. This enables us to evaluate which links (hotspots) and what management scenarios would most influence flux delivery to the outlets, paving the way of systematically examining how local or spatially distributed delta interventions can be studied within a systems approach for delta sustainability.
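    The steady-state flux computation described in this record can be sketched as a linear solve on the routing matrix of the channel network. Here `P[i, j]` is the fraction of the flux at node i routed to downstream node j; the link weights are an input assumption of the sketch, not something the graph framework itself derives:

```python
import numpy as np

def steady_state_fluxes(P, source, inflow=1.0):
    """Steady-state flux at every node of a directed (acyclic) delta
    channel network.

    P : (n, n) routing matrix; row i sums to 1 for non-outlet nodes,
        to 0 for outlets. Solves f = inflow*e_source + P^T f.
    """
    n = P.shape[0]
    b = np.zeros(n)
    b[source] = inflow
    # (I - P^T) is invertible for an acyclic routing matrix.
    return np.linalg.solve(np.eye(n) - P.T, b)
```

    A vulnerability map in the spirit of the abstract can then be built by perturbing one link weight at a time, re-solving, and recording the relative change of flux at each shoreline outlet.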

  12. Shared vision, shared vulnerability: A content analysis of corporate social responsibility information on tobacco industry websites.

    PubMed

    McDaniel, Patricia A; Cadman, Brie; Malone, Ruth E

    2016-08-01

    Tobacco companies rely on corporate social responsibility (CSR) initiatives to improve their public image and advance their political objectives, which include thwarting or undermining tobacco control policies. For these reasons, implementation guidelines for the World Health Organization's Framework Convention on Tobacco Control (FCTC) recommend curtailing or prohibiting tobacco industry CSR. To understand how and where major tobacco companies focus their CSR resources, we explored CSR-related content on 4 US and 4 multinational tobacco company websites in February 2014. The websites described a range of CSR-related activities, many common across all companies, and no programs were unique to a particular company. The websites mentioned CSR activities in 58 countries, representing nearly every region of the world. Tobacco companies appear to have a shared vision about what constitutes CSR, due perhaps to shared vulnerabilities. Most countries that host tobacco company CSR programs are parties to the FCTC, highlighting the need for full implementation of the treaty, and for funding to monitor CSR activity, replace industry philanthropy, and enforce existing bans.

  13. Fractal simulation of urbanization for the analysis of vulnerability to natural hazards

    NASA Astrophysics Data System (ADS)

    Puissant, Anne; Sensier, Antoine; Tannier, Cécile; Malet, Jean-Philippe

    2016-04-01

    For the past 50 years, mountain areas have been affected by important land cover/use changes characterized by the decrease of pastoral activities, reforestation, and urbanization driven by the development of tourism activities and infrastructure. These natural and anthropogenic transformations have an impact on socio-economic activities but also on the exposure of communities to natural hazards. In the context of the ANR project SAMCO, which aims at enhancing the overall resilience of societies to the impacts of mountain risks, the objective of this research was to help determine where to locate new residential developments under different scenarios of land cover/use (based on the Prelude European Project) for the years 2030 and 2050. The Planning Support System (PSS) MUP-City, based on a fractal multi-scale modelling approach, is used because it allows taking into account local accessibility to urban and rural amenities (Tannier et al., 2012). For this research, an experiment is performed on a mountain area in the French Alps (Barcelonnette Basin) to generate three scenarios of urban development with MUP-City at the scale of 1:10,000. The results are assessed by comparing the location of residential developments with urban areas predicted by land cover and land use scenarios generated by cellular automata modelling (LCM and Dyna-CLUE) (Puissant et al., 2015). Based on these scenarios, the evolution of vulnerability is estimated.

  14. Analysis of seismic signals related to natural and blasting rockfalls (Mount Néron, France)

    NASA Astrophysics Data System (ADS)

    Bottelin, Pierre; Jongmans, Denis; Helmstetter, Agnès; Baillet, Laurent; Hantz, Didier; Daudon, Dominique; Villard, Pascal; Donzé, Frédéric; Richefeu, Vincent; Lorier, Lionel; Cadet, Héloïse; Mathy, Alexandre

    2013-04-01

    identified in the signal are the ground impact following the free fall and the impact of one large block (~9 m3) into the earthen barrier, which exhibit similar amplitudes and low-frequency contents (below 10 Hz). For the provoked rockfall, polarization analysis was conducted on the 3C seismograms windowed for the two main phases and band-pass filtered in the [2-6 Hz] range. The signal generated by the impact on the earthen barrier exhibits a predominant strong linear horizontal ground motion, oriented perpendicular to the barrier. In contrast, the particle motion resulting from the impact on the ground just after the explosion shows a complex pattern with no specific polarization. A finite-element numerical simulation is carried out to understand the seismic energy released and the wave field generated by these impacts.

  15. The analysis of interseismic GPS observation and its implication to seismic activity in Taiwan area

    NASA Astrophysics Data System (ADS)

    Tsai, M. C.; Yu, S. B.; Shin, T. C.

    2015-12-01

    Taiwan is a tectonically active area with a plate convergence rate of about 80 mm/yr. To understand crustal deformation and seismic potential in the Taiwan area, we derived the 2009-2014 interseismic GPS velocity field and strain rate and related them to seismic activity during 2005-2014. Data were collected at 281 sites of the Taiwan Continuous GPS (cGPS) Array and processed with the GAMIT/GLOBK software. By stacking power spectral densities from cGPS data in Taiwan, we found that the noise can be described as a combination of white noise and flicker noise. Common-mode errors are removed by stacking 50 cGPS sites with data spans longer than 5 years. After removing the common-mode errors, the precision of the GPS data improves to 2.3 mm, 1.9 mm, and 6.9 mm in the E, N, and U components, respectively. After strict data quality control, time series analysis, and noise analysis, we derive an interseismic ITRF2008 velocity field for 2009-2014 in the Taiwan area. The general pattern is quite similar to that of previous studies, but the station density is much higher and the spatial coverage better. Based on this interseismic velocity field, we estimate the crustal strain rate in the Taiwan area. Approximately half of the plate convergence strain is accommodated in the fold-and-thrust belt of western Taiwan, and the other half is taken up in the Longitudinal Valley and the Coastal Range in eastern Taiwan. The maximum dilatation rate is about -0.75 to -0.9 μstrain/yr in the WNW-ESE direction. The velocities in western Taiwan generally show a fan-shaped pattern, consistent with the direction of maximum compressive tectonic stress. Extension in the E-W direction is observed in the Central Range area, where focal mechanisms also indicate that most earthquakes are normal-faulting events. In northern Taiwan, the velocity vectors reveal clockwise rotation, indicating on-going extensional deformation related to the back-arc extension of the Okinawa Trough. In southern Taiwan, the horizontal velocity increases from
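    Estimating a strain-rate tensor from an interseismic velocity field reduces, for a locally uniform velocity gradient, to a linear least-squares fit; the dilatation rate quoted above is the trace of the symmetric part of that gradient. A minimal sketch (the function name and the uniform-gradient assumption are ours; regional studies fit gradients on a grid or per triangle of stations):

```python
import numpy as np

def strain_rate_2d(xy, vel):
    """Fit a uniform 2-D velocity gradient v = v0 + L @ x to station
    positions xy (n, 2; metres) and horizontal velocities vel (n, 2;
    m/yr) by least squares.

    Returns (E, dilatation): E is the strain-rate tensor (symmetric
    part of L, 1/yr) and dilatation its trace.
    """
    n = xy.shape[0]
    G = np.zeros((2 * n, 6))
    # Unknowns: [v0x, v0y, Lxx, Lxy, Lyx, Lyy]
    G[0::2, 0] = 1.0; G[0::2, 2] = xy[:, 0]; G[0::2, 3] = xy[:, 1]
    G[1::2, 1] = 1.0; G[1::2, 4] = xy[:, 0]; G[1::2, 5] = xy[:, 1]
    m, *_ = np.linalg.lstsq(G, vel.reshape(-1), rcond=None)
    L = np.array([[m[2], m[3]], [m[4], m[5]]])
    E = 0.5 * (L + L.T)
    return E, float(np.trace(E))
```

    A negative dilatation, as reported for Taiwan, indicates net areal contraction of the network.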

  16. Discontinuum Modelling Approach for Stress Analysis at a Seismic Source: Case Study

    NASA Astrophysics Data System (ADS)

    Sainoki, Atsushi; Mitri, Hani S.; Yao, Mike; Chinnasane, Damodara

    2016-12-01

    Rockbursts in underground mines can cause devastating damage to mine workings; hence, it is important to be able to assess the potential for their occurrence. The present study focuses on a large seismic event that took place at an underground base metal mine in Canada. The event took place in a dyke near the 100/900 orebodies on 3880 Level (1180 m below surface) of the Copper Cliff Mine in Sudbury, Canada. A 3D continuum stress analysis of the orebodies, i.e., 100 and 900, using an orebody-wide model encompassing the major geological structures and in situ stress heterogeneity in the mine shows low potential for rockburst at the seismic source location—a result which contradicts the fact that a large seismic event actually took place. A postulation is thus made that there had been highly stressed regions caused by geological disturbances at the source location before mining activities took place. In order to verify the postulation, a further study is undertaken with the discrete element modelling technique, whereby a cube-shaped model containing a fracture network is subjected to a stress state similar to that at the source location. A parametric model study is conducted with respect to the distribution of the fracture (joint) network and its mechanical properties. The results reveal that when joints are densely distributed at the source location, the stress state becomes significantly burst-prone. It is observed that the length, density, stiffness, and orientation of joints have a large influence on the stress state along the joints, while the friction angle, cohesion, and tensile strength do not. A cube-shaped model is then constructed with joint sets actually mapped at the mine, and a stress analysis is performed. The results demonstrate the generation of highly stressed regions due to the interaction of the joints with the applied in situ stress fields, thus leading to burst-prone conditions. 
The present study numerically confirms that

  17. 230Th/U ages Supporting Hanford Site-Wide Probabilistic Seismic Hazard Analysis

    SciTech Connect