Science.gov

Sample records for seismic vulnerability analysis

  1. Seismic vulnerability assessments in risk analysis

    NASA Astrophysics Data System (ADS)

    Frolova, Nina; Larionov, Valery; Bonnin, Jean; Ugarov, Alexander

    2013-04-01

    The assessment of seismic vulnerability is a critical issue in natural and technological risk analysis. Three types of methods are commonly used to develop vulnerability functions for different elements at risk: empirical, analytical, and expert estimation. The paper addresses empirical methods for estimating the seismic vulnerability of residential buildings and industrial facilities. The results of engineering analyses of past earthquake consequences, together with statistical data on building behavior during strong earthquakes reported in different seismic intensity scales, are used to verify the regional parameters of mathematical models that simulate the physical and economic vulnerability of building types classified according to the MMSK-86 seismic scale. The verified procedure has been used to estimate the physical and economic vulnerability of buildings and structures against earthquakes for the Northern Caucasus Federal region of the Russian Federation and the Krasnodar area, which are characterized by rather high seismic activity and high population density. To estimate the expected damage states of buildings and structures under the earthquakes given by the OSR-97B map (return period T = 1,000 years), large cities and towns were divided into unit sites whose coordinates were represented as points at the centers of the sites; the indexes obtained for each unit site were then summed. The maps of physical vulnerability zoning for the Northern Caucasus Federal region and the Krasnodar area include two elements: the percentage of different damage states for settlements with fewer than 1,000 inhabitants, and the vulnerability for cities and towns with more than 1,000 inhabitants. A hypsometric scale is used to represent both elements on the maps. Taking into account the size of oil pipeline systems located in the highly active seismic zones in

  2. Vector-intensity measure based seismic vulnerability analysis of bridge structures

    NASA Astrophysics Data System (ADS)

    Li, Zhongxian; Li, Yang; Li, Ning

    2014-12-01

    This paper presents a method for the seismic vulnerability analysis of bridge structures based on a vector-valued intensity measure (vIM), which predicts limit-state capacities efficiently using multiple intensity measures of a seismic event. To account for the uncertainties of the bridge model, ten single-bent overpass bridge structures are sampled statistically using a Latin hypercube sampling approach. To represent ground-motion uncertainty, 200 earthquake records are chosen at random according to the site conditions of the bridges. The uncertainties in structural capacity and seismic demand are evaluated using demand-to-capacity ratios for the different damage states. By comparing the relative importance of different intensity measures, Sa(T1) and Sa(T2) are chosen as the vIM. Vector-valued fragility functions are then developed for the different bridge components. Finally, the system-level vulnerability of the bridge based on the vIM is studied using a Dunnett-Sobel class correlation matrix, which accounts for the correlation between bridge components. The study indicates that moving from a scalar IM to a vIM significantly reduces both the dispersion of the fragility functions and the uncertainty in evaluating earthquake risk. The feasibility and validity of the proposed vulnerability analysis method are demonstrated, and the bridge system proves more vulnerable than any of its individual components.
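
    The regression of demand on two spectral accelerations that underlies such vector-IM fragilities can be sketched in a few lines. The following Python is a minimal illustration only, not the paper's bridge model: the records, the drift-demand relation, and the capacity median and dispersion are synthetic placeholders.

    ```python
    # Sketch: vector-IM fragility via a cloud-analysis regression.
    # All numbers are synthetic placeholders, not values from the paper.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    n = 200                                   # analysis records
    sa_t1 = rng.lognormal(-1.0, 0.6, n)       # Sa(T1) of each record (g)
    sa_t2 = rng.lognormal(-1.2, 0.6, n)       # Sa(T2) of each record (g)
    drift = 0.01 * sa_t1**0.9 * sa_t2**0.3 * rng.lognormal(0.0, 0.35, n)

    # Regress ln(demand) on ln Sa(T1), ln Sa(T2): ln D = b0 + b1 ln Sa1 + b2 ln Sa2
    X = np.column_stack([np.ones(n), np.log(sa_t1), np.log(sa_t2)])
    b, *_ = np.linalg.lstsq(X, np.log(drift), rcond=None)
    beta_d = np.std(np.log(drift) - X @ b)    # record-to-record dispersion

    def fragility(sa1, sa2, cap_median=0.02, beta_c=0.3):
        """P(demand > capacity | Sa(T1)=sa1, Sa(T2)=sa2), lognormal capacity."""
        mu_d = b[0] + b[1] * np.log(sa1) + b[2] * np.log(sa2)
        beta = np.hypot(beta_d, beta_c)       # combined dispersion
        return stats.norm.cdf((mu_d - np.log(cap_median)) / beta)

    print(fragility(0.4, 0.3))
    ```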

  3. Integrating Social impacts on Health and Health-Care Systems in Systemic Seismic Vulnerability Analysis

    NASA Astrophysics Data System (ADS)

    Kunz-Plapp, T.; Khazai, B.; Daniell, J. E.

    2012-04-01

    This paper presents a new method for modeling health impacts caused by earthquake damage that integrates key social impacts on individual health and on health-care systems, and implements these impacts in quantitative systemic seismic vulnerability analysis. In current earthquake casualty estimation models, demand on health-care systems is estimated by quantifying the number of fatalities and the severity of injuries from empirical data correlating building damage with casualties. The expected number of injured people (sorted by priority of emergency treatment) is combined with the post-earthquake loss of functionality of health-care facilities such as hospitals to estimate the impact on health-care systems. The aim here is to extend these models through a combined engineering and social science approach. Although social vulnerability is recognized as a key component of disaster consequences, it is seldom linked to the formal, quantitative seismic loss estimates of injured people that directly determine demand on emergency health-care services. Yet there is a consensus that the factors affecting vulnerability and post-earthquake health of at-risk populations include demographic characteristics such as age, education, occupation, and employment, and that these factors can further aggravate health impacts. Similarly, social influences affect the performance of health-care systems after an earthquake at both the individual and the institutional level. To link social impacts on health and health-care services to a systemic seismic vulnerability analysis, a conceptual model of the social impacts of earthquakes on health and the health-care systems has been developed. We identified and tested appropriate social indicators for individual health impacts and for health-care impacts based on a literature review, using available European statistical data. The results will be used to

  4. GPR surveys for the characterization of foundation plinths within a seismic vulnerability analysis

    NASA Astrophysics Data System (ADS)

    De Domenico, Domenica; Teramo, Antonio; Campo, Davide

    2013-06-01

    We present the results of GPR surveys performed to identify the foundation plinths of 12 buildings of a school, whose presence was uncertain because the structural drawings were not available. Characterizing these plinths is an essential element of a study assessing the seismic vulnerability of the buildings, which are non-seismically designed structures located in an area that was classified as a seismic zone only after their construction. Through GPR profiles acquired with two 250 MHz antennas, both in reflection mode and in a WARR configuration, the actual geometry and depth of the plinths were successfully identified, limiting the number of invasive tests needed to validate the GPR data interpretation and thus enabling the choice of test sites that would not impair the serviceability of the structure. The collected data were also analysed critically with respect to local environmental noise which, by causing reflections superimposed on those from the subsoil, could undermine the investigation. Owing to the homogeneity of the ground, the processing and results for each pair of profiles are very similar across all of these buildings, so only the results for two of them are reported.
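
    As a rough illustration of how a WARR gather yields the velocity needed for time-to-depth conversion, the hyperbolic moveout relation t^2 = x^2/v^2 + (2d/v)^2 can be fit to picked reflection times. The offsets, times, and resulting depths below are invented for illustration and are not survey values from the paper.

    ```python
    # Sketch: estimating radar velocity from a WARR gather and converting
    # two-way travel times (TWT) to plinth depth. Synthetic numbers only.
    import numpy as np

    # WARR gather: antenna offsets x (m) and picked reflection times t (ns)
    x = np.array([0.5, 1.0, 1.5, 2.0, 2.5])
    t = np.array([14.2, 15.1, 16.5, 18.3, 20.4])

    # t^2 = x^2 / v^2 + (2d / v)^2  ->  fit t^2 against x^2
    slope, intercept = np.polyfit(x**2, t**2, 1)
    v = 1.0 / np.sqrt(slope)                 # velocity (m/ns)
    d = 0.5 * v * np.sqrt(intercept)         # reflector depth (m)
    print(f"v = {v:.3f} m/ns, depth = {d:.2f} m")

    # Reflection-mode profile: depth of any picked TWT using that velocity
    twt = 13.9                               # ns, picked above a plinth
    print(f"plinth top = {0.5 * v * twt:.2f} m")
    ```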

  5. Seismic Hazard Analysis based on Earthquake Vulnerability and Peak Ground Acceleration using Microseismic Method at Universitas Negeri Semarang

    NASA Astrophysics Data System (ADS)

    Sulistiawan, H.; Supriyadi; Yulianti, I.

    2017-02-01

    Microseismic noise is a harmonic ground vibration that occurs continuously at low frequency. Its characteristics reflect those of the soil layer through the value of its natural frequency. This paper presents a seismic hazard analysis for Universitas Negeri Semarang using the microseismic method. Data were acquired at 20 points spaced 300 m apart using a three-component seismometer, and processed with the Horizontal to Vertical Spectral Ratio (HVSR) method to obtain the natural frequency and the amplification factor. These two values were then used to determine the earthquake vulnerability index and the peak ground acceleration (PGA). The results show that the earthquake vulnerability index ranges from 0.2 to 7.5, while the average peak ground acceleration lies in the range 10-24 gal. The average peak ground acceleration therefore corresponds to earthquake intensity IV on the MMI scale.
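
    A minimal single-window HVSR sketch is given below, assuming detrended three-component records. It omits the multi-window averaging and Konno-Ohmachi smoothing of production processing, and the vulnerability index Kg = A^2/f0 follows Nakamura's formulation, which may differ in detail from the index used in the paper.

    ```python
    # Sketch: HVSR from three-component microtremor records, then Nakamura's
    # vulnerability index Kg = A^2 / f0. Input traces here are synthetic.
    import numpy as np

    def hvsr(ns, ew, ud, fs):
        """Return frequencies and the H/V spectral ratio (single window)."""
        freqs = np.fft.rfftfreq(len(ud), d=1.0 / fs)
        spec = lambda x: np.abs(np.fft.rfft(x * np.hanning(len(x))))
        h = np.sqrt(spec(ns) * spec(ew))      # geometric mean of horizontals
        return freqs, h / np.maximum(spec(ud), 1e-12)

    fs = 100.0                                # sampling rate (Hz)
    t = np.arange(0, 300, 1 / fs)             # 5-minute synthetic record
    rng = np.random.default_rng(1)
    ud = rng.normal(size=t.size)              # vertical component
    ns = rng.normal(size=t.size) + 3 * np.sin(2 * np.pi * 1.2 * t)  # 1.2 Hz peak
    ew = rng.normal(size=t.size) + 3 * np.sin(2 * np.pi * 1.2 * t)

    freqs, ratio = hvsr(ns, ew, ud, fs)
    band = (freqs > 0.2) & (freqs < 20)
    f0 = freqs[band][np.argmax(ratio[band])]  # natural frequency (Hz)
    A = ratio[band].max()                     # peak amplification
    print(f"f0 = {f0:.2f} Hz, A = {A:.1f}, Kg = {A**2 / f0:.1f}")
    ```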

  6. Probabilistic seismic vulnerability and risk assessment of stone masonry structures

    NASA Astrophysics Data System (ADS)

    Abo El Ezz, Ahmad

    Earthquakes represent major natural hazards that regularly impact the built environment in seismic-prone areas worldwide and cause considerable social and economic losses. The high losses incurred in past destructive earthquakes have underscored the need to assess the seismic vulnerability and risk of existing buildings. Many historic buildings in the old urban centers of Eastern Canada, such as Old Quebec City, are built of stone masonry and represent inestimable architectural and cultural heritage. These buildings were designed to resist gravity loads only and generally offer poor resistance to lateral seismic loads. Seismic vulnerability assessment of stone masonry buildings is therefore the necessary first step in developing seismic retrofitting and pre-disaster mitigation plans. The objective of this study is to develop a set of probability-based analytical tools for efficient seismic vulnerability and uncertainty analysis of stone masonry buildings. In the first part of this study, a simplified probabilistic analytical methodology for vulnerability modelling of stone masonry buildings is developed, with systematic treatment of uncertainties throughout the modelling process. Building capacity curves are developed using a simplified mechanical model. A displacement-based procedure is used to develop damage-state fragility functions in terms of spectral displacement response, based on drift thresholds of stone masonry walls. A simplified probabilistic seismic demand analysis is proposed to capture the combined uncertainty of capacity and demand in the fragility functions. In the second part, a robust analytical procedure is proposed for developing seismic-hazard-compatible fragility and vulnerability functions. The results are sets of hazard-compatible vulnerability functions in terms of a structure-independent intensity measure (e.g. spectral acceleration) that can be used for seismic risk analysis. The procedure is very efficient for
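
    The combined treatment of capacity and demand uncertainty in a lognormal fragility reduces to one line, beta = sqrt(beta_c^2 + beta_d^2). A minimal sketch follows, with illustrative spectral-displacement thresholds and dispersions rather than the thesis' values.

    ```python
    # Sketch: damage-state fragility in spectral displacement with combined
    # capacity/demand dispersion. Thresholds and betas are illustrative.
    import numpy as np
    from scipy.stats import norm

    sd = np.linspace(0.1, 100.0, 400)            # spectral displacement (mm)
    median_sd = {"slight": 5.0, "moderate": 12.0,
                 "extensive": 30.0, "complete": 60.0}  # drift-based thresholds
    beta_c, beta_d = 0.35, 0.45                  # capacity / demand dispersions
    beta = np.sqrt(beta_c**2 + beta_d**2)        # combined dispersion

    frag = {ds: norm.cdf(np.log(sd / m) / beta) for ds, m in median_sd.items()}
    # P(damage >= "moderate") at Sd = 20 mm:
    print(norm.cdf(np.log(20.0 / median_sd["moderate"]) / beta))
    ```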

  7. Urban Vulnerability Assessment to Seismic Hazard through Spatial Multi-Criteria Analysis. Case Study: the Bucharest Municipality/Romania

    NASA Astrophysics Data System (ADS)

    Armas, Iuliana; Dumitrascu, Silvia; Bostenaru, Maria

    2010-05-01

    In the context of an explosive increase in the value of damage caused by natural disasters, an alarming challenge of the third millennium is the rapid growth of urban population in vulnerable areas. Cities are, by definition, fragile socio-ecological systems that are highly vulnerable to environmental change and that are responsible for important transformations of space, generating dysfunctions reflected in the state of natural variables (Parker and Mitchell, 1995; the OFDA/CRED International Disaster Database). A contributing factor is the demographic dynamic affecting urban areas. The aim of this study is to estimate the overall vulnerability of the urban area of Bucharest to seismic hazard, using measurable environmental, socio-economic, and physical variables within a spatial multi-criteria analysis (SMCA). The capital city of Romania was chosen for this approach because of its high vulnerability, a consequence of its explosive urban development and the advanced state of degradation of its buildings (most of the building stock dating from between 1940 and 1977). Combining these attributes with the seismic hazard induced by the Vrancea source, Bucharest has been ranked the 10th capital city worldwide in terms of seismic risk. Over 40 years of experience in the natural risk field show that the only directly accessible way to reduce natural risk is to reduce the vulnerability of the space (Adger et al., 2001; Turner et al., 2003; UN/ISDR, 2004; Dayton-Johnson, 2004; Kasperson et al., 2005; Birkmann, 2006, etc.). In effect, reducing the vulnerability of urban spaces would lower the costs produced by natural disasters. Applying the SMCA method reveals a circular pattern, signaling as hot spots the Bucharest historic centre (located on a river terrace and with aged building stock) and peripheral areas (isolated from the emergency centers and defined by precarious social and economic
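
    A weighted linear combination of normalized criterion layers is the core of many SMCA implementations. The sketch below assumes min-max normalization and uses invented layers and weights; it is not the study's actual criteria tree.

    ```python
    # Sketch: spatial multi-criteria vulnerability as a weighted linear
    # combination of normalized criterion rasters. Weights are illustrative.
    import numpy as np

    rng = np.random.default_rng(2)
    shape = (100, 100)                       # raster grid over the study area
    layers = {                               # criterion rasters (synthetic)
        "building_age": rng.uniform(0, 80, shape),       # years
        "pop_density": rng.uniform(0, 25_000, shape),    # persons/km^2
        "soil_amplification": rng.uniform(1, 3, shape),  # dimensionless
    }
    weights = {"building_age": 0.4, "pop_density": 0.35,
               "soil_amplification": 0.25}

    def minmax(a):                           # rescale each layer to [0, 1]
        return (a - a.min()) / (a.max() - a.min())

    vulnerability = sum(weights[k] * minmax(v) for k, v in layers.items())
    print(vulnerability.mean(), vulnerability.max())   # hot spots near 1.0
    ```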

  8. Effect of beta on Seismic Vulnerability Curve for RC Bridge Based on Double Damage Criterion

    SciTech Connect

    Feng Qinghai; Yuan Wancheng

    2010-05-21

    In seismic vulnerability curve analysis based on a double damage criterion, the randomness of both structural parameters and seismic input should be considered. First, the distributions of structural capacity and seismic demand are obtained from incremental dynamic analysis (IDA) and pushover analysis; second, the vulnerability of the bridge is computed using an artificial neural network (ANN) and Monte Carlo (MC) simulation, and a vulnerability curve for the bridge and seismic input is drawn. Finally, the analysis of a continuous bridge is presented as an example, together with a parametric analysis of the effect of beta, which reflects the overall bridge vulnerability from a total-probability standpoint; to reduce the scatter, large values of beta are suggested.
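
    The effect of beta can be illustrated with a direct Monte Carlo estimate of P(demand >= capacity), standing in here for the paper's ANN-accelerated computation. The medians, dispersions, and intensity grid below are placeholders, not values from the bridge example.

    ```python
    # Sketch: Monte Carlo estimate of P(demand >= capacity) at each intensity,
    # and the effect of the dispersion beta. Medians are synthetic placeholders.
    import numpy as np

    rng = np.random.default_rng(3)
    pga = np.linspace(0.05, 1.5, 30)              # intensity grid (g)
    n = 20_000                                    # MC samples per intensity

    for beta in (0.3, 0.5, 0.7):                  # parametric study of beta
        cap = rng.lognormal(np.log(1.0), beta, n)         # structural capacity
        dem = pga[:, None] * rng.lognormal(0.0, beta, n)  # seismic demand
        p_fail = (dem >= cap).mean(axis=1)                # fragility ordinates
        print(f"beta={beta}: P(fail | 0.5g) = {np.interp(0.5, pga, p_fail):.3f}")
    ```

    Larger beta flattens the curve, raising failure probabilities at low intensities and lowering them near the median, which is why the choice of beta matters for the total-probability result.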

  9. Analysis of the impact of large scale seismic retrofitting strategies through the application of a vulnerability-based approach on traditional masonry buildings

    NASA Astrophysics Data System (ADS)

    Ferreira, Tiago Miguel; Maio, Rui; Vicente, Romeu

    2017-04-01

    A building's capacity to maintain minimum structural safety levels during natural disasters, such as earthquakes, is recognisably one of the aspects that most influence urban resilience. Public investment in risk mitigation strategies is therefore fundamental, not only to promote social and urban resilience, but also to limit the consequent material, human, and environmental losses. Despite growing awareness of this issue, a vast number of traditional masonry buildings spread throughout many old European city centres still lack adequate seismic resistance and therefore require urgent retrofitting interventions, both to reduce their seismic vulnerability and to meet the increased seismic requirements of recent code standards. This paper aims to contribute to mitigating the social and economic impacts of earthquake damage scenarios through a vulnerability-based comparative analysis of some of the most popular retrofitting techniques applied after the 1998 Azores earthquake. The influence of each technique, both individually and globally, is studied using a seismic vulnerability index methodology integrated into a GIS tool; damage and loss scenarios are constructed and critically discussed. Finally, the economic balance resulting from the implementation of these techniques is also examined.
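
    Vulnerability-index methodologies of this family typically score a set of survey parameters, weight them into an index V, and map V and macroseismic intensity to a mean damage grade. The sketch below uses the Giovinazzi-Lagomarsino mean damage grade formula as one common choice; the parameter scores and weights are illustrative and are not the paper's calibrated values.

    ```python
    # Sketch: vulnerability-index workflow in the spirit of the macroseismic
    # method. The mean damage grade formula follows Giovinazzi & Lagomarsino;
    # scores and weights below are illustrative only.
    import numpy as np

    def vulnerability_index(scores, weights):
        """Normalized index V in [0, 1]; each parameter scored 0-3."""
        return (scores @ weights) / (3 * weights.sum())

    def mean_damage_grade(I, V, Q=2.3):
        """Mean damage grade (0-5) at macroseismic intensity I."""
        return 2.5 * (1.0 + np.tanh((I + 6.25 * V - 13.1) / Q))

    scores = np.array([2, 3, 1, 2, 3, 1, 2, 0])   # survey parameters, 0-3
    weights = np.array([1.5, 1.0, 1.0, 0.75, 1.0, 0.5, 0.75, 1.0])
    V = vulnerability_index(scores, weights)
    for I in (7, 8, 9):                           # pre/post-retrofit scenarios
        print(f"I={I}: mu_D = {mean_damage_grade(I, V):.2f}")
    ```

    Comparing mu_D before and after lowering the scores of retrofitted parameters gives the kind of damage-scenario contrast the paper builds in GIS.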

  10. Extreme seismicity and disaster risks: Hazard versus vulnerability (Invited)

    NASA Astrophysics Data System (ADS)

    Ismail-Zadeh, A.

    2013-12-01

    Although the extreme nature of earthquakes has been known for millennia through the devastation many of them have caused, the vulnerability of our civilization to extreme seismic events is still growing. This is partly because of the increase in the number of high-risk objects and the clustering of populations and infrastructure in areas prone to seismic hazards. Today an earthquake may affect several hundred thousand lives and cause damage of up to a hundred billion dollars; it can trigger an ecological catastrophe if it occurs in close vicinity to a nuclear power plant. Two types of extreme natural events can be distinguished: (i) large-magnitude, low-probability events, and (ii) events leading to disasters. Although the first type may affect earthquake-prone countries directly or indirectly (through tsunamis, landslides, etc.), the second type occurs mainly in economically less-developed countries where vulnerability is high and resilience is low. Although earthquake hazards cannot be reduced, vulnerability to extreme events can be diminished by monitoring human systems and by relevant laws preventing an increase in vulnerability. Significant new knowledge should be gained on extreme seismicity through observation, monitoring, analysis, modeling, comprehensive hazard assessment, prediction, and interpretation to assist disaster risk analysis. Advanced disaster risk communication skills should be developed to link scientists, emergency management authorities, and the public. Natural, social, economic, and political causes of earthquake disasters will be discussed.

  11. A Methodology for Assessing the Seismic Vulnerability of Highway Systems

    SciTech Connect

    Cirianni, Francis; Leonardi, Giovanni; Scopelliti, Francesco

    2008-07-08

    Modern society is totally dependent on a complex and articulated infrastructure network of vital importance for the existence of the urban settlements scattered across the territory. These infrastructure systems, usually referred to as lifelines, are entrusted with numerous services and functions indispensable to normal urban and human activity. Lifeline systems are an essential element of all urbanised areas subject to seismic risk, and in such zones they should be planned according to suitable criteria based on two fundamental principles: a) selection of the best territorial location, avoiding, as far as possible, the most dangerous sites; b) application of construction technologies aimed at reducing vulnerability. It is therefore indispensable that any modern seismic risk assessment gives due consideration to the study of these networks, integrated with the traditional analyses of buildings. The present paper moves in this direction, devoting particular attention to one kind of lifeline, the highway system, and proposing a methodology of analysis aimed at assessing the seismic vulnerability of the system.

  12. Evaluation Of The Seismic Vulnerability of Fortified Structures

    NASA Astrophysics Data System (ADS)

    Baratta, Alessandro; Corbi, Ileana; Coppari, Sandro

    2008-07-01

    In this paper a rapid method for evaluating the seismic vulnerability of ancient structures is applied to the fortified structures of Italy, starting from the processing of rather coarse information about the condition, construction type, and history of the building stock considered. The procedure proves to be effective and able to produce reliable results despite the poor initial data.

  13. Evaluation Of The Seismic Vulnerability of Fortified Structures

    SciTech Connect

    Baratta, Alessandro; Corbi, Ileana; Coppari, Sandro

    2008-07-08

    In this paper a rapid method for evaluating the seismic vulnerability of ancient structures is applied to the fortified structures of Italy, starting from the processing of rather coarse information about the condition, construction type, and history of the building stock considered. The procedure proves to be effective and able to produce reliable results despite the poor initial data.

  14. Automated Software Vulnerability Analysis

    NASA Astrophysics Data System (ADS)

    Sezer, Emre C.; Kil, Chongkyung; Ning, Peng

    Despite decades of research, software continues to have vulnerabilities. Successful exploitation of these vulnerabilities by attackers costs businesses and individuals millions of dollars. Unfortunately, the most effective defensive measures, such as patching and intrusion prevention systems, require intimate knowledge of the vulnerabilities. Many systems for detecting attacks have been proposed, but the analysis of the exploited vulnerabilities is left to security experts and programmers. Both the human effort involved and the slow analysis process are unfavorable for the timely deployment of defensive measures. The problem is exacerbated by zero-day attacks.

  15. Seismic Vulnerability and Performance Level of confined brick walls

    SciTech Connect

    Ghalehnovi, M.; Rahdar, H. A.

    2008-07-08

    Engineers and designers are increasingly interested in displacement- and performance-based design methods, owing to the importance of designing structures to resist dynamic loads such as earthquakes and to the difficulty of predicting the nonlinear behavior of elements that arises from the nonlinear properties of construction materials. Because masonry materials are economical, easy to work with, and widely available, masonry structures have proliferated in villages, towns, and cities. Since Iran lies on the Alpide earthquake belt, the seismic behavior and vulnerability of such structures need to be studied. Environmental, economic, social, and cultural factors, together with the availability of construction materials, have given rise to a variety of structural types. In this study, several confined walls were modeled in software and subjected to dynamic analysis using accelerograms appropriate to the local geological conditions, in order to investigate the seismic vulnerability and performance level of confined brick walls. The results of this analysis compare satisfactorily with the values in ATC-40, FEMA, and Iran's Standard No. 2800.

  16. Seismic Vulnerability and Performance Level of confined brick walls

    NASA Astrophysics Data System (ADS)

    Ghalehnovi, M.; Rahdar, H. A.

    2008-07-01

    Engineers and designers are increasingly interested in displacement- and performance-based design methods, owing to the importance of designing structures to resist dynamic loads such as earthquakes and to the difficulty of predicting the nonlinear behavior of elements that arises from the nonlinear properties of construction materials. Because masonry materials are economical, easy to work with, and widely available, masonry structures have proliferated in villages, towns, and cities. Since Iran lies on the Alpide earthquake belt, the seismic behavior and vulnerability of such structures need to be studied. Environmental, economic, social, and cultural factors, together with the availability of construction materials, have given rise to a variety of structural types. In this study, several confined walls were modeled in software and subjected to dynamic analysis using accelerograms appropriate to the local geological conditions, in order to investigate the seismic vulnerability and performance level of confined brick walls. The results of this analysis compare satisfactorily with the values in ATC-40, FEMA, and Iran's Standard No. 2800.

  17. Remote sensing techniques applied to seismic vulnerability assessment

    NASA Astrophysics Data System (ADS)

    Juan Arranz, Jose; Torres, Yolanda; Hahgi, Azade; Gaspar-Escribano, Jorge

    2016-04-01

    Advances in remote sensing and photogrammetry have increased the accuracy and resolution of records of the earth's surface, expanding the range of possible applications of these data. In this research, we have used such data to document the construction characteristics of the urban environment of Lorca, Spain. An exposure database has been created from the gathered information for use in seismic vulnerability assessment. To this end, we have used data from photogrammetric flights of different periods, using orthorectified images in both the visible and the infrared spectrum. The analysis is completed with LiDAR data. From the combination of these data it has been possible to delineate building footprints and characterize the constructions with attributes such as approximate date of construction, area, roof type, and even building materials. To carry out the calculation, we developed algorithms to compare images from different times, segment images, classify LiDAR data, and use the infrared data to remove vegetation or to compute roof surfaces with height, tilt, and spectral fingerprint. The accuracy of our results has been validated against ground truth data. Keywords: LiDAR, remote sensing, seismic vulnerability, Lorca
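
    Vegetation removal from the infrared data presumably relies on an index such as NDVI; the sketch below shows that step under that assumption, with random arrays standing in for the actual Lorca imagery and an illustrative threshold.

    ```python
    # Sketch: NDVI from orthorectified red/near-infrared bands to mask out
    # vegetation before extracting roof surfaces. Threshold is illustrative.
    import numpy as np

    def ndvi(nir, red):
        """Normalized difference vegetation index, safe against /0."""
        nir, red = nir.astype(float), red.astype(float)
        return (nir - red) / np.maximum(nir + red, 1e-9)

    rng = np.random.default_rng(4)
    red = rng.uniform(0, 255, (512, 512))     # stand-ins for image bands
    nir = rng.uniform(0, 255, (512, 512))

    veg_mask = ndvi(nir, red) > 0.3           # typical vegetation cutoff
    roof_candidates = ~veg_mask               # keep non-vegetated pixels
    print(f"vegetated: {veg_mask.mean():.1%}")
    ```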

  18. Evaluation of socio-spatial vulnerability of citydwellers and analysis of risk perception: industrial and seismic risks in Mulhouse

    NASA Astrophysics Data System (ADS)

    Glatron, S.; Beck, E.

    2008-10-01

    Social vulnerability has been studied for years through sociological, psychological, and economic approaches. Our proposition focuses on the perception and cognitive representations of risks by city dwellers living in a medium-sized urban area, namely Mulhouse (France). Perception, as part of the social vulnerability and resilience of a society to disasters, influences the potential damage; for example, it leads to adequate or inadequate behaviour in an emergency. As geographers, we assume that the spatial relationship to danger or hazard can be an important factor of vulnerability, and that the spatial dimension is a challenging question both for better knowledge and for operational reasons (e.g. management of preventive information). We interviewed 491 people, inhabitants and workers, regularly distributed within the urban area, to learn their opinions on hazards and security measures. We designed and mapped a vulnerability index on the basis of their answers. The results show that social vulnerability depends on the type of hazard and that distance to the source of danger influences vulnerability, especially for hazards with a precise location (industrial, for example). Moreover, the effectiveness of information campaigns is doubtful, as people living close to hazardous industries (the target of specific preventive information) are, surprisingly, more vulnerable and less aware of industrial risk.

  19. Approaches of Seismic Vulnerability Assessments in Near Real Time Systems

    NASA Astrophysics Data System (ADS)

    Frolova, Nina; Larionov, Valery; Bonnin, Jean; Ugarov, Alexander

    2014-05-01

    Data on the seismic vulnerability of the existing building stock and other elements at risk are very important for near-real-time earthquake loss estimation by global systems. Together with information on regional peculiarities of seismic intensity attenuation and other factors, these data contribute greatly to the reliability of the consequences estimated for strong events in emergency mode. Different approaches exist for developing vulnerability functions; the empirical one is used most often. It is based on analysis of the engineering consequences of past strong events, where well-documented descriptions of damage to different building types and other elements at risk are available for the earthquake-prone area under consideration. Where such data do not exist, information from macroseismic scales may be used. Any approach to developing vulnerability functions requires a proper classification of the buildings and structures under consideration. Different building classifications exist in national and international building codes as well as in macroseismic scales. As a result, global systems such as Extremum and PAGER, as well as the GEM project, make use of non-unified information on building stock distribution worldwide. The paper addresses the issues of building classification and of city models expressed in terms of these classifications. The distribution of different building types in the Extremum and PAGER/GEM systems is analyzed for earthquake-prone countries. The comparison of city models reveals significant differences which greatly influence earthquake loss estimates in emergency mode. The paper describes the practice of developing city models that makes use of space images and web technology in social networks. It is proposed to use the G8 (and other) open data and transparency initiatives to improve building stock distribution and global population databases.

  20. Highway bridge seismic design: Summary of FHWA/MCEER project on seismic vulnerability of new highway construction

    NASA Astrophysics Data System (ADS)

    Friedland, Ian M.; Buckle, Ian G.; Lee, George C.

    2002-06-01

    The Federal Highway Administration (FHWA) sponsored a large, multi-year project conducted by the Multidisciplinary Center for Earthquake Engineering Research (MCEER) titled “Seismic Vulnerability of New Highway Construction” (MCEER Project 112), which was completed in 1998. MCEER coordinated the work of many researchers, who performed studies on the seismic design and vulnerability analysis of highway bridges, tunnels, and retaining structures. Extensive research was conducted to provide revisions and improvements to current design and detailing approaches and national design specifications for highway bridges. The program included both analytical and experimental studies, and addressed seismic hazard exposure and ground motion input for the U.S. highway system; foundation design and soil behavior; structural importance, analysis, and response; structural design issues and details; and structural design criteria.

  1. Key geophysical indicators of seismic vulnerability in Kingston, Jamaica

    NASA Astrophysics Data System (ADS)

    Brown, L. A.; Hornbach, M. J.; Salazar, W.; Kennedy, M.

    2012-12-01

    Kingston, the major city and hub of all commercial and industrial activity in Jamaica, has a history of moderate seismic activity; however, two significant (>Mw 6) earthquakes (1692 and 1907) caused major devastation resulting in thousands of casualties. Both the 1692 and 1907 events also triggered widespread liquefaction and tsunamis within Kingston Harbor. Kingston remains vulnerable to such earthquakes today because the city sits on 200-m to 600-m thick alluvial fan deposits adjacent to the Enriquillo-Plantain Garden Fault Zone (EPGFZ), the same fault system that ruptured in the Haiti 2010 earthquake. Recent GPS results suggest the potential for a Mw 7-7.5 earthquake near Kingston along the EPGFZ, the dominant east-west trending fault through Jamaica. Whether active strands of the EPGFZ extend through downtown Kingston remains unclear; however, recent sonar mapping in Kingston Harbor shows evidence of active faulting, with offshore faults connecting to proposed active on-land fault systems that extend through populated areas of the city. Seismic chirp reflections also show evidence of multiple recent (Holocene) submarine slide deposits in the harbor that may be associated with historic tsunamis. Using recently acquired chirp data and sediment cores, we are currently studying the recurrence interval of earthquake events. We also recently performed a microtremor survey to identify areas prone to earthquake-induced ground shaking throughout the city of Kingston and St. Andrew parish. Data were collected at 200 points with a lateral spacing of 500 metres. Our analysis shows significant variations in the fundamental frequency across the city, and the results clearly indicate areas of potential amplification, with areas surrounding Kingston Harbor (much of which has been built on reclaimed land) showing the highest potential for ground amplification. The microtremor analysis identifies several high-density urban areas as well as key

  2. Rapid Assessment of Seismic Vulnerability in Palestinian Refugee Camps

    NASA Astrophysics Data System (ADS)

    Al-Dabbeek, Jalal N.; El-Kelani, Radwan J.

    Studies of historical and recorded earthquakes in Palestine demonstrate that damaging earthquakes occur frequently along the Dead Sea Transform, e.g. the earthquakes of 11 July 1927 (ML 6.2) and 11 February 2004 (ML 5.2). In order to reduce the seismic vulnerability of buildings and the losses in lives, property, and infrastructure, an attempt was made to estimate the percentages of damage grades and losses at selected refugee camps: Al Ama`ri, Balata, and Dhaishe. The vulnerability classes of the building structures were assessed according to the European Macroseismic Scale 1998 (EMS-98) and Federal Emergency Management Agency (FEMA) guidance. The rapid assessment results showed that very heavy structural and non-structural damage will occur in the common buildings of the investigated refugee camps (many buildings will suffer damage grades 4 and 5). Poor building quality in terms of design and construction, lack of uniformity, the absence of spaces between buildings, and the limited width of the roads will definitely increase the seismic vulnerability under the influence of moderate-to-strong (M 6-7) earthquakes in the future.

  3. Seismic vulnerability and risk assessment of Kolkata City, India

    NASA Astrophysics Data System (ADS)

    Nath, S. K.; Adhikari, M. D.; Devaraj, N.; Maiti, S. K.

    2015-06-01

    The city of Kolkata is one of the most urbanized and densely populated regions in the world and a major industrial and commercial hub of the eastern and northeastern region of India. In order to classify the seismic risk zones of Kolkata, we applied seismic hazard exposures to the vulnerability components, namely land use/land cover, population density, building typology, age, and height. We microzoned the seismic hazard of the city by integrating seismological, geological, and geotechnical themes in GIS, which in turn were integrated with the vulnerability components in a logic-tree framework to estimate both the socioeconomic and the structural risk of the city. In both risk maps, three broad zones have been demarcated as "severe", "high", and "moderate"; a comparatively risk-free zone in the city is termed "low". The damage distribution in the city due to the 1934 Bihar-Nepal earthquake of Mw = 8.1 matches the demarcated risk regime satisfactorily. The design horizontal seismic coefficients for the city have been worked out for all the fundamental periods, indicating suitability for "A", "B", and "C" types of structures. The cumulative damage probabilities in terms of "none", "slight", "moderate", "extensive", and "complete" have also been assessed for the four predominant model building types, viz. RM2L, RM2M, URML, and URMM, for each seismic structural risk zone in the city. Both the seismic hazard and the risk maps are expected to play vital roles in earthquake disaster mitigation and management for the city of Kolkata.

  4. Rupture Directivity Effect on Seismic Vulnerability of Reinforced Concrete Bridge

    NASA Astrophysics Data System (ADS)

    Shirazian, Shadi; Nouri, Gholamreza; Ghayamghamian, Mohamadreza

    2017-04-01

    Earthquake catastrophes threaten human lives and assets. Although earthquakes are inevitable, damage is not. To remedy this situation, a significant amount of research is conducted to assess the performance of existing man-made structures, particularly infrastructure such as bridges, which play a vital role in post-earthquake services. The results can be used to prioritize retrofitting and as a basis for economic loss estimates. The research presented here determines the vulnerability of a common, typical two-span reinforced concrete bridge by generating fragility curves. Near-fault ground motions differ from ordinary ground motions, often containing strong, coherent, long-period dynamic pulses and permanent ground displacements. Special attention is given here to this type of ground motion, and its effects on the seismic behavior of the structure are compared with those of ordinary motions. The results show that near-fault ground motions increase the seismic vulnerability of the bridge by about 68% in comparison with ordinary far-field ground motions; in other words, near-source ground motions with forward-directivity effects are more dangerous.

  5. SANITARY VULNERABILITY OF A TERRITORIAL SYSTEM IN HIGH SEISMIC AREAS

    NASA Astrophysics Data System (ADS)

    Teramo, A.; Termini, D.; de Domenico, D.; Marino, A.; Marullo, A.; Saccà, C.; Teramo, M.

    2009-12-01

    An evaluation procedure is proposed for the sanitary vulnerability of a territorial system in a high seismic risk area, related to the capability of hospitals to treat casualties after an earthquake. The study aims to highlight hospital criticalities for the design of a prevention policy, on the basis of territorial, demographic, and sanitary analyses specific to a given area. This is the first step of a procedure for reading the territorial context within a damage scenario, addressed to verifying the preparedness of the territorial system for a sanitary emergency attributable to either a natural or an anthropogenic disaster. The results of the surveys, carried out at different scales on several sample areas of the Messina Province (Italy), are shown, evaluating the consistency of the damage scenario with the numbers of casualties, medical doctors, and available beds for the implementation of an emergency sanitary circuit.

  6. The influence of local mechanisms on large scale seismic vulnerability estimation of masonry building aggregates

    NASA Astrophysics Data System (ADS)

    Formisano, Antonio; Chieffo, Nicola; Milo, Bartolomeo; Fabbrocino, Francesco

    2016-12-01

    The current paper deals with the seismic vulnerability evaluation of masonry constructions grouped in aggregates by means of an "ad hoc" quick vulnerability form based on new assessment parameters that account for local collapse mechanisms. First, a parametric kinematic analysis of masonry walls with different height (h) to thickness (t) ratios was carried out to identify the collapse load multiplier for activation of the four main first-mode failure mechanisms. Subsequently, a form initially conceived for building aggregates suffering second-mode collapse mechanisms was expanded on the basis of the results achieved. The proposed quick vulnerability technique has been applied to a case study within the territory of Arsita (Teramo, Italy) and has been validated by comparing its results with those derived from application of the well-known FaMIVE procedure.
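
    For the simplest first-mode mechanism, rigid-body overturning of a monolithic free-standing wall, the collapse multiplier follows directly from moment balance about the base hinge (stabilizing moment W*t/2 versus overturning moment alpha*W*h/2), giving alpha0 = t/h. A tiny sketch of that limiting case; real walls with cracking, restraints, or overburden require the full kinematic analysis of the paper.

    ```python
    # Sketch: collapse load multiplier for the simplest first-mode mechanism,
    # rigid-body overturning of a free-standing wall about its base edge.
    # Moment balance: stabilizing W*(t/2) vs. overturning alpha*W*(h/2),
    # hence alpha0 = t/h.
    def overturning_multiplier(t, h):
        """alpha0 for a monolithic wall of thickness t and height h (same units)."""
        return t / h

    for t, h in [(0.5, 3.0), (0.5, 6.0), (0.8, 3.0)]:
        print(f"h/t = {h / t:4.1f}  ->  alpha0 = {overturning_multiplier(t, h):.3f}")
    ```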

  7. Fault zone regulation, seismic hazard, and social vulnerability in Los Angeles, California: Hazard or urban amenity?

    NASA Astrophysics Data System (ADS)

    Toké, Nathan A.; Boone, Christopher G.; Arrowsmith, J. Ramón

    2014-09-01

    Public perception and regulation of environmental hazards are important factors in the development and configuration of cities. Throughout California, probabilistic seismic hazard mapping and geologic investigations of active faults have spatially quantified earthquake hazard. In Los Angeles, these analyses have informed earthquake engineering, public awareness, the insurance industry, and the government regulation of developments near faults. Understanding the impact of natural hazards regulation on the social and built geography of cities is vital for informing future science and policy directions. We constructed a relative social vulnerability index classification for Los Angeles to examine the social condition within regions of significant seismic hazard, including areas regulated as Alquist-Priolo (AP) Act earthquake fault zones. Despite hazard disclosures, social vulnerability is lowest within AP regulatory zones and vulnerability increases with distance from them. Because the AP Act requires building setbacks from active faults, newer developments in these zones are bisected by parks. Parcel-level analysis demonstrates that homes adjacent to these fault zone parks are the most valuable in their neighborhoods. At a broad scale, a Landsat-based normalized difference vegetation index shows that greenness near AP zones is greater than the rest of the metropolitan area. In the parks-poor city of Los Angeles, fault zone regulation has contributed to the construction of park space within areas of earthquake hazard, thus transforming zones of natural hazard into amenities, attracting populations of relatively high social status, and demonstrating that the distribution of social vulnerability is sometimes more strongly tied to amenities than hazards.

  8. Safeguard Vulnerability Analysis Program (SVAP)

    SciTech Connect

    Gilman, F.M.; Dittmore, M.H.; Orvis, W.J.; Wahler, P.S.

    1980-06-23

    This report gives an overview of the Safeguard Vulnerability Analysis Program (SVAP) developed at Lawrence Livermore National Laboratory. SVAP was designed as an automated method of analyzing the safeguard systems at nuclear facilities for vulnerabilities relating to the theft or diversion of nuclear materials. SVAP addresses one class of safeguard threat: theft or diversion of nuclear materials by nonviolent insiders, acting individually or in collusion. SVAP is a user-oriented tool which uses an interactive input medium for preprocessing the large amounts of safeguards data. Its output includes concise summary data as well as detailed vulnerability information.

  9. Vulnerability survival analysis: a novel approach to vulnerability management

    NASA Astrophysics Data System (ADS)

    Farris, Katheryn A.; Sullivan, John; Cybenko, George

    2017-05-01

    Computer security vulnerabilities span large enterprise networks and have to be mitigated by security engineers on a routine basis. Presently, security engineers assess their "risk posture" by quantifying the number of vulnerabilities with a high Common Vulnerability Scoring System (CVSS) score. Yet little or no attention is given to the length of time that vulnerabilities persist and survive on the network. In this paper, we review a novel approach to quantifying the length of time a vulnerability persists on the network, its time-to-death, and the predictors of lower vulnerability survival rates. Our contribution is unique in that we apply the Cox proportional hazards regression model to real data from an operational IT environment. This paper provides a mathematical overview of the theory behind survival analysis methods, a description of our vulnerability data, and an interpretation of the results.
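
    A minimal version of this analysis can be run with the open-source lifelines package. The data frame below uses invented records and covariate names, not the authors' operational data set; still-open vulnerabilities enter as censored observations.

    ```python
    # Sketch: vulnerability survival analysis with a Cox proportional hazards
    # model via the `lifelines` package. Rows and columns are illustrative.
    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.DataFrame({
        "days_open": [12, 45, 90, 7, 180, 30, 60, 14],  # persistence time
        "remediated": [1, 1, 0, 1, 0, 1, 1, 0],         # 0 = still open (censored)
        "cvss": [9.8, 7.5, 5.0, 9.0, 4.3, 6.1, 7.2, 8.8],   # severity covariate
        "internet_facing": [1, 0, 0, 1, 0, 1, 0, 1],        # exposure covariate
    })

    cph = CoxPHFitter()
    cph.fit(df, duration_col="days_open", event_col="remediated")
    cph.print_summary()                      # hazard ratios per covariate
    ```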

  10. Integrated Estimation of Seismic Physical Vulnerability of Tehran Using Rule Based Granular Computing

    NASA Astrophysics Data System (ADS)

    Sheikhian, H.; Delavar, M. R.; Stein, A.

    2015-08-01

    Tehran, the capital of Iran, is surrounded by the North Tehran fault, the Mosha fault, and the Rey fault. This exposes the city to potentially huge earthquakes followed by dramatic human loss and physical damage, particularly as it contains a large number of non-standard constructions and aged buildings. Estimating the likely consequences of an earthquake facilitates mitigation of these losses; mitigation of earthquake fatalities may be achieved by promoting awareness of earthquake vulnerability and implementing seismic vulnerability reduction measures. In this research, granular computing is applied, using generality and absolute support for rule extraction and coverage and entropy for rule prioritization. The rules are combined to form a granule tree that shows the order and relation of the extracted rules. In this way the seismic physical vulnerability is assessed, integrating the effects of the three major known faults. The parameters considered effective in the physical seismic vulnerability assessment are slope, seismic intensity, and the height and age of the buildings. Experts were asked to predict the seismic vulnerability of 100 randomly selected samples among more than 3,000 statistical units in Tehran, and their integrated points of view serve as input to granular computing. Non-redundant covering rules preserve the consistency of the model, which resulted in 84% accuracy in the seismic vulnerability assessment, based on validating the predictions for the test data against the expected vulnerability degrees. The study concludes that granular computing is a useful method for assessing the effects of earthquakes in an earthquake-prone area.
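
    The rule measures named above reduce to simple frequency counts over a decision table. A toy sketch with invented attribute values follows (the study's actual parameters are slope, seismic intensity, building height, and building age).

    ```python
    # Sketch: generality, absolute support, coverage, and decision entropy of
    # a candidate rule, computed from a small decision table with pandas.
    import numpy as np
    import pandas as pd

    df = pd.DataFrame({
        "slope": ["low", "low", "high", "high", "low", "high"],
        "age":   ["old", "new", "old", "old", "old", "new"],
        "vuln":  ["high", "low", "high", "high", "low", "low"],
    })

    phi = (df["slope"] == "high") & (df["age"] == "old")   # rule antecedent
    psi = df["vuln"] == "high"                             # rule consequent

    generality = phi.mean()                      # |m(phi)| / |U|
    abs_support = (phi & psi).sum() / phi.sum()  # confidence of phi => psi
    coverage = (phi & psi).sum() / psi.sum()     # share of psi captured
    p = df.loc[phi, "vuln"].value_counts(normalize=True)
    entropy = -(p * np.log2(p)).sum()            # decision entropy in granule

    print(generality, abs_support, coverage, entropy)
    ```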

  11. Lunar seismic data analysis

    NASA Technical Reports Server (NTRS)

    Nakamura, Y.; Latham, G. V.; Dorman, H. J.

    1982-01-01

    The scientific data transmitted continuously from all ALSEP (Apollo Lunar Surface Experiment Package) stations on the Moon and recorded on instrumentation tapes at receiving stations distributed around the Earth were processed. The processing produced sets of computer-compatible digital tapes, from which various other data sets convenient for analysis were generated. The seismograms were read, various types of seismic events were classified; the detected events were cataloged.

  12. Seismic evaluation of vulnerability for SAMA educational buildings in Tehran

    NASA Astrophysics Data System (ADS)

    Amini, Omid Nassiri; Amiri, Javad Vaseghi

    2008-07-01

    Earthquakes are destructive phenomena that shake different parts of the earth every year and cause much destruction. Iran is one of the most earthquake-prone parts of the world, sustaining large economic losses and numbers of casualties each year, and schools are among the most important facilities to protect during such crises. There was no special oversight of the design and construction of school buildings in Tehran until the late 70's, and since Tehran sits on faults, the instability of such buildings could cause irrecoverable economic losses and, above all, loss of life; preventing this is therefore an urgent need. For this purpose, a number of schools built during 67-78, mostly with steel braced-frame structures, were selected. First, the selected samples were evaluated by gathering information and conducting visual surveys, and the prepared questionnaires were filled out. Using the ARIA and SABA (Venezuela) methods, a new modified combined method for qualitative evaluation was devised and applied. Then, for quantitative evaluation, 3D computer models and nonlinear static analysis methods were used to re-evaluate a number of the qualitatively evaluated buildings, and finally the real behavior of the structures under earthquakes was studied with nonlinear dynamic analysis. The results of the qualitative and quantitative evaluations were compared, and a suitable pattern for the seismic evaluation of educational buildings was presented. The results can also serve as guidance for those in charge of retrofitting or, if necessary, rebuilding the schools.

  13. Seismic evaluation of vulnerability for SAMA educational buildings in Tehran

    SciTech Connect

    Amini, Omid Nassiri; Amiri, Javad Vaseghi

    2008-07-08

    Earthquakes are destructive phenomena that shake different parts of the earth every year and cause much destruction. Iran is one of the most earthquake-prone parts of the world, sustaining large economic losses and numbers of casualties each year, and schools are among the most important facilities to protect during such crises. There was no special oversight of the design and construction of school buildings in Tehran until the late 70's, and since Tehran sits on faults, the instability of such buildings could cause irrecoverable economic losses and, above all, loss of life; preventing this is therefore an urgent need. For this purpose, a number of schools built during 67-78, mostly with steel braced-frame structures, were selected. First, the selected samples were evaluated by gathering information and conducting visual surveys, and the prepared questionnaires were filled out. Using the ARIA and SABA (Venezuela) methods, a new modified combined method for qualitative evaluation was devised and applied. Then, for quantitative evaluation, 3D computer models and nonlinear static analysis methods were used to re-evaluate a number of the qualitatively evaluated buildings, and finally the real behavior of the structures under earthquakes was studied with nonlinear dynamic analysis. The results of the qualitative and quantitative evaluations were compared, and a suitable pattern for the seismic evaluation of educational buildings was presented. The results can also serve as guidance for those in charge of retrofitting or, if necessary, rebuilding the schools.

  14. Seismic Vulnerability Assessment Rest House Building TA-16-41

    SciTech Connect

    Cuesta, Isabel; Salmon, Michael W.

    2003-10-01

    The purpose of this report is to present the results of the evaluation completed on the Rest House Facility (TA-16-41) in support of hazard analysis for a Documented Safety Assessment (DSA). The Rest House facility has been evaluated to verify the structural response to seismic, wind, and snow loads in support of the DynEx DSA. The structural analyses consider the structure and the following systems and/or components inside the facility, as requested by facility management: cranes, the lightning protection system, and the fire protection system. The facility has been assigned to Natural Phenomena Hazards (NPH) Performance Category (PC)-3. The facility structure was evaluated to PC-3 criteria because it serves to confine hazardous material and, in the event of an accident, the facility cannot fail or collapse. Seismic-induced failure of the cranes, lightning protection, and fire-protection systems, according to DOE-STD-1021-93 (Ref. 1), "may result in adverse release consequences greater than safety-class Structures, Systems, and Components (SSC) Evaluation Guideline limits but much less than those associated with PC-4 SSC." Therefore, these items were evaluated to PC-3 criteria as well. This report presents the results of those analyses and recommends measures to improve the seismic capacity of the systems and components cited above.

  15. Use of expert judgment elicitation to estimate seismic vulnerability of selected building types

    USGS Publications Warehouse

    Jaiswal, K.S.; Aspinall, W.; Perkins, D.; Wald, D.; Porter, K.A.

    2012-01-01

    Pooling engineering input on earthquake building vulnerability through an expert judgment elicitation process requires careful deliberation. This article provides an overview of expert judgment procedures including the Delphi approach and the Cooke performance-based method to estimate the seismic vulnerability of a building category.

  16. GIS-based seismic shaking slope vulnerability map of Sicily (Central Mediterranean)

    NASA Astrophysics Data System (ADS)

    Nigro, Fabrizio; Arisco, Giuseppe; Perricone, Marcella; Renda, Pietro; Favara, Rocco

    2010-05-01

    Earthquakes often represent very dangerous natural events in terms of human life and economic losses, and their damage effects are amplified by the synchronous occurrence of seismically induced ground-shaking failures over wide regions around the seismogenic source. The shaking associated with large earthquakes triggers extensive landsliding, sometimes at distances of more than 100 km from the epicenter. The active tectonics and the geomorphic/morphodynamic pattern of the regions affected by earthquakes contribute to the instability tendency of slopes: earthquake-induced ground-motion loading activates inertial forces within slopes that, combined with the pre-existing static forces, reduce slope stability towards failure. Basically, under zero-shear-stress-reversal conditions, a catastrophic failure will take place if the earthquake-induced shear displacement reduces the undrained shear strength to a value equal to the gravitational shear stress. However, seismic stability analyses carried out for various infinite slopes using existing Newmark-like methods reveal that estimated permanent displacements smaller than the critical value should also be regarded as dangerous for post-earthquake slope safety in terms of human use. Earthquake-induced (often high-speed) landslides are among the most destructive phenomena related to slope failure during earthquakes; damage from earthquake-induced landslides (and other ground failures) sometimes exceeds the building/infrastructure damage directly related to ground shaking and fault rupture. For this reason, several methods for analyzing earthquake-related slope failures have been developed to evaluate the combined hazard represented by seismically induced landslides. The engineering analysis of seismic risk related to slope instability processes is often achieved through the evaluation of the
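
    Newmark-like methods estimate permanent displacement by integrating relative velocity whenever ground acceleration exceeds the slope's yield acceleration. A minimal rigid-block sketch follows, with a synthetic acceleration record and an illustrative yield value rather than data from the Sicily study.

    ```python
    # Sketch: a Newmark rigid-block estimate of permanent slope displacement.
    # Relative velocity accumulates only while the block slides; the record
    # is synthetic and ky (yield acceleration) is illustrative.
    import numpy as np

    dt = 0.01                                  # time step (s)
    t = np.arange(0, 20, dt)
    rng = np.random.default_rng(5)
    acc = 3.0 * rng.normal(size=t.size) * np.exp(-((t - 8) / 4) ** 2)  # m/s^2
    ky = 0.8                                   # yield acceleration (m/s^2)

    vel, disp = 0.0, 0.0
    for a in acc:
        if vel > 0 or a > ky:                  # block sliding or starts to slide
            vel = max(vel + (a - ky) * dt, 0)  # relative velocity >= 0
            disp += vel * dt
    print(f"Newmark displacement = {disp * 100:.1f} cm")
    ```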

  17. Metadata for selecting or submitting generic seismic vulnerability functions via GEM's vulnerability database

    USGS Publications Warehouse

    Jaiswal, Kishor

    2013-01-01

    This memo lays out a procedure for the GEM software to offer an available vulnerability function for any acceptable set of attributes that the user specifies for a particular building category. The memo also provides general guidelines on how to submit the vulnerability or fragility functions to the GEM vulnerability repository, stipulating which attributes modelers must provide so that their vulnerability or fragility functions can be queried appropriately by the vulnerability database. An important objective is to provide users guidance on limitations and applicability by providing the associated modeling assumptions and applicability of each vulnerability or fragility function.

  18. Physical Seismic Vulnerability Assessment of Tehran Using the Integration of Granular Computing and Interval - Shafer

    NASA Astrophysics Data System (ADS)

    Delavar, M. R.; Bahrami, M.; Zare, M.

    2017-09-01

    Several faults, such as the North Tehran, Ray, Mosha, and Kahrizak faults, exist in the vicinity of Tehran, the capital of Iran. One way to help reduce earthquake damage is to produce a seismic vulnerability map. The study area in this research is Tehran, under the assumption that the North Tehran fault is activated. The degree of physical seismic vulnerability caused by an earthquake depends on a number of criteria; in this study, earthquake intensity, land slope, the number of building floors, and the building materials are considered the effective parameters. Hence, producing the seismic vulnerability map is a multi-criteria problem in which the main source of uncertainty is the experts' opinions regarding the seismic vulnerability of Tehran's statistical units. The main objectives of this study are to exploit the experts' opinions, to apply interval computation and the interval Dempster-Shafer combination rule to reduce the uncertainty in those opinions, and to customize granular computing to extract rules and produce a physical seismic vulnerability map of Tehran with higher confidence. Among the 3,174 statistical units of Tehran, 150 were randomly selected, and their physical vulnerabilities were determined by experts in earthquake-related fields using interval computation. After fusing the experts' opinions using the interval Dempster-Shafer rule, the information table was prepared as input to granular computing, and rules were extracted with minimum inconsistency. Finally, the physical seismic vulnerability map of Tehran was produced with 72% accuracy.
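
    The fusion step rests on Dempster's rule of combination. The sketch below shows the ordinary point-valued rule for two experts over three vulnerability degrees; the interval-valued variant used in the paper generalizes these products and the normalization. The mass assignments are invented.

    ```python
    # Sketch: Dempster's rule of combination for two experts' mass functions
    # over vulnerability degrees {L, M, H}. Masses are illustrative (scalar;
    # the paper's interval-valued variant generalizes this step).
    from itertools import product

    frame = frozenset({"L", "M", "H"})
    m1 = {frozenset({"H"}): 0.6, frozenset({"M", "H"}): 0.3, frame: 0.1}
    m2 = {frozenset({"M", "H"}): 0.5, frozenset({"M"}): 0.3, frame: 0.2}

    combined, conflict = {}, 0.0
    for (a, pa), (b, pb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + pa * pb
        else:
            conflict += pa * pb                 # mass assigned to empty set

    combined = {k: v / (1 - conflict) for k, v in combined.items()}
    for focal, mass in sorted(combined.items(), key=lambda kv: -kv[1]):
        print(set(focal), round(mass, 3))
    ```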

  19. A S.M.A.R.T. system for the seismic vulnerability mitigation of Cultural Heritages

    NASA Astrophysics Data System (ADS)

    Montuori, Antonio; Costanzo, Antonio; Gaudiosi, Iolanda; Vecchio, Antonio; Minasi, Mario; Falcone, Sergio; La Piana, Carmelo; Stramondo, Salvatore; Casula, Giuseppe; Giovanna Bianchi, Maria; Fabrizia Buongiorno, Maria; Musacchio, Massimo; Doumaz, Fawzi; Ilaria Pannaccione Apa, Maria

    2016-04-01

    Both the assessment and the mitigation of seismic vulnerability connected to cultural heritage monitoring are non-trivial issues, based on knowledge of the structural and environmental factors potentially impacting the cultural heritage. A holistic approach could be suitable to provide effective monitoring of cultural heritage within its surroundings at different spatial and temporal scales. On the one hand, the analysis of the geometrical and structural properties of monuments is important to assess their state of conservation, their response to external stresses, as well as anomalies related to natural and/or anthropogenic phenomena (e.g. the aging of materials, seismic stresses, vibrational modes). On the other hand, the investigation of the surrounding area is relevant to assess environmental properties and natural phenomena (e.g. landslides, earthquakes, subsidence, seismic response) as well as their related impacts on the monuments. Within such a framework, a multi-disciplinary system has been developed, and is presented here, for the monitoring of cultural heritage for seismic vulnerability assessment and mitigation purposes. It merges geophysical investigations and modeling, in situ measurements and multi-platform remote sensing sensors for the non-destructive and non-invasive multi-scale monitoring of historic buildings in a seismic-prone area. In detail, the system provides: a) the long-term and regional-scale analysis of the buildings' environment through the integration of seismogenic analysis, airborne magnetic surveys, space-borne Synthetic Aperture Radar (SAR) and multi-spectral sensors. These allow describing the sub-surface fault systems, the surface deformation processes and the land use mapping of the regional-scale area over an annual temporal span; b) the short-term and basin-scale analysis of the building's neighborhood through geological setting and geotechnical surveys, airborne Light Detection And Ranging (LiDAR) and ground-based SAR sensors. They

  20. Seismic vulnerability assessment of school buildings in Tehran city based on AHP and GIS

    NASA Astrophysics Data System (ADS)

    Panahi, M.; Rezaie, F.; Meshkani, S. A.

    2014-04-01

    The objective of the current study is to evaluate the seismic vulnerability of school buildings in Tehran city based on the analytic hierarchy process (AHP) and geographical information system (GIS). To this end, the peak ground acceleration, slope, and soil liquefaction layers were utilized for developing a geotechnical map. Also, the construction materials of structures, age of construction, the quality, and the seismic resonance coefficient layers were defined as major factors affecting the structural vulnerability of school buildings. Then, the AHP method was applied to assess the priority rank and weight of criteria (layers) and alternatives (classes) of each criterion via pairwise comparison in all levels. Finally, the geotechnical and structural spatial layers were overlaid to develop the seismic vulnerability map of school buildings in Tehran. The results indicated that only in 72 (about 3%) out of 2125 school buildings of the study area will the destruction rate be very high and therefore their reconstruction should seriously be considered.
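
    AHP derives the criterion weights from a pairwise comparison matrix, typically as its principal eigenvector, and screens the judgments with a consistency ratio. A minimal sketch follows; the 3x3 comparison matrix is invented for illustration, not taken from the study.

    ```python
    import numpy as np

    # Saaty's random-index values for the consistency ratio, by matrix size
    RANDOM_INDEX = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}

    def ahp_weights(pairwise):
        """Principal-eigenvector weights and consistency ratio of an AHP
        pairwise comparison matrix."""
        vals, vecs = np.linalg.eig(pairwise)
        k = np.argmax(vals.real)
        w = np.abs(vecs[:, k].real)
        w /= w.sum()
        n = pairwise.shape[0]
        ci = (vals[k].real - n) / (n - 1)  # consistency index
        ri = RANDOM_INDEX[n]
        cr = ci / ri if ri else float("nan")  # CR < 0.1 is conventionally acceptable
        return w, cr

    # Hypothetical comparisons among three layers: PGA, slope, liquefaction
    A = np.array([[1.0, 3.0, 5.0],
                  [1.0 / 3.0, 1.0, 2.0],
                  [1.0 / 5.0, 1.0 / 2.0, 1.0]])
    weights, cr = ahp_weights(A)
    print(weights, f"CR = {cr:.3f}")
    ```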

  1. Seismic vulnerability assessment of school buildings in Tehran city based on AHP and GIS

    NASA Astrophysics Data System (ADS)

    Panahi, M.; Rezaie, F.; Meshkani, S. A.

    2013-09-01

    The objective of the study was to evaluate the seismic vulnerability of school buildings in Tehran city based on the analytic hierarchy process (AHP) and geographical information systems (GIS). To this end, the peak ground acceleration, slope and soil liquefaction layers were used to prepare the geotechnical map. Also, the construction materials of structures, the year of construction, their quality and the seismic resonance coefficient layers were defined as the major factors affecting the structural vulnerability of schools. Then, the AHP method was applied to assess the priority rank and weight of criteria (layers) and alternatives (classes) of each criterion through pairwise comparison at all levels. Finally, the geotechnical and structural spatial layers were overlaid to prepare the seismic vulnerability map of school buildings in Tehran city. The results indicated that in only 72 schools (about 3%) out of 2125 in the study area is the destruction rate very high, and therefore their reconstruction should be considered.

  2. Survey Methods for Seismic Vulnerability Assessment of Historical Masonry Buildings

    NASA Astrophysics Data System (ADS)

    Ballarin, M.; Balletti, C.; Faccio, P.; Guerra, F.; Saetta, A.; Vernier, P.

    2017-05-01

    On the 20th and 29th of May 2012, two powerful earthquakes struck northern Italy. The epicentres were recorded respectively in Finale Emilia (magnitude 5.9 Ml) and Medolla (magnitude 5.8 Ml) in the province of Modena, though the event consisted of a series of seismic shocks located in the district of the Emilian Po Valley, mainly in the provinces of Modena, Ferrara, Mantova, Reggio Emilia, Bologna and Rovigo. Many monuments in the city of Mantova were hit by the earthquake and, among these, Palazzo Ducale with the well-known Castello di San Giorgio, which hosts the noteworthy "Camera degli Sposi". This building, the most famous of the city, was so damaged that it was closed for more than one year after the earthquake. The emblem of the Palace and of Mantova itself, the previously cited "Camera degli Sposi" realized by Andrea Mantegna, was damaged, and all the economic and social life of the city was deeply affected. Immediately after the earthquake, the Soprintendenza per i Beni Architettonici e Paesaggistici of Brescia, Cremona and Mantova established an agreement with the University Iuav of Venice, requiring an analysis and assessment of the damage in order to proceed with the development of an intervention project. This activity turned out to be very important not only from the point of view of the recovery of the architectural and artistic heritage, but also because the city's economy is based primarily on tourism. The closure of one of the most important monuments of Mantova has led to a significant and alarming decline in income.

  3. SEISMIC ANALYSIS FOR PRECLOSURE SAFETY

    SciTech Connect

    E.N. Lindner

    2004-12-03

    The purpose of this seismic preclosure safety analysis is to identify the potential seismically-initiated event sequences associated with preclosure operations of the repository at Yucca Mountain and assign appropriate design bases to provide assurance of achieving the performance objectives specified in the Code of Federal Regulations (CFR) 10 CFR Part 63 for radiological consequences. This seismic preclosure safety analysis is performed in support of the License Application for the Yucca Mountain Project. In more detail, this analysis identifies the systems, structures, and components (SSCs) that are subject to seismic design bases. This analysis assigns one of two design basis ground motion (DBGM) levels, DBGM-1 or DBGM-2, to SSCs important to safety (ITS) that are credited in the prevention or mitigation of seismically-initiated event sequences. An application of the seismic margins approach is also demonstrated for SSCs assigned to DBGM-2 by showing a high confidence of a low probability of failure at a higher ground acceleration value, termed the beyond-design-basis ground motion (BDBGM) level. The objective of this analysis is to meet the performance requirements of 10 CFR 63.111(a) and 10 CFR 63.111(b) for offsite and worker doses. The results of this calculation are used as inputs to the following: (1) a classification analysis of SSCs ITS by identifying potential seismically-initiated failures (loss of safety function) that could lead to undesired consequences; (2) an assignment of either DBGM-1 or DBGM-2 to each SSC ITS credited in the prevention or mitigation of a seismically-initiated event sequence; and (3) a nuclear safety design basis report that will state the seismic design requirements that are credited in this analysis. The present analysis reflects the design information available as of October 2004 and is considered preliminary. The evolving design of the repository will be re-evaluated periodically to ensure that seismic hazards are properly

  4. Seismic vulnerability: theory and application to Algerian buildings

    NASA Astrophysics Data System (ADS)

    Mebarki, Ahmed; Boukri, Mehdi; Laribi, Abderrahmane; Farsi, Mohammed; Belazougui, Mohamed; Kharchi, Fattoum

    2014-04-01

    results to the observed damages. For pre-earthquake analysis, the methodology widely used around the world relies on the prior calibration of the seismic response of the structures under given expected scenarios. As the structural response is governed by the constitutive materials and structural typology as well as by the seismic input and soil conditions, the damage prediction depends intimately on the accuracy of the so-called fragility curve and response spectrum established for each type of structure (RC framed structures, confined or unconfined masonry, etc.) and soil (hard rock, soft soil, etc.). In the present study, the adaptation to Algerian buildings concerns the specific soil conditions as well as the structural dynamic response. The theoretical prediction of the expected damages is helpful for the calibration of the methodology. Thousands (~3,700) of real structures and the damage caused by the earthquake (Algeria, Boumerdes: Mw = 6.8, May 21, 2003) are considered for the a posteriori calibration and validation process. The theoretical predictions show the importance of the elastic response spectrum, the local soil conditions, and the structural typology. Although the observed and predicted categories of damage are close, it appears that the existing form used for visual damage inspection would still require further improvements, in order to allow easy evaluation and identification of the damage level. These methods, coupled with databases and GIS tools, could be helpful for the local and technical authorities during the post-earthquake evaluation process: real-time information on the damage extent at urban or regional scales as well as the extent of losses and the required resources for reconstruction, evacuation, strengthening, etc.
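
    Fragility curves of the kind calibrated here are commonly parameterized as lognormal CDFs giving the probability of reaching a damage state at a given intensity level. The sketch below uses invented median and dispersion values, not the paper's Algerian calibration.

    ```python
    from math import erf, log, sqrt

    def fragility(im, median, beta):
        """Lognormal fragility: P(damage state reached | IM = im), with median
        capacity `median` and logarithmic standard deviation `beta`."""
        return 0.5 * (1.0 + erf(log(im / median) / (beta * sqrt(2.0))))

    # Hypothetical curve for an unconfined-masonry class: median PGA 0.25 g, beta 0.6
    for pga in (0.10, 0.25, 0.50):
        print(f"PGA = {pga:.2f} g -> P(damage) = {fragility(pga, 0.25, 0.6):.2f}")
    ```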

  5. Vulnerability of populations and man-made facilities to seismic hazards

    NASA Astrophysics Data System (ADS)

    Badal, J.; Vazquez-Prada, M.; Gonzalez, A.; Chourak, M.; Samardzhieva, E.; Zhang, Z.

    2003-04-01

    Earthquakes become major societal risks when they impinge on vulnerable populations. According to the available worldwide data for the twentieth century (NEIC Catalog of Earthquakes 1980-1999), almost five hundred earthquakes resulted in more than 1,615,000 human victims. Besides human casualty levels, destructive earthquakes frequently inflict huge economic losses. An additional problem of a very different nature, but also worthy of consideration in a damage and loss analysis, is the direct cost associated with the damage caused by a strong seismic impact. We focus our attention on the rapid quantitative assessment of both aspects, with the aim of lessening the earthquake disaster in areas affected by relatively strong earthquakes. Our final goal is the knowledge of potential losses from earthquakes to support national programs in emergency management, thereby minimizing the loss of life due to earthquakes and aiding response and recovery tasks. For this purpose we follow a suitable and comprehensible methodology for risk-based loss analysis, and simulate the occurrence of a seismic event in densely populated areas of Spain.

  6. Seismic Analysis Capability in NASTRAN

    NASA Technical Reports Server (NTRS)

    Butler, T. G.; Strang, R. F.

    1984-01-01

    Seismic analysis is a technique which pertains to loading described in terms of boundary accelerations. Earthquake shocks to buildings are the type of excitation which usually comes to mind when one hears the word seismic, but the technique also applies to a broad class of acceleration excitations applied at the base of a structure, such as vibration shaker testing or shocks to machinery foundations. Four different solution paths are available in NASTRAN for seismic analysis: Direct Seismic Frequency Response, Direct Seismic Transient Response, Modal Seismic Frequency Response, and Modal Seismic Transient Response. This capability, at present, is invoked not as separate rigid formats, but as pre-packaged ALTER packets to existing Rigid Formats 8, 9, 11, and 12. These ALTER packets are included with the delivery of the NASTRAN program and are stored on the computer as a library of callable utilities. The user calls one of these utilities and merges it into the Executive Control Section of the data deck; any of the four options is invoked by setting parameter values in the bulk data.

  7. Review on Rapid Seismic Vulnerability Assessment for Bulk of Buildings

    NASA Astrophysics Data System (ADS)

    Nanda, R. P.; Majhi, D. R.

    2013-09-01

    This paper provides a brief overview of the rapid visual screening (RVS) procedures available in different countries, with a comparison among all the methods. Seismic evaluation guidelines from the USA, Canada, Japan, New Zealand, India, Europe, Italy and the UNDP, along with other methods, are reviewed from the perspective of their applicability to developing countries. The review shows clearly that some of the RVS procedures are unsuited for potential use in developing countries. It is expected that this comparative assessment of various evaluation schemes will help to identify the most essential components of such a procedure for use in India and other developing countries, one that is not only robust and reliable but also easy to use with available resources. It appears that the Federal Emergency Management Agency (FEMA) 154 and New Zealand Draft Code approaches can be suitably combined to develop a transparent, reasonably rigorous and generalized procedure for the seismic evaluation of buildings in developing countries.

  8. Generalized seismic analysis

    NASA Technical Reports Server (NTRS)

    Butler, Thomas G.

    1993-01-01

    There is a constant need to be able to solve for enforced motion of structures. Spacecraft need to be qualified for acceleration inputs. Truck cargoes need to be safeguarded from road mishaps. Office buildings need to withstand earthquake shocks. Marine machinery needs to be able to withstand hull shocks. All of these kinds of enforced motions are grouped together here under the heading of seismic inputs. Attempts to cope with this problem over the years have usually ended with some limiting or compromise conditions. The crudest approach was to limit the problem to acceleration occurring only at a base of a structure, constrained to be rigid. The analyst would assign arbitrarily outsized masses to base points and then calculate the magnitude of force to apply to the base mass (or masses) in order to produce the specified acceleration. He would of necessity sacrifice the determination of stresses in the vicinity of the base, because of the artificial nature of the input forces. The author followed the lead of John M. Biggs by using relative coordinates for a rigid base in a 1975 paper, and again in a 1981 paper. This method of relative coordinates was extended and made operational as DMAP ALTER packets to rigid formats 9, 10, 11, and 12 under contract N60921-82-C-0128, and was presented at the twelfth NASTRAN Colloquium. Another analyst in the field developed a method that computed the forces from enforced motion and then applied them as a forcing to the remaining unknowns after the knowns were partitioned off. That method was translated into DMAP ALTERs but was never made operational. All of this activity jelled into the current effort. Much thought was invested in working out ways to unshackle the analysis of enforced motions from the limitations that persisted.

  9. Generalized seismic analysis

    NASA Astrophysics Data System (ADS)

    Butler, Thomas G.

    1993-09-01

    There is a constant need to be able to solve for enforced motion of structures. Spacecraft need to be qualified for acceleration inputs. Truck cargoes need to be safeguarded from road mishaps. Office buildings need to withstand earthquake shocks. Marine machinery needs to be able to withstand hull shocks. All of these kinds of enforced motions are grouped together here under the heading of seismic inputs. Attempts to cope with this problem over the years have usually ended with some limiting or compromise conditions. The crudest approach was to limit the problem to acceleration occurring only at a base of a structure, constrained to be rigid. The analyst would assign arbitrarily outsized masses to base points and then calculate the magnitude of force to apply to the base mass (or masses) in order to produce the specified acceleration. He would of necessity sacrifice the determination of stresses in the vicinity of the base, because of the artificial nature of the input forces. The author followed the lead of John M. Biggs by using relative coordinates for a rigid base in a 1975 paper, and again in a 1981 paper. This method of relative coordinates was extended and made operational as DMAP ALTER packets to rigid formats 9, 10, 11, and 12 under contract N60921-82-C-0128, and was presented at the twelfth NASTRAN Colloquium. Another analyst in the field developed a method that computed the forces from enforced motion and then applied them as a forcing to the remaining unknowns after the knowns were partitioned off. That method was translated into DMAP ALTERs but was never made operational. All of this activity jelled into the current effort. Much thought was invested in working out ways to unshackle the analysis of enforced motions from the limitations that persisted.

  10. Constraints on Long-Term Seismic Hazard From Vulnerable Stalagmites

    NASA Astrophysics Data System (ADS)

    Gribovszki, Katalin; Bokelmann, Götz; Mónus, Péter; Kovács, Károly; Konecny, Pavel; Lednicka, Marketa; Bednárik, Martin; Brimich, Ladislav

    2015-04-01

    Earthquakes hit urban centers in Europe infrequently, but occasionally with disastrous effects. This raises an important issue for society: how to react to the natural hazard? Potential damages are huge, but infrastructure costs for addressing these hazards are huge as well. Furthermore, seismic hazard is only one of the many hazards facing society. Societal means need to be distributed in a reasonable manner, to assure that all of these hazards (natural as well as societal) are addressed appropriately. Obtaining an unbiased view of seismic hazard (and risk) is therefore very important. In principle, the best way to test PSHA models is to compare them with observations that are entirely independent of the procedure used to produce the PSHA models. Arguably, the most valuable information in this context is information on long-term hazard, namely maximum intensities (or magnitudes) occurring over time intervals that are at least as long as a seismic cycle - if that exists. Such information would be very valuable, even if it concerned only a single site, namely that of a particularly sensitive infrastructure. Such a request may seem hopeless - but it is not. Long-term information can in principle be gained from intact stalagmites in natural caves. These have survived all earthquakes that have occurred over thousands of years - depending on the age of the stalagmite. Their "survival" requires that the horizontal ground acceleration has never exceeded a certain critical value within that period. We are focusing here on case studies in Austria, which has moderate seismicity, but a well-documented history of major earthquake-induced damage, e.g., Villach in 1348 and 1690, Vienna in 1590, Leoben in 1794, and Innsbruck in 1551, 1572, and 1589. Seismic intensities have reached levels up to 10. It is clearly important to know which "worst-case" damages to expect. We have identified sets of particularly sensitive stalagmites in the general vicinity of two major cities in

  11. An Analysis of Botnet Vulnerabilities

    DTIC Science & Technology

    2007-06-01

    significant vulnerabilities were found. While this research does not eliminate the possibility that a critical vulnerability is present in the Unreal...are designed for malicious purposes. Bots are distinguished from other malicious code like viruses and worms by a communication channel linking...vulnerability poses a risk to users connecting to an Unreal IRCd server, but does not suggest possible ways to exploit the server. 2.5 Dynamic DNS A bot

  12. Comparative Application of Capacity Models for Seismic Vulnerability Evaluation of Existing RC Structures

    SciTech Connect

    Faella, C.; Lima, C.; Martinelli, E.; Nigro, E.

    2008-07-08

    Seismic vulnerability assessment of existing buildings is one of the most common tasks in which structural engineers are currently engaged. Since it is often a preliminary step in approaching the issue of how to retrofit structures that were not seismically designed and detailed, it plays a key role in the successful choice of the most suitable strengthening technique. In this framework, the basic information for both seismic assessment and retrofitting is related to the formulation of capacity models for structural members. Plenty of proposals, often contradictory from a quantitative standpoint, are currently available in the technical and scientific literature for defining structural capacity in terms of forces and displacements, possibly with reference to different parameters representing the seismic response. The present paper briefly reviews some of the models for the capacity of RC members and compares them with reference to two case studies assumed to be representative of a wide class of existing buildings.

  13. Interactive Vulnerability Analysis Enhancement Results

    DTIC Science & Technology

    2012-12-01

    ...application. Here is a screenshot of IAST finding a Cross-Site Scripting (XSS) vulnerability in a "Hello World" Scala application.

  14. Using Probabilistic Seismic Hazard Analysis in Assessing Seismic Risk for Taipei City and New Taipei City

    NASA Astrophysics Data System (ADS)

    Hsu, Ming-Kai; Wang, Yu-Ju; Cheng, Chin-Tung; Ma, Kuo-Fong; Ke, Siao-Syun

    2016-04-01

    In this study, we evaluate the seismic hazard and risk for Taipei City and New Taipei City, which are important municipalities and the most populous cities in Taiwan. The evaluation of seismic risk involves the combination of three main components: a probabilistic seismic hazard model, an exposure model defining the spatial distribution of elements exposed to the hazard, and vulnerability functions capable of describing the distribution of percentage of loss for a set of intensity measure levels. The seismic hazard for Taipei City and New Taipei City is presented as hazard maps in terms of ground-motion values expected to be exceeded at a 10% probability level in 50 years (return period 475 years) and a 2% probability level in 50 years (return period 2475 years) according to the Taiwan Earthquake Model (TEM), which assesses two seismic hazard models for Taiwan. The first model adopted the source parameters of 38 seismogenic structures identified by the TEM geologists. The other model considered 33 active faults and was published by the Central Geological Survey (CGS), Taiwan, in 2010. Grid-based building data at 500 m by 500 m resolution were selected for the evaluation, capable of providing detailed information about the location, value and vulnerability classification of the exposed elements. The results of this study were evaluated with the OpenQuake engine, the open-source software for seismic risk and hazard assessment developed within the Global Earthquake Model (GEM) initiative. Our intention is to make a first attempt at modeling seismic risk from hazard on an open platform for Taiwan. An analysis through disaggregation of hazard components will also be made to prioritize the risk for further policy making.
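
    The quoted return periods follow from the usual Poisson relation between an exceedance probability p over an exposure time t and the mean return period, T = -t / ln(1 - p). A quick check:

    ```python
    from math import log

    def return_period(p, t_years):
        """Mean return period under a Poisson occurrence model for an
        exceedance probability p within an exposure time of t_years."""
        return -t_years / log(1.0 - p)

    print(round(return_period(0.10, 50)))  # ~475 years
    print(round(return_period(0.02, 50)))  # ~2475 years
    ```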

  15. Uncertainty Management in Seismic Vulnerability Assessment Using Granular Computing Based on Covering of Universe

    NASA Astrophysics Data System (ADS)

    Khamespanah, F.; Delavar, M. R.; Zare, M.

    2013-05-01

    Earthquake is an abrupt displacement of the earth's crust caused by the discharge of strain collected along faults or by volcanic eruptions. As a recurring natural cataclysm, earthquake has always been a matter of concern in Tehran, capital of Iran, a city lying on a number of known and unknown faults. Earthquakes can cause severe physical, psychological and financial damages. Consequently, procedures should be developed to assist in modelling the potential casualties and their spatial uncertainty. One of these procedures is the production of seismic vulnerability maps to support preventive measures that mitigate corporeal and financial losses in future earthquakes. Since vulnerability assessment is a multi-criteria decision making problem depending on several parameters and experts' judgments, it is undoubtedly characterized by intrinsic uncertainties. In this study, we attempt to use a granular computing (GrC) model based on a covering of the universe to handle the spatial uncertainty. The granular computing model concentrates on a general theory and methodology for problem solving as well as information processing by assuming multiple levels of granularity. Basic elements in granular computing are subsets, classes, and clusters of a universe, called elements. In this research GrC is used for extracting classification rules of seismic vulnerability with minimum entropy to handle the uncertainty related to earthquake data. Tehran was selected as the study area. In our previous research, a granular computing model based on a partition model of the universe was employed. That model has some limitations in defining similarity between elements of the universe and in defining granules. In the partition model, similarity between elements is defined based on an equivalence relation: two objects are similar based on some attributes, provided that for each attribute the values of these objects are equal. In this research a general relation for defining similarity between
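
    Under a partition model, the granules are simply the equivalence classes of the indiscernibility relation: two objects fall in the same granule exactly when they agree on every attribute considered. A minimal sketch with invented attributes:

    ```python
    from collections import defaultdict

    def granulate(records, attributes):
        """Partition records into granules (equivalence classes) of the
        indiscernibility relation: equal values on all given attributes."""
        granules = defaultdict(list)
        for rec in records:
            key = tuple(rec[a] for a in attributes)
            granules[key].append(rec)
        return dict(granules)

    units = [  # hypothetical statistical units
        {"id": 1, "floors": "1-3", "material": "masonry", "slope": "low"},
        {"id": 2, "floors": "1-3", "material": "masonry", "slope": "low"},
        {"id": 3, "floors": "4+", "material": "RC", "slope": "high"},
    ]
    for key, members in granulate(units, ["floors", "material", "slope"]).items():
        print(key, [u["id"] for u in members])
    ```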

  16. Application of PRA to HEMP vulnerability analysis

    SciTech Connect

    Mensing, R.W.

    1985-09-01

    Vulnerability analyses of large systems, e.g., control and communication centers, aircraft, ships, are subject to many uncertainties. A basic source of uncertainty is the random variation inherent in the physical world. Thus, vulnerability is appropriately described by an estimate of the probability of survival (or failure). A second source of uncertainty that also needs to be recognized is the uncertainty associated with the analysis or estimation process itself. This uncertainty, often called modeling uncertainty, has many contributors. There are the approximations introduced by using mathematical models to describe reality. Also, the appropriate values of the model parameters are derived from several sources, e.g., based on experimental or test data, based on expert judgment and opinion. In any case, these values are subject to uncertainty. This uncertainty must be considered in the description of vulnerability. Thus, the estimate of the probability of survival is not a single value but a range of values. Probabilistic risk analysis (PRA) is a methodology which deals with these uncertainty issues. This report discusses the application of PRA to HEMP vulnerability analyses. Vulnerability analysis and PRA are briefly outlined and the need to distinguish between random variation and modeling uncertainty is discussed. Then a sequence of steps appropriate for applying PRA to vulnerability problems is outlined. Finally, methods for handling modeling uncertainty are identified and discussed.
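
    The distinction drawn here between inherent random variation and modeling uncertainty is often operationalized with a two-loop Monte Carlo: the outer loop samples the uncertain model parameters, the inner loop samples the random variation, and the output is a range of survival probabilities rather than a single value. A schematic sketch with invented distributions:

    ```python
    import random

    def survival_probability_range(n_outer=200, n_inner=2000, seed=1):
        """Two-loop Monte Carlo: the outer loop samples modeling (epistemic)
        uncertainty, the inner loop samples random (aleatory) variation,
        yielding a spread of survival-probability estimates."""
        rng = random.Random(seed)
        estimates = []
        for _ in range(n_outer):
            threshold = rng.gauss(10.0, 1.5)  # uncertain failure threshold (model)
            survived = sum(rng.gauss(7.0, 2.0) < threshold for _ in range(n_inner))
            estimates.append(survived / n_inner)
        estimates.sort()
        return estimates[int(0.05 * n_outer)], estimates[int(0.95 * n_outer)]

    lo, hi = survival_probability_range()
    print(f"P(survival), 5th-95th percentile: {lo:.3f} - {hi:.3f}")
    ```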

  17. Seismic vulnerability and damage of Italian historical centres: A case study in the Campania region

    NASA Astrophysics Data System (ADS)

    Formisano, Antonio; Chieffo, Nicola; Fabbrocino, Francesco; Landolfo, Raffaele

    2017-07-01

    The preservation of the masonry buildings typical of Italian historical centres represents a very pressing issue, founded on the need to recover the original character of the urban fabric. In the paper, based on a methodology developed by some of the Authors for building aggregates, the seismic vulnerability estimation of some masonry compounds in the heart of the town of San Potito Sannitico (Caserta, Italy) is presented and compared to the results achieved from applying the basic literature method for isolated constructions. Finally, the damage scenario of the inspected buildings is shown, clearly highlighting the influence of the different positions of structural units on the damage that masonry aggregates suffer under earthquakes of different grades, making it possible to identify the most vulnerable buildings.
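
    Studies in this line often quantify the expected damage with the macroseismic approach, in which a mean damage grade follows from a vulnerability index V and the macroseismic intensity I; whether this exact form was used here is not stated, so the sketch below (a Lagomarsino-Giovinazzi-type formula with an invented V) is purely illustrative.

    ```python
    from math import tanh

    def mean_damage_grade(intensity, v_index, ductility=2.3):
        """Macroseismic mean damage grade (0-5) as a function of macroseismic
        intensity I and vulnerability index V."""
        return 2.5 * (1.0 + tanh((intensity + 6.25 * v_index - 13.1) / ductility))

    # Hypothetical masonry aggregate with V = 0.74, intensities VIII to X
    for intensity in (8, 9, 10):
        print(f"I = {intensity}: mean damage grade = {mean_damage_grade(intensity, 0.74):.2f}")
    ```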

  18. Measuring Road Network Vulnerability with Sensitivity Analysis

    PubMed Central

    Jun-qiang, Leng; Long-hai, Yang; Liu, Wei-yi; Zhao, Lin

    2017-01-01

    This paper focuses on the development of a method for road network vulnerability analysis, from the perspective of capacity degradation, which seeks to identify the critical infrastructures in the road network and the operational performance of the whole traffic system. This research involves defining the traffic utility index and modeling the vulnerability of road segments, routes, OD (Origin-Destination) pairs and the road network. Meanwhile, a sensitivity analysis method is utilized to calculate the change of the traffic utility index due to capacity degradation. This method, compared to traditional traffic assignment, can improve calculation efficiency and makes the application of vulnerability analysis to large actual road networks possible. Finally, all the above models and the calculation method are applied to an actual road network evaluation to verify their efficiency and utility. This approach can be used as a decision-supporting tool for evaluating the performance of road networks and identifying critical infrastructures in transportation planning and management, especially in resource allocation for mitigation and recovery. PMID:28125706
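
    Capacity degradation raises a link's travel time, which is what a utility-based vulnerability index ultimately measures. A common way to express this is the BPR (Bureau of Public Roads) link-performance function; the sketch below compares travel times before and after a hypothetical capacity loss, with all parameter values invented.

    ```python
    def bpr_travel_time(t0, volume, capacity, alpha=0.15, beta=4.0):
        """BPR link performance: travel time grows with the volume/capacity ratio."""
        return t0 * (1.0 + alpha * (volume / capacity) ** beta)

    t_before = bpr_travel_time(t0=10.0, volume=900.0, capacity=1000.0)
    t_after = bpr_travel_time(t0=10.0, volume=900.0, capacity=600.0)  # 40% capacity lost
    print(f"extra travel time on the degraded link: {t_after - t_before:.2f} min")
    ```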

  19. Uncertainty analysis in seismic tomography

    NASA Astrophysics Data System (ADS)

    Owoc, Bartosz; Majdański, Mariusz

    2017-04-01

    The velocity field obtained from seismic travel-time tomography depends on several factors like regularization, inversion path, model parameterization, etc. The result also strongly depends on the initial velocity model and on the precision of travel-time picking. In this research we test the dependence on the starting model in layered tomography and compare it with the effect of picking precision. Moreover, in our analysis the uncertainty distribution for manual travel-time picking is asymmetric. This effect shifts the results toward faster velocities. For the calculations we are using the JIVE3D travel-time tomographic code. We used data from geo-engineering and industrial-scale investigations, which were collected by our team from IG PAS.

  20. Vulnerability

    NASA Technical Reports Server (NTRS)

    Taback, I.

    1979-01-01

    The discussion of vulnerability begins with a description of some of the electrical characteristics of fibers before defining how vulnerability calculations are done. The vulnerability results secured to date are presented. The discussion touches on post-exposure vulnerability. After a description of some shock hazard work now underway, the discussion leads into a description of the planned effort, and some preliminary conclusions are presented.

  1. Information systems vulnerability: A systems analysis perspective

    SciTech Connect

    Wyss, G.D.; Daniel, S.L.; Schriner, H.K.; Gaylor, T.R.

    1996-07-01

    Vulnerability analyses for information systems are complicated because the systems are often geographically distributed. Sandia National Laboratories has assembled an interdisciplinary team to explore the applicability of probabilistic logic modeling (PLM) techniques (including vulnerability and vital area analysis) to examine the risks associated with networked information systems. The authors have found that the reliability and failure modes of many network technologies can be effectively assessed using fault trees and other PLM methods. The results of these models are compatible with an expanded set of vital area analysis techniques that can model both physical locations and virtual (logical) locations to identify both categories of vital areas simultaneously. These results can also be used with optimization techniques to direct the analyst toward the most cost-effective security solution.
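
    Fault trees of the kind used in these PLM assessments compute a top-event probability from basic-event probabilities through AND/OR gate logic, assuming independent events. A small sketch with an invented outage tree:

    ```python
    from functools import reduce

    def and_gate(probs):
        """All inputs must fail: product of the probabilities (independence assumed)."""
        return reduce(lambda acc, p: acc * p, probs, 1.0)

    def or_gate(probs):
        """Any single failing input suffices: complement of everything surviving."""
        return 1.0 - reduce(lambda acc, p: acc * (1.0 - p), probs, 1.0)

    # Hypothetical tree: outage = (router fails) OR (both redundant links fail)
    p_top = or_gate([0.01, and_gate([0.05, 0.05])])
    print(f"P(outage) = {p_top:.4f}")
    ```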

  2. Seismic vulnerability of leaning masonry towers located in Emilia-Romagna region, Italy: FE analyses of four case studies

    NASA Astrophysics Data System (ADS)

    Milani, Gabriele; Shehu, Rafael; Valente, Marco

    2016-12-01

    Four inclined masonry towers are investigated in this paper in order to study their behavior under horizontal loads and the role of inclination in their seismic vulnerability. The towers are located in the North-East of Italy, a moderate-seismicity region that was recently struck by an earthquake with two major seismic events of magnitude 5.8 and 5.9. These towers date back four to nine centuries and are well representative of the towers of the region. They present a significant inclination accumulated over the years, which has a considerable influence on the bearing capacity under lateral loads. Some retrofitting interventions were recently carried out by introducing tendons and hooping systems in order to ensure box behavior and preclude the spreading of dangerous cracks due to the insufficient tensile strength of the masonry material. The structural behavior of the towers under horizontal loads is influenced by several geometrical issues, such as slenderness, wall thickness, perforations and irregularities, but the main aim of the paper is to provide insight into the role played by inclination. The case studies are chosen to exhibit different values of slenderness in order to cover a large range of geometrical cases for the assessment of the seismic vulnerability of the towers. Numerical analyses are carried out considering the effects of the retrofitting interventions as well. As expected, pushover analyses show that inclination may increase the seismic vulnerability of the masonry towers, comparing the results obtained for the inclined real case and the hypothetical vertical case.

  3. Uncertainty Analysis and Expert Judgment in Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Klügel, Jens-Uwe

    2011-01-01

    The large uncertainty associated with the prediction of future earthquakes is usually regarded as the main reason for increased hazard estimates which have resulted from some recent large scale probabilistic seismic hazard analysis studies (e.g. the PEGASOS study in Switzerland and the Yucca Mountain study in the USA). It is frequently overlooked that such increased hazard estimates are characteristic for a single specific method of probabilistic seismic hazard analysis (PSHA): the traditional (Cornell-McGuire) PSHA method which has found its highest level of sophistication in the SSHAC probability method. Based on a review of the SSHAC probability model and its application in the PEGASOS project, it is shown that the surprising results of recent PSHA studies can be explained to a large extent by the uncertainty model used in traditional PSHA, which deviates from the state of the art in mathematics and risk analysis. This uncertainty model, the Ang-Tang uncertainty model, mixes concepts of decision theory with probabilistic hazard assessment methods leading to an overestimation of uncertainty in comparison to empirical evidence. Although expert knowledge can be a valuable source of scientific information, its incorporation into the SSHAC probability method does not resolve the issue of inflating uncertainties in PSHA results. Other, more data driven, PSHA approaches in use in some European countries are less vulnerable to this effect. The most valuable alternative to traditional PSHA is the direct probabilistic scenario-based approach, which is closely linked with emerging neo-deterministic methods based on waveform modelling.

  4. Seismic analysis of the LSST telescope

    NASA Astrophysics Data System (ADS)

    Neill, Douglas R.

    2012-09-01

    The Large Synoptic Survey Telescope (LSST) will be located on the seismically active Chilean mountain of Cerro Pachón. The accelerations resulting from seismic events produce the most demanding load cases the telescope and its components must withstand. Seismic ground accelerations were applied to a comprehensive finite element analysis (FEA) model which included the telescope, its pier and the mountain top. Response accelerations for specific critical components (camera and secondary mirror assembly) on the telescope were determined by applying seismic accelerations in the form of Power Spectral Densities (PSD) to the FEA model. The PSDs were chosen based on the components' design lives. Survival-level accelerations were determined utilizing PSDs for seismic events with return periods of 10 times the telescope's design life, which is equivalent to a 10% chance of occurrence over the lifetime. Since the telescope has a design life of 30 years, it was analyzed for a return period of 300 years. Operational-level seismic accelerations were determined using return periods of 5 times the lifetime. Since the Chilean design codes specify seismic accelerations in the form of Peak Spectral Accelerations (PSA), a method to convert between the two forms was developed. The accelerations are also affected by the damping level. The LSST incorporates added damping to meet its rapid slew-and-settle requirements, and this added damping also reduces the components' seismic accelerations. The analysis was repeated for the telescope pointing at the horizon and at the zenith. Closed-form solutions were utilized to verify the results.
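
    One common bridge between a base-input acceleration PSD and a single response number is Miles' equation, which approximates the RMS acceleration of a lightly damped single-mode system excited by a flat PSD near its natural frequency. Whether LSST used this particular approximation is not stated in the abstract, so the sketch below is illustrative only, with invented component values.

    ```python
    from math import pi, sqrt

    def miles_grms(f_n, q, psd_at_fn):
        """Miles' equation: RMS acceleration response of a single-degree-of-freedom
        system with natural frequency f_n [Hz], amplification Q, and base-input
        acceleration PSD level psd_at_fn [g^2/Hz] near f_n."""
        return sqrt((pi / 2.0) * f_n * q * psd_at_fn)

    # Hypothetical component: 8 Hz mode, Q = 10, input PSD 0.01 g^2/Hz at 8 Hz
    g_rms = miles_grms(8.0, 10.0, 0.01)
    print(f"1-sigma response: {g_rms:.2f} g, 3-sigma design level: {3.0 * g_rms:.2f} g")
    ```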

  5. Aircraft vulnerability analysis by modeling and simulation

    NASA Astrophysics Data System (ADS)

    Willers, Cornelius J.; Willers, Maria S.; de Waal, Alta

    2014-10-01

    Infrared missiles pose a significant threat to civilian and military aviation. ManPADS missiles are especially dangerous in the hands of rogue and undisciplined forces. Yet, not all the launched missiles hit their targets; the miss being either attributable to misuse of the weapon or to missile performance restrictions. This paper analyses some of the factors affecting aircraft vulnerability and demonstrates a structured analysis of the risk and aircraft vulnerability problem. The aircraft-missile engagement is a complex series of events, many of which are only partially understood. Aircraft and missile designers focus on the optimal design and performance of their respective systems, often testing only in a limited set of scenarios. Most missiles react to the contrast intensity, but the variability of the background is rarely considered. Finally, the vulnerability of the aircraft depends jointly on the missile's performance and the doctrine governing the missile's launch. These factors are considered in a holistic investigation. The view direction, altitude, time of day, sun position, latitude/longitude and terrain determine the background against which the aircraft is observed. Especially high gradients in sky radiance occur around the sun and on the horizon. This paper considers uncluttered background scenes (uniform terrain and clear sky) and presents examples of background radiance at all view angles across a sphere around the sensor. A detailed geometrical and spatially distributed radiometric model is used to model the aircraft. This model provides the signature at all possible view angles across the sphere around the aircraft. The signature is determined in absolute terms (no background) and in contrast terms (with background). It is shown that the background significantly affects the contrast signature as observed by the missile sensor. A simplified missile model is constructed by defining the thrust and mass profiles, maximum seeker tracking rate, maximum

  6. Seismic analysis of nuclear power plant structures

    NASA Technical Reports Server (NTRS)

    Go, J. C.

    1973-01-01

    Primary structures for nuclear power plants are designed to resist the earthquakes expected at the site. Two intensities are referred to as the Operating Basis Earthquake and the Design Basis Earthquake. These structures are required to accommodate these seismic loadings without loss of their functional integrity; thus, no plastic yield is allowed. The application of NASTRAN in analyzing some of these seismically induced structural dynamics problems is described. NASTRAN, with some modifications, can be used to analyze most structures that are subjected to seismic loads. A brief review of the formulation of seismically induced structural dynamics is also presented. Two typical structural problems were selected to illustrate the application of the various methods of seismic structural analysis by the NASTRAN system.

  7. Vulnerability assessment using two complementary analysis tools

    SciTech Connect

    Paulus, W.K.

    1993-07-01

    To analyze the vulnerability of nuclear materials to theft or sabotage, Department of Energy facilities have been using, since 1989, a computer program called ASSESS, Analytic System and Software for Evaluation of Safeguards and Security. During the past year Sandia National Laboratories has begun using an additional program, SEES, Security Exercise Evaluation Simulation, enhancing the picture of vulnerability beyond what either program achieves alone. ASSESS analyzes all possible paths of attack on a target and, assuming that an attack occurs, ranks them by the probability that a response force of adequate size can interrupt the attack before theft or sabotage is accomplished. A Neutralization module pits, collectively, a security force against the interrupted adversary force in a fire fight and calculates the probability that the adversaries are defeated. SEES examines a single scenario and simulates in detail the interactions among all combatants. Its output includes shots fired between shooter and target, and the hits and kills. Whereas ASSESS gives breadth of analysis, expressed statistically and performed relatively quickly, SEES adds depth of detail, modeling tactical behavior. ASSESS finds scenarios that exploit the greatest weakness of a facility. SEES explores these scenarios to demonstrate in detail how various tactics to nullify the attack might work out. Without ASSESS to find the facility weakness, it is difficult to focus SEES objectively on scenarios worth analyzing. Without SEES to simulate the details of response vs. adversary interaction, it is not possible to test tactical assumptions and hypotheses. Using both programs together, vulnerability analyses achieve both breadth and depth.

  8. Vulnerability assessment using two complementary analysis tools

    SciTech Connect

    Paulus, W.K.

    1993-07-01

    To analyze the vulnerability of nuclear materials to theft or sabotage, Department of Energy facilities have been using, since 1989, a computer program called ASSESS, Analytic System and Software for Evaluation of Safeguards and Security. During the past year Sandia National Laboratories has begun using an additional program, SEES, Security Exercise Evaluation Simulation, enhancing the picture of vulnerability beyond what either program achieves alone. ASSESS analyzes all possible paths of attack on a target and, assuming that an attack occurs, ranks them by the probability that a response force of adequate size can interrupt the attack before theft or sabotage is accomplished. A Neutralization module pits, collectively, a security force against the interrupted adversary force in a fire fight and calculates the probability that the adversaries are defeated. SEES examines a single scenario and simulates in detail the interactions among all combatants. Its output includes shots fired between shooter and target, and the hits and kills. Whereas ASSESS gives breadth of analysis, expressed statistically and performed relatively quickly, SEES adds depth of detail, modeling tactical behavior. ASSESS finds scenarios that exploit the greatest weaknesses of a facility. SEES explores these scenarios to demonstrate in detail how various tactics to nullify the attack might work out. Without ASSESS to find the facility weaknesses, it is difficult to focus SEES objectively on scenarios worth analyzing. Without SEES to simulate the details of response vs. adversary interaction, it is not possible to test tactical assumptions and hypotheses. Using both programs together, vulnerability analyses achieve both breadth and depth.

  9. A pragmatic analysis of vulnerability in clinical research.

    PubMed

    Wendler, David

    2017-09-01

    Identifying which subjects are vulnerable, and implementing safeguards to protect them, is widely regarded as essential to clinical research. Commentators have endorsed a number of responses to these challenges and have thereby made significant progress in understanding vulnerability in clinical research. At the same time, this literature points to a central contradiction which calls into question its potential to protect vulnerable subjects in practice. Specifically, analysis suggests that all human subjects are vulnerable and vulnerability in clinical research is comparative and context dependent, in the sense that individuals are vulnerable relative to others and in some contexts only. Yet, if everyone is vulnerable, there seems to be no point in citing the vulnerability of some individuals. Moreover, the conclusion that everyone is vulnerable seems inconsistent with the claims that vulnerability is comparative and context dependent, raising concern over whether it will be possible to develop a comprehensive account of vulnerability that is internally consistent. The solution to this dilemma lies in recognition of the fact that the practical significance of claims regarding vulnerability depends on the context in which they are used. The claims that appear to lead to the central contradiction are in fact accurate conclusions that follow from different uses of the term 'vulnerability'. The present manuscript describes this 'pragmatic' approach to vulnerability in clinical research and considers its implications for ensuring that subjects receive appropriate protection. Published 2017. This article is a U.S. Government work and is in the public domain in the USA.

  10. How to simulate pedestrian behaviors in seismic evacuation for vulnerability reduction of existing buildings

    NASA Astrophysics Data System (ADS)

    Quagliarini, Enrico; Bernardini, Gabriele; D'Orazio, Marco

    2017-07-01

    Understanding and representing how individuals behave in earthquake emergencies is essential to assessing the impact of vulnerability reduction strategies on existing buildings in seismic areas. In fact, interactions between individuals and the scenario (modified by the earthquake occurrence) are very important for understanding the possible additional risks for people, especially during the evacuation phase. The current approach is based on "qualitative" aspects, in order to define best-practice guidelines for Civil Protection and populations. On the contrary, a "quantitative" description of human response and evacuation motion in similar conditions is urgently needed. Hence, this work defines the rules for pedestrians' earthquake evacuation in urban scenarios, by taking advantage of previous results of real-world evacuation analyses. In particular, the motion law for pedestrians is defined by modifying the Social Force model equation. The proposed model could be used for evaluating individuals' evacuation process and thus for defining operational strategies for reducing interference in critical parts of the urban fabric (e.g., interventions on particular buildings, definition of evacuation strategies, projects for parts of the city).
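
    The Social Force model referenced here drives each pedestrian toward a desired velocity while exponential terms push them away from neighbors and obstacles; the paper modifies this equation for earthquake scenarios. A generic, unmodified sketch with standard-looking but invented parameter values:

    ```python
    import numpy as np

    def social_force(pos, vel, goal, others,
                     v0=1.34, tau=0.5, A=2.0, B=0.3, radius=0.3):
        """Net force on one pedestrian: relaxation toward the desired velocity
        plus exponential repulsion from every other pedestrian."""
        e = (goal - pos) / np.linalg.norm(goal - pos)  # desired walking direction
        force = (v0 * e - vel) / tau                   # driving term
        for p in others:
            d = pos - p
            dist = np.linalg.norm(d)
            force += A * np.exp((2.0 * radius - dist) / B) * d / dist  # repulsion
        return force

    pos, vel = np.array([0.0, 0.0]), np.array([0.5, 0.0])
    goal = np.array([10.0, 0.0])
    neighbors = [np.array([1.0, 0.3])]
    print(social_force(pos, vel, goal, neighbors))
    ```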

  11. A Novel Approach to Support Majority Voting in Spatial Group MCDM Using Density-Induced OWA Operator for Seismic Vulnerability Assessment

    NASA Astrophysics Data System (ADS)

    Moradi, M.; Delavar, M. R.; Moshiri, B.; Khamespanah, F.

    2014-10-01

    Being one of the most frightening disasters, earthquakes frequently cause huge damage to buildings, facilities and human beings. Although the prediction of the characteristics of an earthquake seems to be impossible, its loss and damage are predictable in advance. Seismic loss estimation models tend to evaluate the extent to which urban areas are vulnerable to earthquakes. Many factors contribute to the vulnerability of urban areas to earthquakes, including the age and height of buildings, the quality of the materials, the density of population and the location of flammable facilities. Therefore, seismic vulnerability assessment is a multi-criteria problem. A number of multi-criteria decision making models have been proposed based on a single expert. The main objective of this paper is to propose a model which facilitates group multi-criteria decision making based on the concept of majority voting. The main idea of majority voting is providing a computational tool to measure the degree to which different experts support each other's opinions and making a decision regarding this measure. The applicability of this model is examined in the Tehran metropolitan area, which is located in a seismically active region. The results indicate that neglecting the experts who receive lower degrees of support from the others enables the decision makers to avoid extreme strategies. Moreover, a computational method is proposed to calculate the degree of optimism in the experts' opinions.
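
    An OWA operator aggregates the experts' scores by rank rather than by source; density-induced variants additionally adjust the weights by how clustered (mutually supportive) the opinions are. The plain rank-based form is sketched below with invented scores and weights.

    ```python
    def owa(scores, weights):
        """Ordered weighted averaging: sort the scores in descending order,
        then take the dot product with rank-based weights (summing to 1)."""
        ranked = sorted(scores, reverse=True)
        return sum(w * s for w, s in zip(weights, ranked))

    expert_scores = [0.8, 0.3, 0.6, 0.7]   # hypothetical vulnerability scores
    w_optimistic = [0.6, 0.2, 0.1, 0.1]    # emphasizes the highest scores
    w_neutral = [0.25, 0.25, 0.25, 0.25]   # reduces to the plain average
    print(owa(expert_scores, w_optimistic), owa(expert_scores, w_neutral))
    ```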

  12. Constraints on Long-Term Seismic Hazard From Vulnerable Stalagmites from Vacska cave, Pilis Mountains of Hungary

    NASA Astrophysics Data System (ADS)

    Gribovszki, Katalin; Bokelmann, Götz; Kovács, Károly; Mónus, Péter; Konecny, Pavel; Lednicka, Marketa; Novák, Attila

    2017-04-01

    Damaging earthquakes in central Europe are infrequent, but they do occur. This raises an important issue for society: how to react to this hazard? Potential damages are enormous, and infrastructure costs for addressing these hazards are huge as well. Obtaining unbiased expert knowledge of the seismic hazard (and risk) is therefore very important. Seismic activity in the Pannonian Basin is moderate. In territories with low or moderate seismic activity the recurrence time of large earthquakes can be as long as 10,000 years. Therefore, we cannot draw well-grounded inferences in the field of seismic hazard assessment exclusively from the 1,000- to 2,000-year observational period covered by our earthquake catalogues. Long-term information can be gained from intact and vulnerable stalagmites (IVSTM) in natural karstic caves. These fragile formations have survived all earthquakes that have occurred over thousands of years, depending on their age. Their "survival" requires that the horizontal ground acceleration has never exceeded a certain critical value within that time period. Here we present such a stalagmite-based case study from the Pilis Mountains of Hungary. Evidence of historic events and of differential uplift (incision of the Danube at the River Bend and in the Buda and Gerecse Hills) exists in the vicinity of the investigated cave site. These observations imply that a better understanding of possible co-seismic ground motions in the nearby densely populated areas of Budapest is needed. Specially shaped (tall, slim, and more or less cylindrical), intact and vulnerable stalagmites in the Vacska cave, Pilis Mountains, were examined. Our investigation includes in-situ examination of the IVSTM and mechanical laboratory measurements of broken stalagmite samples. The approach used can yield significant new constraints on the seismic hazard of the investigated area, since tectonic structures close to Vacska cave could not have

  13. Seismic Hazard Analysis — Quo vadis?

    NASA Astrophysics Data System (ADS)

    Klügel, Jens-Uwe

    2008-05-01

    The paper is dedicated to the review of methods of seismic hazard analysis currently in use, analyzing the strengths and weaknesses of different approaches. The review is performed from the perspective of a user of the results of seismic hazard analysis for different applications, such as the design of critical and general (non-critical) civil infrastructures and technical and financial risk analysis. A set of criteria is developed for, and applied to, an objective assessment of the capabilities of different analysis methods. It is demonstrated that traditional probabilistic seismic hazard analysis (PSHA) methods have significant deficiencies, thus limiting their practical applications. These deficiencies have their roots in the use of inadequate probabilistic models and an insufficient understanding of modern concepts of risk analysis, as revealed in some recent large-scale studies. They result in a lack of ability to treat dependencies between physical parameters correctly and, finally, in an incorrect treatment of uncertainties. As a consequence, results of PSHA studies have been found to be unrealistic in comparison with empirical information from the real world. The attempt to compensate for these problems by a systematic use of expert elicitation has, so far, not resulted in any improvement of the situation. It is also shown that scenario earthquakes developed by disaggregation from the results of a traditional PSHA may not be conservative with respect to energy conservation and should not be used for the design of critical infrastructures without validation. Because the assessment of technical as well as financial risks associated with potential earthquake damage requires a risk analysis, the current method is based on a probabilistic approach with its unsolved deficiencies. Traditional deterministic or scenario-based seismic hazard analysis methods provide a reliable and in general robust design basis for applications such as the design

  14. Seismic Analysis of Intake Towers

    DTIC Science & Technology

    1982-10-01

    ...needed for a controlled release of the reservoir to repair any seismic damage in the damming structure. The high cost associated with these criteria for a

  15. Method and tool for network vulnerability analysis

    DOEpatents

    Swiler, Laura Painton; Phillips, Cynthia A.

    2006-03-14

    A computer system analysis tool and method that will allow for qualitative and quantitative assessment of security attributes and vulnerabilities in systems including computer networks. The invention is based on generation of attack graphs wherein each node represents a possible attack state and each edge represents a change in state caused by a single action taken by an attacker or unwitting assistant. Edges are weighted using metrics such as attacker effort, likelihood of attack success, or time to succeed. Generation of an attack graph is accomplished by matching information about attack requirements (specified in "attack templates") to information about computer system configuration (contained in a configuration file that can be updated to reflect system changes occurring during the course of an attack) and assumed attacker capabilities (reflected in "attacker profiles"). High risk attack paths, which correspond to those considered suited to application of attack countermeasures given limited resources for applying countermeasures, are identified by finding "epsilon optimal paths."
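
    The high-risk (here: least-effort) paths through such a weighted attack graph can be found with an ordinary shortest-path search. The sketch below uses networkx on an invented three-hop graph, with edge weights standing in for attacker effort.

    ```python
    import networkx as nx

    # Hypothetical attack graph: nodes are attack states, weights are attacker effort
    G = nx.DiGraph()
    G.add_edge("outside", "dmz_shell", effort=3.0)        # exploit web server
    G.add_edge("outside", "phished_user", effort=1.5)     # phishing campaign
    G.add_edge("dmz_shell", "domain_admin", effort=6.0)   # privilege escalation
    G.add_edge("phished_user", "domain_admin", effort=4.0)

    path = nx.shortest_path(G, "outside", "domain_admin", weight="effort")
    cost = nx.shortest_path_length(G, "outside", "domain_admin", weight="effort")
    print(path, cost)  # least-effort attack path and its total effort
    ```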

  16. Static analysis assisted vulnerability-oriented evolutionary fuzzing

    NASA Astrophysics Data System (ADS)

    Yang, Gang; Feng, Chao; Tang, Chaojing

    2017-03-01

    The blindness of fuzz testing is the main reason for its inefficiency in practical vulnerability discovery. In this paper, we propose a static-analysis-assisted and vulnerability-oriented fuzz testing technology. Through static analysis, the suspect vulnerable code areas can be located roughly; combined with dynamic updates based on fuzzing feedback, the vulnerable code areas can then be located more accurately. By applying the distances to the located vulnerable code areas as one of the metrics of the evolutionary fuzzing test, vulnerability-oriented test cases are generated. The experimental results show that the proposed method can effectively improve the vulnerability-discovery efficiency of fuzz testing.
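
    As a rough illustration of the idea, one plausible fitness function (assumed here for illustration, not the authors' code) rewards a test case both for overall coverage and for how closely its execution trace approaches the statically flagged suspect blocks:

    ```python
    # Distance-guided fitness for an evolutionary fuzzer (illustrative sketch).
    def fitness(trace, suspects, dist):
        """trace: basic blocks covered by one test case;
        suspects: statically located suspect vulnerable blocks;
        dist[(b, s)]: static CFG distance from block b to suspect s."""
        ds = [dist[(b, s)] for b in trace for s in suspects if (b, s) in dist]
        nearest = min(ds) if ds else float("inf")
        # Favor inputs that cover more code and get closer to suspect areas.
        return len(trace) + 100.0 / (1.0 + nearest)
    ```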

  17. Verification the data on critical facilities inventory and vulnerability for seismic risk assessment taking into account possible accidents

    NASA Astrophysics Data System (ADS)

    Frolova, Nina; Larionov, Valery; Bonnin, Jean; Ugarov, Aleksander

    2015-04-01

    The paper contains the results of a recent study carried out by the Seismological Center of IGE, Russian Academy of Sciences and the Extreme Situations Research Center within the Russian Academy of Sciences Project "Theoretical and Methodological basis for seismic risk assessment taking into account technological accidents at local level; constructing the seismic risk maps for the Big Sochi City territory including the venue of Olympic Games facilities." The procedure for verifying critical facilities inventory and vulnerability, which makes use of space images and web technologies in social networks, is presented. Numerical values of the criteria for accidents at fire- and chemical-hazardous facilities triggered by strong earthquakes are obtained. The seismic risk maps for the Big Sochi City territory, including the Olympic Games venue, were constructed taking into account new data on critical facilities obtained with panorama photos of these facilities, high-resolution space images and web technologies. The obtained values of individual seismic risk that take secondary technological accidents into account exceed the values computed without secondary hazards (return period T = 500 years) by 0.5-1.0 × 10^-5 1/year.

  18. Seismic analysis of a vacuum vessel

    SciTech Connect

    Chen, W.W.

    1993-01-01

    This paper presents the results of the seismic analysis for the preliminary design of a vacuum vessel for the ground engineering system (GES) of the SP-100 project. It describes the method of calculating the elevated seismic response spectra at various levels within the vacuum vessel using the simplified computer code developed by Weiner. A modal superposition analysis under design response spectra loading was performed for a three-dimensional finite-element model using the general-purpose finite-element computer code ANSYS. The in-vessel elevated seismic response spectra at various levels in the vacuum vessel, along with the vessel mode shapes and frequencies, are presented. Also included are descriptions of the results of the modal analyses for some significant preliminary design points at various elevations of the vessel.
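
    For readers unfamiliar with the general technique, a generic modal-superposition sketch under response-spectrum loading is shown below (an assumed Python illustration, not the ANSYS model or the Weiner code from the report):

    ```python
    import numpy as np

    def srss_displacement(phi, gamma, freqs_hz, Sa_of_f):
        """phi: (n_dof, n_modes) mode shapes; gamma: (n_modes,) modal
        participation factors; Sa_of_f(f): spectral acceleration [m/s^2]."""
        omega = 2.0 * np.pi * np.asarray(freqs_hz, dtype=float)
        # Peak modal displacement amplitude: gamma_i * Sa(f_i) / omega_i**2
        peak = np.array([g * Sa_of_f(f) for g, f in zip(gamma, freqs_hz)]) / omega**2
        modal = phi * peak                      # scale each mode shape
        return np.sqrt((modal**2).sum(axis=1))  # SRSS combination over modes

    # Example: 2-DOF model with a flat 0.3 g design spectrum
    phi = np.array([[0.6, -0.4], [1.0, 0.8]])
    print(srss_displacement(phi, [1.3, 0.4], [4.0, 12.0], lambda f: 0.3 * 9.81))
    ```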

  19. AVVAM-1 (Armored Vehicle Vulnerability Analysis Model) and Tank Vulnerability Sensitivity Studies

    DTIC Science & Technology

    1973-01-01

    Aberdeen Proving Ground, Maryland. AVVAM-1 (Armored Vehicle Vulnerability Analysis Model, first version) is a conceptual model and associated digital... measure of the ballistic shielding provided by a tank component. The higher this value, the harder it is for a spall fragment to perforate the component... in vulnerability to ballistic damage from behind-the-armor fragments and in the ability of noncritical components to provide ballistic shielding for

  20. Transverse seismic analysis of buried pipelines

    SciTech Connect

    Mavridis, G.A.; Pitilakis, K.D.

    1995-12-31

    The objective of this study is to develop an analytical procedure for calculating upper bounds on stresses and strains for the case of transverse seismic shaking of continuous buried pipelines, taking into account soil-pipeline interaction effects. A sensitivity analysis of some critical parameters is performed. The influence of various parameters, such as the apparent propagation velocity, the frequency content of the seismic ground excitation, the dynamic soil properties, and the pipe's material and size, on the ratio of pipe to ground displacement amplitudes, and consequently on the induced pipe strains, is studied parametrically.

  1. Quantitative Analysis of Seismicity in Iran

    NASA Astrophysics Data System (ADS)

    Raeesi, Mohammad; Zarifi, Zoya; Nilfouroushan, Faramarz; Boroujeni, Samar Amini; Tiampo, Kristy

    2016-12-01

    We use historical and recent major earthquakes and GPS geodetic data to compute seismic strain rate, geodetic slip deficit, static stress drop, the parameters of the magnitude-frequency distribution and geodetic strain rate in the Iranian Plateau to identify seismically mature fault segments and regions. Our analysis suggests that 11 fault segments are in the mature stage of the earthquake cycle, with the possibility of generating major earthquakes. These faults are primarily located in the north and the east of Iran. Four seismically mature regions in southern Iran with the potential for damaging strong earthquakes are also identified. We also delineate four additional fault segments in Iran that can generate major earthquakes without robust clues to their maturity. The most important fault segment in this study is the strike-slip system near the capital city of Tehran, with the potential to cause more than one million fatalities.

  2. RSEIS and RFOC: Seismic Analysis in R

    NASA Astrophysics Data System (ADS)

    Lees, J. M.

    2015-12-01

    Open software is essential for reproducible scientific exchange. R packages provide a platform for the development of seismological investigation software that can be properly documented and traced for data processing. A suite of R packages designed for a wide range of seismic analyses is currently available in the free software platform called R. R is a software platform based on the S language developed at Bell Labs decades ago. Routines in R can be run as standalone function calls or developed in object-oriented mode. R comes with a base set of routines and thousands of user-developed packages. The packages developed at UNC include subroutines and interactive codes for processing seismic data, analyzing geographic information (GIS) and inverting data involved in a variety of geophysical applications. The packages related to seismic analysis currently available on CRAN (Comprehensive R Archive Network, http://www.r-project.org/) are RSEIS, Rquake, GEOmap, RFOC, zoeppritz, RTOMO, geophys, Rwave, PEIP, hht, and rFDSN. These include signal processing, data management, mapping, earthquake location, deconvolution, focal mechanisms, wavelet transforms, Hilbert-Huang transforms, tomographic inversion, and Mogi deformation, among other useful functionality. All software in R packages is required to have detailed documentation, making the exchange and modification of existing software easy. In this presentation, I will focus on the packages RSEIS and RFOC, showing examples from a variety of seismic analyses. The R approach has similarities to the popular (and expensive) MATLAB platform, although R is open source and free to download.

  3. Reservoir permeability from seismic attribute analysis

    SciTech Connect

    Silin, Dmitriy; Goloshubin, G.; Silin, D.; Vingalov, V.; Takkand, G.; Latfullin, M.

    2008-02-15

    For a porous fluid-saturated medium, Biot's poroelasticity theory predicts movement of the pore fluid relative to the skeleton as a seismic wave propagates through the medium. This phenomenon opens an opportunity for investigating the flow properties of hydrocarbon-saturated reservoirs. It is well known that relative fluid movement becomes negligible at seismic frequencies if the porous material is homogeneous and well cemented; in this case the theory predicts seismic wave velocity dispersion and attenuation that are too small compared with observations. Based on Biot's theory, Helle et al. (2003) numerically demonstrated the substantial effects of heterogeneous permeability and saturation in the rocks on both velocity and attenuation. Besides the fluid-flow effect, scattering effects (Gurevich et al., 1997) play a very important role in the case of finely layered porous rocks and heterogeneous fluid saturation. We have used both fluid-flow and scattering effects to derive a frequency-dependent seismic attribute that is proportional to fluid mobility and have applied it to the analysis of reservoir permeability.

  4. Seismic hazard analysis of the Adelaide region, South Australia

    NASA Astrophysics Data System (ADS)

    Setiawan, Bambang

    2017-07-01

    Seismic activity in Australia is categorised as low to moderate. However, the rate of deformation of the Australian continent is faster than that of other stable intraplate regions, such as eastern North America and Africa, and the Adelaide region is the most seismically active zone in the Australian continent. Therefore, a seismic hazard analysis of the Adelaide region is needed to improve the accuracy of seismic hazard predictions. Probabilistic seismic hazard analysis (PSHA) incorporating Monte Carlo simulation is selected in the present paper. This method has several advantages: it is simple, fast, flexible, and robust. The results of the analysis are comparable with those of previous studies. Furthermore, two main threats are identified in the de-aggregation for the city of Adelaide. Because of the scarcity of strong-magnitude events in the seismic record, further work is suggested to improve the estimates by extending the seismic catalogue, i.e. incorporating neo-tectonic and paleo-seismic studies.
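
    The Monte Carlo flavour of PSHA can be sketched in a few lines: simulate many years of synthetic seismicity from a source model, attenuate each event to the site, and count exceedances. The sketch below uses entirely assumed parameters and a toy ground-motion model, not the paper's source zones:

    ```python
    # Monte Carlo PSHA sketch: simulate a long synthetic catalogue and count
    # site exceedances (all parameter values are illustrative only).
    import numpy as np

    rng = np.random.default_rng(1)
    years, rate, b, mmin, mmax = 50_000, 0.05, 1.0, 4.5, 7.0
    n = rng.poisson(rate * years)                    # number of events
    beta = b * np.log(10.0)                          # truncated Gutenberg-Richter
    u = rng.random(n)
    mags = mmin - np.log(1 - u * (1 - np.exp(-beta * (mmax - mmin)))) / beta
    dists = rng.uniform(5.0, 150.0, n)               # epicentral distance, km
    # Toy attenuation relation for ln PGA (g) with aleatory sigma = 0.6
    ln_pga = -3.5 + 1.0 * mags - 1.2 * np.log(dists) + rng.normal(0, 0.6, n)
    annual_rate = (np.exp(ln_pga) > 0.1).sum() / years
    print(f"annual rate of PGA > 0.1 g: {annual_rate:.2e}")
    ```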

  5. Seismic Vulnerability Assessment Waste Characterization Reduction and Repackaging Building, TA-50-69

    SciTech Connect

    Sullivan, M. W.; Ruminer, J.; Cuesta, I.

    2003-02-02

    This report presents the results of the seismic structural analyses completed on the Waste Characterization Reduction and Repackaging (WCRR) Building in support of ongoing safety analyses. WCRR is designated TA-50-69 at Los Alamos National Laboratory, Los Alamos, New Mexico. The facility has been evaluated against Department of Energy (DOE) seismic criteria for Natural Phenomena Hazards (NPH) Performance Category II (PC 2). The seismic capacities of two subsystems within the WCRR building, the material handling glove box and the lift rack immediately adjacent to it, are also documented, and the results are presented.

  6. Global analysis of urban surface water supply vulnerability

    NASA Astrophysics Data System (ADS)

    Padowski, Julie C.; Gorelick, Steven M.

    2014-10-01

    This study presents a global analysis of urban water supply vulnerability in 71 surface-water supplied cities, with populations exceeding 750 000 and lacking source water diversity. Vulnerability represents the failure of an urban supply-basin to simultaneously meet demands from human, environmental and agricultural users. We assess a baseline (2010) condition and a future scenario (2040) that considers increased demand from urban population growth and projected agricultural demand. We do not account for climate change, which can potentially exacerbate or reduce urban supply vulnerability. In 2010, 35% of large cities are vulnerable as they compete with agricultural users. By 2040, without additional measures 45% of cities are vulnerable due to increased agricultural and urban demands. Of the vulnerable cities in 2040, the majority are river-supplied with mean flows so low (1200 liters per person per day, l/p/d) that the cities experience ‘chronic water scarcity’ (1370 l/p/d). Reservoirs supply the majority of cities facing individual future threats, revealing that constructed storage potentially provides tenuous water security. In 2040, of the 32 vulnerable cities, 14 would reduce their vulnerability via reallocating water by reducing environmental flows, and 16 would similarly benefit by transferring water from irrigated agriculture. Approximately half remain vulnerable under either potential remedy.

  7. Masonry Infilling Effect On Seismic Vulnerability and Performance Level of High Ductility RC Frames

    SciTech Connect

    Ghalehnovi, M.; Shahraki, H.

    2008-07-08

    In recent years researchers have preferred behavior-based design of structures to force-based design for the design and construction of earthquake-resistant structures; this method is named performance-based design. The main goal of this method is the design of structural members for a certain performance or behavior. On the other hand, in most buildings the load-bearing frames are infilled with masonry materials, which leads to considerable changes in the mechanical properties of the frames. However, the effect of infill walls has usually been ignored in nonlinear analysis of structures because of the complexity of the problem and the lack of a simple rational solution. As a result, the lateral stiffness, strength, ductility and performance of the structure are computed with less accuracy. In this paper, using a smooth hysteretic model for the masonry infill, several high-ductility RC frames (4 and 8 stories with 1, 2 and 3 spans) designed according to the Iranian code are considered. They have been analyzed by the nonlinear dynamic method in two states, with and without infill. Their performance has then been determined with the criteria of ATC 40 and compared with the performance recommended in the Iranian seismic code (standard No. 2800).

  8. Seismic vulnerability of the Himalayan half-dressed rubble stone masonry structures, experimental and analytical studies

    NASA Astrophysics Data System (ADS)

    Ahmad, N.; Ali, Q.; Ashraf, M.; Alam, B.; Naeem, A.

    2012-11-01

    Half-dressed rubble stone (DS) masonry structures as found in the Himalayan region are investigated using experimental and analytical studies. The experimental study included a shake table test on a one-third-scale structural model, representative of the DS masonry structures employed for critical public facilities, e.g. school buildings, offices, health care units, etc. The aim of the experimental study was to understand the damage mechanism of the model, develop a damage scale for deformation-based assessment, and retrieve the lateral force-deformation response of the model as well as its elastic dynamic properties, i.e. fundamental vibration period and elastic damping. The analytical study included fragility analysis of building prototypes using a fully probabilistic nonlinear dynamic method. The prototypes are designed as SDOF systems assigned the lateral force-deformation constitutive law obtained experimentally. Uncertainties in the constitutive law, i.e. lateral stiffness, strength and deformation limits, are considered through Monte Carlo simulation. Fifty prototype buildings are analyzed using a suite of ten natural accelerograms and an incremental dynamic analysis technique. Fragility and vulnerability functions are derived for the damageability assessment of structures, and for economic loss and casualty estimation during an earthquake given the ground shaking intensity, essential within the context of risk assessment of the existing stock aiming at risk mitigation and disaster risk reduction.
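
    The last step, turning incremental dynamic analysis outcomes into a fragility function, is often done by maximum-likelihood fitting of a lognormal curve. A minimal sketch with invented counts (not the paper's data) follows:

    ```python
    # Fit a lognormal fragility curve to IDA damage counts by maximum
    # likelihood (the counts below are invented for illustration).
    import numpy as np
    from scipy import stats, optimize

    im = np.array([0.1, 0.2, 0.3, 0.4, 0.6, 0.8, 1.0])   # intensity levels (g)
    n_runs = np.full(im.size, 50)                        # analyses per level
    n_fail = np.array([1, 6, 17, 29, 42, 48, 50])        # damage exceedances

    def neg_loglik(params):
        median, beta = params
        p = stats.norm.cdf(np.log(im / median) / beta)   # lognormal fragility
        p = np.clip(p, 1e-9, 1 - 1e-9)                   # numerical safety
        return -np.sum(stats.binom.logpmf(n_fail, n_runs, p))

    res = optimize.minimize(neg_loglik, x0=[0.4, 0.5],
                            bounds=[(1e-3, None), (1e-3, None)])
    print("median IM and dispersion:", res.x)
    ```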

  9. Analysis of seismic events in and near Kuwait

    SciTech Connect

    Harris, D B; Mayeda, K M; Rodgers, A J; Ruppert, S D

    1999-05-11

    Seismic data for events in and around Kuwait were collected and analyzed. The authors estimated event moment, focal mechanism and depth by waveform modeling. Results showed that reliable seismic source parameters for events in and near Kuwait can be estimated from a single broadband three-component seismic station. This analysis will advance understanding of earthquake hazard in Kuwait.

  10. Vulnerability analysis of interdependent infrastructure systems: A methodological framework

    NASA Astrophysics Data System (ADS)

    Wang, Shuliang; Hong, Liu; Chen, Xueguang

    2012-06-01

    Infrastructure systems such as power and water supplies are a cornerstone of modern society and are essential for the functioning of a society and its economy. They are becoming more and more interconnected and interdependent with the development of science, technology and the economy. Risk and vulnerability analysis of interdependent infrastructures for security considerations has become an important subject, and some achievements have been made in this area. Since different infrastructure systems have different structural and functional properties, there is no universal all-encompassing 'silver bullet solution' to the problem of analyzing the vulnerability associated with interdependent infrastructure systems, so a framework of analysis is required. This paper takes the power and water systems of a major city in China as an example and develops a framework for the analysis of the vulnerability of interdependent infrastructure systems. Four interface design strategies, based on distance, betweenness, degree, and clustering coefficient, are constructed. Two types of vulnerability (long-term vulnerability and focused vulnerability) are then illustrated and analyzed. Finally, a method for ranking critical components in interdependent infrastructures is given for protection purposes. It is concluded that the framework proposed here is useful for vulnerability analysis of interdependent systems and will help system owners make better decisions on infrastructure design and protection.
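
    One simple way to rank critical components in the spirit described above (a toy sketch with an assumed coupled topology, not the paper's model) is to remove each node in turn and measure the resulting loss of connectivity across the coupled graph:

    ```python
    # Rank components of a toy coupled power-water graph by the connectivity
    # lost when each node fails (assumed topology, for illustration only).
    import networkx as nx

    G = nx.Graph()
    G.add_edges_from([("p1", "p2"), ("p2", "p3"),     # power network
                      ("w1", "w2"), ("w2", "w3"),     # water network
                      ("p2", "w2"), ("p3", "w1")])    # interdependency links

    def served(g):
        """Size of the largest connected component (a crude service proxy)."""
        return max((len(c) for c in nx.connected_components(g)), default=0)

    base = served(G)
    impact = {v: base - served(nx.restricted_view(G, [v], [])) for v in G}
    for v, loss in sorted(impact.items(), key=lambda kv: -kv[1]):
        print(v, loss)
    ```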

  11. Space Station Program threat and vulnerability analysis

    NASA Technical Reports Server (NTRS)

    Van Meter, Steven D.; Veatch, John D.

    1987-01-01

    An examination has been made of the physical security of the Space Station Program at the Kennedy Space Center in a peacetime environment, in order to furnish facility personnel with threat/vulnerability information. A risk-management approach is used to prioritize threat-target combinations that are characterized in terms of 'insiders' and 'outsiders'. Potential targets were identified and analyzed with a view to their attractiveness to an adversary, as well as to the consequentiality of the resulting damage.

  12. Information Systems Vulnerability: A Systems Analysis Perspective

    DTIC Science & Technology

    1996-06-01

  13. Pattern dynamics analysis of seismic catalogs

    NASA Astrophysics Data System (ADS)

    Tiampo, K.; Rundle, J.; Klein, W.; McGinnis, S.; Posadas, A.; Fernàndez, J.; Luzòn, F.

    2003-04-01

    The historical earthquake record, while not complete, spans hundreds to thousands of years of human history. As a result, large, extended fault systems such as those in California are known to demonstrate complex space-time seismicity patterns, which include, but are not limited to, repetitive events, precursory activity and quiescence, and aftershock sequences (Mogi, 1969; Keilis-Borok et al., 1980; Kanamori, 1981; Kagan and Jackson, 1992; Saleur et al., 1996; Ellsworth and Cole, 1997; Pollitz and Sacks, 1997; Bowman et al., 1998; Nanjo et al., 1998; Wyss and Wiemer, 1999). Although the characteristics of these patterns can be qualitatively described, a systematic quantitative analysis remains elusive (Kanamori, 1981; Turcotte, 1991; Geller et al., 1997). Here we describe a new technique, formulated on the basis of new developments in the physical and theoretical understanding of these complex, nonlinear fault systems, that isolates emergent regions of coherent, correlated seismicity (Bak and Tang, 1989; Rundle, 1989; Sornette and Sornette, 1989; Rundle and Klein, 1995; Sammis et al., 1996; 1997; Fisher et al., 1997; Jaume and Sykes, 1999; Rundle et al., 1999; Tiampo et al., 2002). Analysis of data taken prior to large events reveals that the appearance of the coherent correlated regions is often associated with the future occurrence of major earthquakes in the same areas, or with other tectonic mechanisms such as aseismic slip events (Tiampo et al., 2002). We proceed to detail this pattern dynamics methodology and then identify systematic space-time variations in the seismicity of several tectonic regions.

  14. Vulnerability analysis for design of bridge health monitoring system

    NASA Astrophysics Data System (ADS)

    Sun, L. M.; Yu, G.

    2010-03-01

    Recent engineering implementations of health monitoring systems for long-span bridges have shown difficulties in precisely assessing the structural physical condition, as well as in accurately alarming on structural damage, even though hundreds of sensors are installed on a structure and a great amount of data is collected from the monitoring system. The allocation of sensors and the alarming algorithm are still two of the most important tasks to be considered when designing a structural health monitoring system. Vulnerability, in its original meaning, is the susceptibility of a system to local damage. For a structural system, vulnerability can thus be regarded as the susceptibility of structural performance to local damage of the structure. The purpose of this study is to propose concepts and methods of structural vulnerability for determining which monitoring components are more vulnerable than others, and the corresponding warning threshold once damage occurs. The structural vulnerability performance under various damage scenarios depends upon the structural geometrical topology, the loading pattern on the structure and the degradation of component performance. A two-parameter structural vulnerability evaluation method is proposed in this paper; the parameters are the damage consequence and the relative magnitude of the damage scenario with respect to the structural system. Structural vulnerability to various damage scenarios can be regarded as the tradeoff between these two parameters. Based on the results of the structural vulnerability analysis, the limited structural information from health monitoring can be utilized efficiently. The approach to the design of a bridge health monitoring system is illustrated for a cable-stayed bridge.

  15. Constraints on Long-Term Seismic Hazard From Vulnerable Stalagmites for the surroundings of Katerloch cave, Austria

    NASA Astrophysics Data System (ADS)

    Gribovszki, Katalin; Bokelmann, Götz; Mónus, Péter; Kovács, Károly; Kalmár, János

    2016-04-01

    Earthquakes hit urban centers in Europe infrequently, but occasionally with disastrous effects. This raises an important issue for society: how to react to the natural hazard? Potential damages are huge, and infrastructure costs for addressing these hazards are huge as well. Obtaining an unbiased view of seismic hazard (and risk) is therefore very important. In principle, the best way to test Probabilistic Seismic Hazard Assessments (PSHA) is to compare them with observations that are entirely independent of the procedure used to produce the PSHA models. Arguably, the most valuable information in this context is information on long-term hazard, namely maximum intensities (or magnitudes) occurring over time intervals that are at least as long as a seismic cycle. Such information would be very valuable, even if it concerned only a single site. Long-term information can in principle be gained from intact stalagmites in natural karstic caves. These have survived all earthquakes that have occurred over thousands of years, depending on the age of the stalagmite. Their "survival" requires that the horizontal ground acceleration has never exceeded a certain critical value within that period. We focus here on a case study from the Katerloch cave close to the city of Graz, Austria. A specially shaped (candlestick style: high, slim, and more or less cylindrical) intact and vulnerable stalagmite (IVSTM) in the Katerloch cave was examined in 2013 and 2014. This IVSTM is suitable for estimating an upper limit on the horizontal peak ground acceleration generated by pre-historic earthquakes. For this cave, we have extensive age information (e.g., Boch et al., 2006, 2010). The approach used in our study yields significant new constraints on seismic hazard, as the intactness of the stalagmite suggests that tectonic structures close to the Katerloch cave, in particular the Mur-Mürz fault, did not generate very strong paleoearthquakes in the last few thousand years.
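
    The critical-acceleration argument can be made concrete by treating the stalagmite as a vertical cantilevered cylinder that fails when the base bending stress reaches the tensile strength of the speleothem calcite. The sketch below uses assumed dimensions and material values, not the measured Katerloch data:

    ```python
    # Upper bound on horizontal ground acceleration implied by an intact
    # stalagmite, modeled as a cantilevered cylinder of height H and
    # diameter D that fails in bending at the base (all numbers assumed).
    def critical_acceleration(H, D, sigma_t, rho=2700.0):
        """Base moment M = rho*A*a*H^2/2 and section modulus W = pi*D^3/32
        give base stress sigma = 4*rho*a*H^2/D, hence
        a_crit = sigma_t*D/(4*rho*H^2).
        H, D in m; sigma_t in Pa; rho in kg/m^3; result in m/s^2."""
        return sigma_t * D / (4.0 * rho * H**2)

    # E.g. a 3 m tall, 6 cm diameter candlestick with sigma_t ~ 1 MPa:
    print(critical_acceleration(3.0, 0.06, 1.0e6))  # ~0.62 m/s^2
    ```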

  16. Nonlinear seismic analysis of a large sodium pump

    SciTech Connect

    Huang, S.N.

    1985-01-01

    The bearings and seismic bumpers used in a large sodium pump of a typical breeder reactor plant may need to be characterized by nonlinear springs and gaps. Then, nonlinear seismic analysis utilizing the time-history method is an effective way to predict the pump behaviors during seismic events, especially at those bearing and seismic bumper areas. In this study, synthesized time histories were developed based on specified seismic response spectra. A nonlinear seismic analysis was then conducted and results were compared with those obtained by linear seismic analysis using the response spectrum method. In contrast to some previous nonlinear analysis trends, the bearing impact forces predicted by nonlinear analysis were higher than those obtained by the response spectrum method. This might be due to the larger gaps and stiffer bearing supports used in this specific pump. However, at locations distant from the impact source, the nonlinear seismic analysis has predicted slightly less responses than those obtained by linear seismic analysis. The seismically induced bearing impact forces were used to study the friction induced thermal stresses on the hydrostatic bearing and to predict the coastdown time of the pump. Results and discussions are presented.

  17. GIS modelling of seismic vulnerability of residential fabrics considering geotechnical, structural, social and physical distance indicators in Tehran city using multi-criteria decision-making (MCDM) techniques

    NASA Astrophysics Data System (ADS)

    Rezaie, F.; Panahi, M.

    2014-09-01

    The main issue in determining seismic vulnerability is having a comprehensive view of all probable damages related to earthquake occurrence. Therefore, taking factors such as the peak ground acceleration (PGA) at the time of earthquake occurrence, the type of structures, the population distribution among different age groups, the level of education, and the physical distance to hospitals (or medical care centers) into account, categorized under the four indicators of geotechnical, structural, social, and physical distance to needed facilities and from dangerous ones, will provide a better and more exact outcome. To this end, in this paper the analytic hierarchy process (AHP) is used to determine the relative importance of the criteria and alternatives, and a geographical information system (GIS) is used to study the vulnerability of the Tehran metropolis to an earthquake. This study focuses on the fact that Tehran is surrounded by three active and major faults: the Mosha, North Tehran and Rey. In order to comprehensively determine the vulnerability, three scenarios are developed. In each scenario, the seismic vulnerability of different areas in Tehran is analysed and classified into four levels: high, medium, low and safe. The results show that, regarding seismic vulnerability, the faults of Mosha, North Tehran and Rey make, respectively, 6, 16 and 10% of the Tehran area highly vulnerable, while 34, 14 and 27% are safe.

  18. GIS modeling of seismic vulnerability of residential fabrics considering geotechnical, structural, social and physical distance indicators in Tehran using multi-criteria decision-making techniques

    NASA Astrophysics Data System (ADS)

    Rezaie, F.; Panahi, M.

    2015-03-01

    The main issue in determining seismic vulnerability is having a comprehensive view of all probable damages related to earthquake occurrence. Therefore, taking into account factors such as peak ground acceleration at the time of earthquake occurrence, the type of structures, population distribution among different age groups, level of education and the physical distance to hospitals (or medical care centers) and categorizing them into four indicators of geotechnical, structural, social and physical distance to needed facilities and from dangerous ones will provide us with a better and more exact outcome. To this end, this paper uses the analytic hierarchy process to study the importance of criteria or alternatives and uses the geographical information system to study the vulnerability of Tehran to an earthquake. This study focuses on the fact that Tehran is surrounded by three active and major faults: Mosha, North Tehran and Rey. In order to comprehensively determine the vulnerability, three scenarios are developed. In each scenario, seismic vulnerability of different areas in Tehran is analyzed and classified into four levels: high, medium, low and safe. The results show that, regarding seismic vulnerability, the faults of Mosha, North Tehran and Rey make, respectively, 6, 16 and 10% of Tehran highly vulnerable, while 34, 14 and 27% is safe.
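
    The AHP step that both versions of this study rely on reduces to computing the principal eigenvector of a pairwise-comparison matrix. A minimal sketch with invented judgments (not the paper's actual matrices) follows:

    ```python
    # AHP weights as the normalized principal eigenvector of a pairwise
    # comparison matrix (judgments below are invented for illustration).
    import numpy as np

    # Order: geotechnical, structural, social, physical distance
    A = np.array([[1.0, 2.0, 4.0, 3.0],
                  [1/2, 1.0, 3.0, 2.0],
                  [1/4, 1/3, 1.0, 1/2],
                  [1/3, 1/2, 2.0, 1.0]])

    vals, vecs = np.linalg.eig(A)
    k = np.argmax(vals.real)
    w = np.abs(vecs[:, k].real)
    w /= w.sum()
    print("indicator weights:", np.round(w, 3))

    CI = (vals.real[k] - len(A)) / (len(A) - 1)  # consistency index
    print("consistency ratio:", CI / 0.9)        # RI = 0.9 for n = 4; want < 0.1
    ```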

  1. A Methodology For Flood Vulnerability Analysis In Complex Flood Scenarios

    NASA Astrophysics Data System (ADS)

    Figueiredo, R.; Martina, M. L. V.; Dottori, F.

    2015-12-01

    Nowadays, flood risk management is gaining importance as a means to mitigate and prevent flood disasters, and consequently the analysis of flood vulnerability is becoming a key research topic. In this paper, we propose a methodology for large-scale analysis of flood vulnerability. The methodology is based on a GIS-based index, which considers local topography, terrain roughness and basic information about the flood scenario to reproduce the diffusive behaviour of floodplain flow. The methodology synthesizes the spatial distribution of index values into maps and curves, used to represent the vulnerability in the area of interest. Its application allows for considering different levels of complexity of flood scenarios, from localized flood defence failures to complex hazard scenarios involving river reaches. The components of the methodology are applied and tested in two floodplain areas in northern Italy recently affected by floods. The results show that the methodology can provide original and valuable insight into flood vulnerability variables and processes.

  2. New Codes for Ambient Seismic Noise Analysis

    NASA Astrophysics Data System (ADS)

    Duret, F.; Mooney, W. D.; Detweiler, S.

    2007-12-01

    In order to determine a velocity model of the crust, scientists generally use earthquakes recorded by seismic stations. However, earthquakes do not occur continuously, and most are too weak to be useful. When no event is recorded, a waveform is generally considered to be noise. This noise, however, is not useless and carries a wealth of information; thus, ambient seismic noise analysis is an inverse method of investigating the Earth's interior. Until recently, this technique was quite difficult to apply, as it requires significant computing capacity. In early 2007, however, a team led by Gregory Benson and Mike Ritzwoller from UC Boulder published a paper describing a new method for extracting group and phase velocities from those waveforms. The analysis, which consists of recovering Green's functions between pairs of stations, is composed of four steps: 1) single-station data preparation, 2) cross-correlation and stacking, 3) quality control and data selection, and 4) dispersion measurements. At the USGS, we developed a set of ready-to-use computing codes for analyzing waveforms to run the ambient noise analysis of Benson et al. (2007). Our main contribution to the analysis technique was to fully automate the process. The computation codes were written in Fortran 90 and the automation scripts were written in Perl; some operations were run with SAC. Our choice of programming languages offers an opportunity to adapt our codes to the major platforms. The codes were developed under Linux but are meant to be adapted to Mac OS X and Windows platforms. The codes have been tested on southern California data, and our results compare nicely with those of the UC Boulder team. Next, we plan to apply our codes to Indonesian data, so that we might take advantage of newly upgraded seismic stations in that region.
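
    Step 2 of that workflow, cross-correlation and stacking, is easy to sketch with numpy (an illustrative stand-in; the USGS codes themselves are Fortran 90 driven by Perl and SAC):

    ```python
    # Cross-correlation and stacking of daily noise records; the stack
    # approximates the inter-station Green's function (illustrative).
    import numpy as np

    def daily_xcorr(a, b, max_lag):
        """Normalized cross-correlation of two equal-length daily traces."""
        a = (a - a.mean()) / a.std()
        b = (b - b.mean()) / b.std()
        full = np.correlate(a, b, mode="full") / len(a)
        mid = len(a) - 1                       # index of zero lag
        return full[mid - max_lag: mid + max_lag + 1]

    def stack(days_a, days_b, max_lag):
        """Average the daily correlations over many days."""
        return np.mean([daily_xcorr(a, b, max_lag)
                        for a, b in zip(days_a, days_b)], axis=0)
    ```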

  3. Vulnerability-attention analysis for space-related activities

    NASA Technical Reports Server (NTRS)

    Ford, Donnie; Hays, Dan; Lee, Sung Yong; Wolfsberger, John

    1988-01-01

    Techniques for representing and analyzing trouble spots in structures and processes are discussed. Identification of vulnerable areas usually depends more on particular and often detailed knowledge than on algorithmic or mathematical procedures. In some cases, machine inference can facilitate the identification. The analysis scheme proposed first establishes the geometry of the process, then marks areas that are conditionally vulnerable. This provides a basis for advice on the kinds of human attention or machine sensing and control that can make the risks tolerable.

  4. WHE-PAGER Project: A new initiative in estimating global building inventory and its seismic vulnerability

    USGS Publications Warehouse

    Porter, K.A.; Jaiswal, K.S.; Wald, D.J.; Greene, M.; Comartin, Craig

    2008-01-01

    The U.S. Geological Survey’s Prompt Assessment of Global Earthquake’s Response (PAGER) Project and the Earthquake Engineering Research Institute’s World Housing Encyclopedia (WHE) are creating a global database of building stocks and their earthquake vulnerability. The WHE already represents a growing, community-developed public database of global housing and its detailed structural characteristics. It currently contains more than 135 reports on particular housing types in 40 countries. The WHE-PAGER effort extends the WHE in several ways: (1) by addressing non-residential construction; (2) by quantifying the prevalence of each building type in both rural and urban areas; (3) by addressing day and night occupancy patterns, (4) by adding quantitative vulnerability estimates from judgment or statistical observation; and (5) by analytically deriving alternative vulnerability estimates using in part laboratory testing.

  5. FORTRAN computer program for seismic risk analysis

    USGS Publications Warehouse

    McGuire, Robin K.

    1976-01-01

    A program for seismic risk analysis is described which combines generality of application, efficiency and accuracy of operation, and the advantage of small storage requirements. The theoretical basis for the program is first reviewed, and the computational algorithms used to apply this theory are described. The information required for running the program is listed. Published attenuation functions describing the variation with earthquake magnitude and distance of expected values for various ground motion parameters are summarized for reference by the program user. Finally, suggestions for use of the program are made, an example problem is described (along with example problem input and output) and the program is listed.
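
    The heart of such a program is the hazard integral, lambda(A > a) = nu * sum_m sum_r P[A > a | m, r] P(m) P(r). A toy numerical version, with an assumed single areal source and an invented attenuation relation rather than the program's published ones, looks like this:

    ```python
    import numpy as np
    from scipy.stats import norm

    nu = 0.2                              # events/yr above the minimum magnitude
    m = np.linspace(4.5, 7.5, 61)
    beta = np.log(10.0)                   # Gutenberg-Richter with b = 1
    fm = beta * np.exp(-beta * (m - 4.5)); fm /= fm.sum()
    r = np.linspace(10.0, 200.0, 96)
    fr = r / r.sum()                      # toy distance pmf for an areal source

    a = 0.1                               # PGA threshold in g
    M, R = np.meshgrid(m, r, indexing="ij")
    ln_mean = -3.5 + 1.0 * M - 1.2 * np.log(R)     # toy attenuation, sigma = 0.6
    p_exceed = norm.sf((np.log(a) - ln_mean) / 0.6)
    rate = nu * (p_exceed * fm[:, None] * fr[None, :]).sum()
    print(f"annual rate of PGA > {a} g: {rate:.2e}")
    ```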

  6. Seismic Vulnerability Evaluations Within The Structural And Functional Survey Activities Of The COM Bases In Italy

    SciTech Connect

    Zuccaro, G.; Cacace, F.; Albanese, V.; Mercuri, C.; Papa, F.; Pizza, A. G.; Sergio, S.; Severino, M.

    2008-07-08

    The paper describes technical and functional surveys of COM buildings (Mixed Operative Centres). This activity started in 2005, with the contribution of both the Italian Civil Protection Department and the Regions involved. The project aims to evaluate the efficiency of COM buildings, checking not only structural, architectural and functional characteristics but also paying attention to the vulnerability of the surrounding real estate, the road network, railways, harbours, airports, the morphological and hydro-geological characteristics of the area, hazardous activities, etc. The first survey was performed in eastern Sicily, before the European Civil Protection Exercise 'EUROSOT 2005'. Then, starting in 2006, a new survey campaign was carried out in the Abruzzo, Molise, Calabria and Puglia Regions. The most important issue in the activity was the vulnerability assessment, so this paper deals with a more refined vulnerability evaluation technique, the SAVE methodology, developed in the first task of the SAVE project within the GNDT-DPC programme 2000-2002 (Zuccaro, 2005); the SAVE methodology has already been successfully employed in previous studies (i.e. the school buildings intervention programme at national scale; the list of strategic public buildings in Campania, Sicilia and Basilicata). In this paper, data elaborated by the SAVE methodology are compared with expert evaluations derived from direct inspections of COM buildings. This represents a useful exercise for the improvement of both the survey forms and the methodology for quick assessment of vulnerability.

  7. Seismic analysis of freestanding fuel racks

    SciTech Connect

    Gilmore, C.B.

    1982-01-01

    This paper presents a nonlinear transient dynamic time-history analysis of freestanding spent fuel storage racks subjected to seismic excitation. This type of storage rack, which is structurally unrestrained and submerged in water in the spent fuel pool of a nuclear power complex, holds spent fuel assemblies that have been removed from the reactor core. Nonlinearities in the fuel rack system include impact between the fuel assembly and the surrounding cell due to clearances between them, friction due to sliding between the fuel rack support structure and the spent fuel pool floor, and lift-off of the fuel rack support structure from the spent fuel pool floor. The analysis of the fuel rack system includes impacting due to gap closures, energy losses due to impacting bodies, Coulomb damping between sliding surfaces, and hydrodynamic mass effects. The development of the acceleration time-history excitation is discussed. Modeling considerations, such as the initial status of the nonlinear elements, the number of mode shapes to include in the analysis, modal damping, and integration time-step size, are presented. The response of the fuel rack subjected to two-dimensional seismic excitation is analyzed by the modal superposition method, which has resulted in significant computer cost savings compared with direct integration.
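
    The two nonlinearities named above are commonly idealized as gap (impact) and Coulomb friction elements. A minimal sketch of such force laws (assumed parameters, not the paper's model) is:

    ```python
    # Idealized force laws for the gap (impact) and Coulomb friction
    # nonlinearities described above (illustration only).
    def gap_force(rel_disp, gap, k_impact):
        """Impact spring engages only after the clearance has closed."""
        pen = abs(rel_disp) - gap
        if pen <= 0.0:
            return 0.0
        return -k_impact * pen * (1.0 if rel_disp > 0 else -1.0)

    def friction_force(rel_vel, normal_force, mu, v_tol=1e-4):
        """Coulomb friction, regularized near zero sliding velocity."""
        f_max = mu * normal_force
        return -f_max * max(-1.0, min(1.0, rel_vel / v_tol))
    ```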

  8. Malware Sandbox Analysis for Secure Observation of Vulnerability Exploitation

    NASA Astrophysics Data System (ADS)

    Yoshioka, Katsunari; Inoue, Daisuke; Eto, Masashi; Hoshizawa, Yuji; Nogawa, Hiroki; Nakao, Koji

    Exploiting vulnerabilities of remote systems is one of the fundamental behaviors of malware and determines its potential hazard. Understanding what kind of propagation tactics a malware sample uses is essential in incident response, because such information links directly with countermeasures such as writing a signature for an IDS. Although malware sandbox analysis has recently been studied intensively, little work has been done on securely observing vulnerability exploitation by malware. In this paper, we propose a novel sandbox analysis method for securely observing malware's vulnerability exploitation in a totally isolated environment. In our sandbox, we prepare two victim hosts. We first execute the sample malware on one of these hosts and then let it attack the other host, which is running multiple vulnerable services. As a simple realization of the proposed method, we have implemented a sandbox using Nepenthes, a low-interaction honeypot, as the second victim. Because Nepenthes can emulate a variety of vulnerable services, we can efficiently observe the propagation of sample malware. In the experiments, among 382 samples whose scan capabilities were confirmed, 381 samples successfully started exploiting vulnerabilities of the second victim. This indicates a certain level of feasibility of the proposed method.

  9. A seismic hazard uncertainty analysis for the New Madrid seismic zone

    USGS Publications Warehouse

    Cramer, C.H.

    2001-01-01

    A review of the scientific issues relevant to characterizing earthquake sources in the New Madrid seismic zone has led to the development of a logic tree of possible alternative parameters. A variability analysis, using Monte Carlo sampling of this consensus logic tree, is presented and discussed. The analysis shows that for 2%-exceedance-in-50-year hazard, the best-estimate seismic hazard map is similar to previously published seismic hazard maps for the area. For peak ground acceleration (PGA) and spectral acceleration at 0.2 and 1.0 s (0.2 and 1.0 s Sa), the coefficient of variation (COV) representing the knowledge-based uncertainty in seismic hazard can exceed 0.6 over the New Madrid seismic zone and diminishes to about 0.1 away from areas of seismic activity. Sensitivity analyses show that the largest contributor to PGA, 0.2 and 1.0 s Sa seismic hazard variability is the uncertainty in the location of future 1811-1812 New Madrid sized earthquakes. This is followed by the variability due to the choice of ground motion attenuation relation, the magnitude of the 1811-1812 New Madrid earthquakes, and the recurrence interval for M>6.5 events. Seismic hazard is not very sensitive to the variability in seismogenic width and length. Published by Elsevier Science B.V.
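
    The variability analysis described here can be sketched as Monte Carlo sampling over weighted logic-tree branches, with the COV computed from the resulting hazard draws. The branch multipliers and weights below are invented for illustration, not the study's values:

    ```python
    # Monte Carlo sampling over logic-tree branches to estimate the COV of
    # hazard (all branch values and weights are assumed).
    import numpy as np

    rng = np.random.default_rng(0)
    location = ([0.7, 1.0, 1.4], [0.3, 0.4, 0.3])   # (multipliers, weights)
    gmpe     = ([0.8, 1.0, 1.3], [0.25, 0.5, 0.25])
    mag_1811 = ([0.9, 1.0, 1.2], [0.3, 0.5, 0.2])

    base_pga = 0.35   # g, assumed best-estimate 2%-in-50-yr hazard at one site
    draws = base_pga * np.prod(
        [rng.choice(v, size=10_000, p=w) for v, w in (location, gmpe, mag_1811)],
        axis=0)
    print("COV of hazard:", draws.std() / draws.mean())
    ```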

  10. K-means cluster analysis and seismicity partitioning for Pakistan

    NASA Astrophysics Data System (ADS)

    Rehman, Khaista; Burton, Paul W.; Weatherill, Graeme A.

    2014-07-01

    Pakistan and the western Himalaya comprise a region of high seismic activity located at the triple junction between the Arabian, Eurasian and Indian plates. Four devastating earthquakes have caused significant numbers of fatalities in Pakistan and the surrounding region in the past century (Quetta, 1935; Makran, 1945; Pattan, 1974; and the 2005 Kashmir earthquake). It is therefore necessary to develop an understanding of the spatial distribution of seismicity and the potential seismogenic sources across the region. This forms an important basis for the calculation of seismic hazard, a crucial input to the seismic design codes needed to begin to effectively mitigate the high earthquake risk in Pakistan. The development of seismogenic source zones for seismic hazard analysis is driven by both geological and seismotectonic inputs. Despite the many developments in seismic hazard in recent decades, the manner in which seismotectonic information feeds the definition of the seismic source can, in many parts of the world including Pakistan and the surrounding regions, remain a subjective process driven primarily by expert judgment. Whilst much research is ongoing to map and characterise active faults in Pakistan, knowledge of the seismogenic properties of the active faults is still incomplete in much of the region. Consequently, seismicity, both historical and instrumental, remains a primary guide to the seismogenic sources of Pakistan. This study utilises a cluster analysis approach to identify spatial differences in seismicity, which can form a basis for delineating seismogenic source regions. An effort is made to examine seismicity partitioning for Pakistan with respect to the earthquake database, seismic cluster analysis and seismic partitions in a seismic hazard context. A magnitude-homogeneous earthquake catalogue has been compiled using various available earthquake data. The earthquake catalogue covers a time span from 1930 to 2007 and
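
    The clustering step itself is straightforward to sketch: partition epicentres into a chosen number of spatial groups and inspect the resulting zones. The points below are synthetic; a real run would use the homogenized catalogue described above:

    ```python
    # K-means partition of epicentres into candidate seismicity zones.
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(42)
    catalog = np.vstack([rng.normal([67.0, 30.0], 0.8, (200, 2)),
                         rng.normal([73.0, 34.0], 0.7, (300, 2)),
                         rng.normal([62.0, 27.0], 0.9, (150, 2))])  # lon, lat

    km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(catalog)
    for c in range(3):
        print(f"zone {c}: {np.sum(km.labels_ == c)} events, "
              f"centre {km.cluster_centers_[c].round(2)}")
    ```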

  11. An approach to extend seismic vulnerability relationships for large diameter pipelines

    SciTech Connect

    Honegger, D.G.

    1995-12-31

    The most common approach to determining vulnerability is to rely solely upon damage data from past earthquakes as a predictor of future performance. Relying upon past damage data is not an option when such data do not exist for a particular type of pipeline. An option discussed in this paper, and recently implemented for large-diameter water supply pipelines, relies upon an engineering characterization of the relative strength of pipelines to extend existing damage data.

  12. Vulnerability Analysis Considerations for the Transportation of Special Nuclear Material

    SciTech Connect

    Nicholson, Lary G.; Purvis, James W.

    1999-07-21

    The vulnerability analysis methodology developed for fixed nuclear material sites has proven to be extremely effective in assessing associated transportation issues. The basic methods and techniques used are directly applicable to conducting a transportation vulnerability analysis. The purpose of this paper is to illustrate that the same physical protection elements (detection, delay, and response) are present, although the response force plays a dominant role in preventing the theft or sabotage of material. Transportation systems are continuously exposed to the general public whereas the fixed site location by its very nature restricts general public access.

  13. Central Anatolian Seismic Network: Initial Analysis of the Seismicity and Earth Structure

    NASA Astrophysics Data System (ADS)

    Arda Özacar, A.; Abgarmi, Bizhan; Delph, Jonathan; Beck, Susan L.; Sandvol, Eric; Türkelli, Niyazi; Kalafat, Doğan; Kahraman, Metin; Teoman, Uğur

    2015-04-01

    The Anatolian Microplate provides many of the clues for understanding the geodynamic processes leading to continental collision, plateau formation, slab tearing / break-off and the development of escape tectonics. During the last decades, the tectonic evolution and dynamics of Anatolia have been the prime target of numerous research efforts employing a wide spectrum of disciplines. However, the Anatolian interior, which is characterized by large-magnitude lateral and vertical displacements, widespread Neogene volcanism and a complex tectonic history, is still under much debate and requires a joint multidisciplinary approach to investigate the mantle-to-surface dynamics. In order to identify the crust and mantle structure beneath Central Anatolia and the related seismicity, a dense seismic array consisting of 70 broadband seismic stations was deployed temporarily in 2013 as part of the Central Anatolian Tectonics (CAT) project on continental dynamics. A year of seismic records has been processed, and part of it was analyzed using various seismic methods. The distribution of preliminary earthquake locations supports the presence of seismic activity partly localized along major tectonic structures across the region. According to ambient noise tomography results, upper-crustal seismic velocity variations correlate well with surface geology, while slow shear wave velocities dominate the lower crust, indicating a weaker crustal rheology at the bottom. Furthermore, analysis of teleseismic P-wave receiver functions revealed the presence of crustal low-velocity zones associated with Neogene volcanism and sharp Moho variations near tectonic sutures and faults. By combining this new dataset with seismic data recorded by previous seismic deployments and national networks, we will have complete seismic coverage of the entire region, allowing researchers to image beneath Anatolia from mantle to surface with high resolution.

  14. Seismic Isolation Working Meeting Gap Analysis Report

    SciTech Connect

    Coleman, Justin; Sabharwall, Piyush

    2014-09-01

    The ultimate goal in nuclear facility and nuclear power plant operations is operating safely during normal operations and maintaining core cooling capabilities during off-normal events, including external hazards. Understanding the impact external hazards, such as flooding and earthquakes, have on nuclear facilities and NPPs is critical to deciding how to manage these hazards to acceptable levels of risk. From a seismic perspective, the goal is to manage seismic risk, which is determined by convolving the seismic hazard with seismic fragilities (the capacities of systems, structures, and components (SSCs)). There are large uncertainties associated with the evolving nature of seismic hazard curves. Additionally, there are requirements within DOE, and potential requirements within NRC, to reconsider updated seismic hazard curves every 10 years. Therefore an opportunity exists for engineered solutions to manage this seismic uncertainty. One engineered solution is seismic isolation. Current seismic isolation (SI) designs (used in commercial industry) reduce horizontal earthquake loads and protect critical infrastructure from the potentially destructive effects of large earthquakes. The benefit of SI application in the nuclear industry is being recognized, and SI systems have been proposed in the American Society of Civil Engineers (ASCE) 4 standard, to be released in 2014, for light water reactor (LWR) facilities using commercially available technology. However, there is a lack of application in the nuclear industry and uncertainty in implementing the procedures outlined in ASCE-4. An opportunity exists to determine the barriers associated with implementation of the current ASCE-4 standard language.

  15. Key parameter optimization and analysis of stochastic seismic inversion

    NASA Astrophysics Data System (ADS)

    Huang, Zhe-Yuan; Gan, Li-Deng; Dai, Xiao-Feng; Li, Ling-Gao; Wang, Jun

    2012-03-01

    Stochastic seismic inversion is the combination of geostatistics and seismic inversion technology; it integrates information from seismic records, well logs, and geostatistics into a posterior probability density function (PDF) of subsurface models. The Markov chain Monte Carlo (MCMC) method is used to sample the posterior PDF, and the subsurface model characteristics can be inferred by analyzing a set of posterior PDF samples. In this paper, we first introduce stochastic seismic inversion theory, then discuss and analyze four key parameters: the seismic data signal-to-noise ratio (S/N), the variogram, the number of posterior PDF samples, and well density, and propose the optimum selection of these parameters. The analysis results show that the seismic data S/N adjusts the compromise between the influence of the seismic data and of geostatistics on the inversion results, the variogram controls the smoothness of the inversion results, the number of posterior PDF samples determines the reliability of the statistical characteristics derived from the samples, and well density influences the inversion uncertainty. Finally, a comparison between stochastic seismic inversion and deterministic model-based seismic inversion indicates that stochastic seismic inversion can provide more reliable information about the subsurface character.
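
    The MCMC sampling step can be illustrated with a toy one-parameter Metropolis sampler that blends a seismic likelihood with a geostatistical prior; all numbers below are assumed, and the real problem is of course high-dimensional:

    ```python
    # Toy Metropolis sampler for a single "impedance" value, showing how
    # MCMC draws posterior PDF samples (illustration only).
    import numpy as np

    rng = np.random.default_rng(7)
    obs, sigma_d = 2.1, 0.3          # seismic-derived datum and its noise level
    mu_p, sigma_p = 1.8, 0.5         # prior mean/std from well logs + variogram

    def log_post(z):
        return (-0.5 * ((obs - z) / sigma_d) ** 2
                - 0.5 * ((z - mu_p) / sigma_p) ** 2)

    chain, z = [], mu_p
    for _ in range(20_000):
        zp = z + rng.normal(0.0, 0.2)                # random-walk proposal
        if np.log(rng.random()) < log_post(zp) - log_post(z):
            z = zp                                   # accept
        chain.append(z)
    post = np.array(chain[2_000:])                   # discard burn-in
    print("posterior mean, std:", post.mean().round(3), post.std().round(3))
    ```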

  16. A Preliminary Tsunami Vulnerability Analysis for Yenikapi Region in Istanbul

    NASA Astrophysics Data System (ADS)

    Ceren Cankaya, Zeynep; Suzen, Lutfi; Cevdet Yalciner, Ahmet; Kolat, Cagil; Aytore, Betul; Zaytsev, Andrey

    2015-04-01

    One of the main requirements during post-disaster recovery operations is to maintain proper transportation and fluent communication in the disaster areas. Ports and harbors are the main transportation hubs, which must work with proper performance at all times, especially after disasters. The resilience of coastal utilities after earthquakes and tsunamis has major importance for efficient and proper rescue and recovery operations soon after the disaster. Istanbul is a mega city with various coastal utilities located on the north coast of the Sea of Marmara. In the Yenikapi region of Istanbul there are critical coastal utilities and vulnerable coastal structures, and critical activities occur daily. Fishery ports, commercial ports, small craft harbors, passenger terminals of intercity maritime transportation, and waterfront commercial and/or recreational structures are some examples of coastal utilization which are vulnerable to marine disasters. Therefore the vulnerability of the Yenikapi region of Istanbul to tsunamis and other marine hazards is an important issue. In this study, a methodology for vulnerability analysis under tsunami attack is proposed and applied to the Yenikapi region. The high-resolution (1 m) GIS database of the Istanbul Metropolitan Municipality (IMM) is used and analyzed by means of GIS implementation. The bathymetry and topography database and the vector dataset containing all buildings/structures/infrastructures in the study area are obtained for tsunami numerical modeling of the study area. GIS-based tsunami vulnerability assessment is conducted by applying Multi-criteria Decision Making Analysis (MCDA). The tsunami parameters from deterministically defined worst-case scenarios are computed from simulations using the tsunami numerical model NAMI DANCE. The vulnerability parameters in the region due to two different classifications, i) vulnerability of buildings/structures and ii) vulnerability of (human) evacuation

  17. Nonlinear Seismic Analysis of Morrow Point Dam

    SciTech Connect

    Noble, C R; Nuss, L K

    2004-02-20

    This research and development project was sponsored by the United States Bureau of Reclamation (USBR), who are best known for the dams, power plants, and canals it constructed in the 17 western states. The mission statement of the USBR's Dam Safety Office, located in Denver, Colorado, is ''to ensure Reclamation dams do not present unacceptable risk to people, property, and the environment.'' The Dam Safety Office does this by quickly identifying the dams which pose an increased threat to the public, and quickly completing the related analyses in order to make decisions that will safeguard the public and associated resources. The research study described in this report constitutes one element of USBR's research and development work to advance their computational and analysis capabilities for studying the response of dams to strong earthquake motions. This project focused on the seismic response of Morrow Point Dam, which is located 263 km southwest of Denver, Colorado.

  18. Cascade vulnerability for risk analysis of water infrastructure.

    PubMed

    Sitzenfrei, R; Mair, M; Möderl, M; Rauch, W

    2011-01-01

    One of the major tasks in urban water management is failure-free operation, at least most of the time. Accordingly, the reliability of network systems in urban water management has a crucial role. The failure of a component in these systems impacts potable water distribution and urban drainage. Therefore, water distribution and urban drainage systems are categorized as critical infrastructure. Vulnerability is the degree to which a system is likely to experience harm induced by perturbation or stress. However, for risk assessment, it is usually assumed that events and failures are singular and independent, i.e. several simultaneous events and cascading events are left unconsidered. Although failures can be causally linked, such linkage is hardly ever taken into account in risk analysis. To close this gap, this work introduces the term cascade vulnerability for water infrastructure. Cascade vulnerability accounts for cascading and simultaneous events. Following this definition, cascade risk maps are a merger of hazard and cascade vulnerability maps. In this work, cascade vulnerability maps for water distribution systems and urban drainage systems based on the 'Achilles-Approach' are introduced and discussed. It is shown that neglecting cascading effects results in significant underestimation of risk scenarios.
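    Following the definition above, a cascade risk map can be obtained by merging a hazard map with a cascade vulnerability map; the sketch below uses hypothetical grids, an elementwise product as the merge rule, and a purely illustrative cascade uplift factor, none of which come from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical gridded layers over a service area, scores in [0, 1].
hazard = rng.random((50, 50))                 # likelihood of a triggering event
vuln_singular = rng.random((50, 50))          # harm assuming independent failures
vuln_cascade = np.minimum(1.0, vuln_singular * 1.5)  # illustrative cascade uplift

# Risk maps as the merger (elementwise product) of hazard and vulnerability.
risk_singular = hazard * vuln_singular
risk_cascade = hazard * vuln_cascade

# Neglecting cascading effects underestimates risk at every grid cell.
print("mean underestimation:", float((risk_cascade - risk_singular).mean()))
```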

  19. Analysis of sequential exchanges between vulnerable forces

    SciTech Connect

    Canavan, G.H.

    1998-09-04

    A multi-stage, multi-step analysis of sequences of crises or exchanges shows that aggressiveness on one side can induce rapid counter-value strikes by the other, and that knowledge that opponents will later become less aggressive does not mitigate the tendency to strike early in crises.

  20. Betweenness as a Tool of Vulnerability Analysis of Power System

    NASA Astrophysics Data System (ADS)

    Rout, Gyanendra Kumar; Chowdhury, Tamalika; Chanda, Chandan Kumar

    2016-12-01

    Complex network theory finds application in the analysis of power grids, as the two share some common characteristics. Using this theory, critical elements in a power network can be identified. As the vulnerabilities of the elements of a network decide the vulnerability of the network as a whole, in this paper the vulnerability of each element is studied using two complex network measures: betweenness centrality and extended betweenness. Betweenness centrality considers only the topological structure of the power system, whereas extended betweenness is based on both topological and physical properties of the system. In the latter case, electrical properties such as electrical distance, line flow limits, transmission capacities of lines, and the power transfer distribution factor (PTDF) matrix are included. The standard IEEE 57-bus system has been studied using the above indices, and the resulting conclusions are discussed.
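    A minimal sketch of the purely topological variant (betweenness centrality) on a toy graph follows; the extended betweenness with electrical distances and the PTDF matrix is not reproduced, and the graph is hypothetical rather than the IEEE 57-bus system.

```python
import networkx as nx

# Hypothetical small grid; edge weights stand in for electrical line lengths.
G = nx.Graph()
G.add_weighted_edges_from([
    ("A", "B", 1.0), ("B", "C", 1.0), ("C", "D", 2.0),
    ("A", "D", 3.0), ("B", "D", 1.5), ("D", "E", 1.0),
])

# Betweenness: fraction of weighted shortest paths passing through an element.
node_bc = nx.betweenness_centrality(G, weight="weight")
edge_bc = nx.edge_betweenness_centrality(G, weight="weight")

# High-betweenness elements are candidates for the most critical components.
for node, score in sorted(node_bc.items(), key=lambda kv: -kv[1]):
    print(f"node {node}: {score:.3f}")
print("most critical line:", max(edge_bc, key=edge_bc.get))
```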

  1. Multi-waveform classification for seismic facies analysis

    NASA Astrophysics Data System (ADS)

    Song, Chengyun; Liu, Zhining; Wang, Yaojun; Li, Xingming; Hu, Guangmin

    2017-04-01

    Seismic facies analysis provides an effective way to delineate the heterogeneity and compartments within a reservoir. The traditional method classifies seismic facies using a single waveform, which does not consider stratigraphic continuity, so the final facies map may be affected by noise. Therefore, by defining the waveforms in a 3D window as a multi-waveform, we developed a new seismic facies analysis algorithm, multi-waveform classification (MWFC), that combines multilinear subspace learning with self-organizing map (SOM) clustering techniques. In addition, we utilize a multi-window dip search algorithm to extract multi-waveforms, which reduces the uncertainty of facies maps at the boundaries. Testing the proposed method on synthetic data with different S/N, we confirm that our MWFC approach is more robust to noise than the conventional waveform classification (WFC) method. The application to real seismic data from the F3 block in the Netherlands proves that our approach is an effective tool for seismic facies analysis.
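    The clustering stage can be illustrated with a minimal 1D self-organizing map over synthetic waveform vectors; this toy sketch omits the multilinear subspace learning and multi-window dip search steps, and all sizes and parameters are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical data: 500 waveform feature vectors of 32 samples each
# (stand-ins for multi-waveforms after subspace learning).
waveforms = rng.normal(size=(500, 32))

# Minimal 1D self-organizing map with 8 facies prototypes.
n_units, n_iter = 8, 5000
prototypes = rng.normal(size=(n_units, 32))

for t in range(n_iter):
    x = waveforms[rng.integers(len(waveforms))]
    lr = 0.5 * (1 - t / n_iter)                   # decaying learning rate
    sigma = max(2.0 * (1 - t / n_iter), 0.5)      # decaying neighbourhood width
    bmu = np.argmin(np.linalg.norm(prototypes - x, axis=1))  # best-matching unit
    dist = np.abs(np.arange(n_units) - bmu)
    h = np.exp(-dist**2 / (2 * sigma**2))         # neighbourhood function
    prototypes += lr * h[:, None] * (x - prototypes)  # pull neighbours toward x

# Facies label = index of the closest prototype for each waveform.
labels = np.argmin(
    np.linalg.norm(waveforms[:, None, :] - prototypes[None], axis=2), axis=1)
print(np.bincount(labels, minlength=n_units))
```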

  2. Vulnerability analysis for complex networks using aggressive abstraction.

    SciTech Connect

    Colbaugh, Richard; Glass, Kristin L.

    2010-06-01

    Large, complex networks are ubiquitous in nature and society, and there is great interest in developing rigorous, scalable methods for identifying and characterizing their vulnerabilities. This paper presents an approach for analyzing the dynamics of complex networks in which the network of interest is first abstracted to a much simpler, but mathematically equivalent, representation, the required analysis is performed on the abstraction, and the analytic conclusions are then mapped back to the original network and interpreted there. We begin by identifying a broad and important class of complex networks which admit vulnerability-preserving, finite-state abstractions, and develop efficient algorithms for computing these abstractions. We then propose a vulnerability analysis methodology which combines these finite-state abstractions with formal analytics from theoretical computer science to yield a comprehensive vulnerability analysis process for networks of real-world scale and complexity. The potential of the proposed approach is illustrated with a case study involving a realistic electric power grid model and also with brief discussions of biological and social network examples.

  3. Analytical and Experimental Assessment of Seismic Vulnerability of Beam-Column Joints without Transverse Reinforcement in Concrete Buildings

    NASA Astrophysics Data System (ADS)

    Hassan, Wael Mohammed

    Beam-column joints in concrete buildings are key components for ensuring the structural integrity of building performance under seismic loading. Earthquake reconnaissance has reported the substantial damage that can result from inadequate beam-column joints; in some cases, failure of older-type corner joints appears to have led to building collapse. Since the 1960s, many advances have been made to improve the seismic performance of building components, including beam-column joints. New design and detailing approaches are expected to produce new construction that will perform satisfactorily during strong earthquake shaking. Much less attention has been focused on the beam-column joints of older construction, which may be seismically vulnerable. Concrete buildings constructed before the development of ductile detailing requirements in the 1970s normally lack joint transverse reinforcement. The available literature concerning the performance of such joints is relatively limited, but concerns about their performance exist. The current study aimed to improve understanding and assessment of the seismic performance of unconfined exterior and corner beam-column joints in existing buildings. An extensive literature survey was performed, leading to the development of a database of about a hundred tests. Study of the data enabled identification of the most important parameters and the effect of each parameter on seismic performance. The available analytical models and guidelines for strength and deformability assessment of unconfined joints were surveyed and evaluated. In particular, the ASCE 41 existing-building document proved to be substantially conservative in its joint shear strength estimates. Upon identifying deficiencies in these models, two new joint shear strength models, a bond capacity model, and two axial capacity models designed and tailored specifically for unconfined beam-column joints were developed. The proposed models correlate strongly with previous test results. In the laboratory testing phase of

  4. Annotated bibliography, seismicity of and near the island of Hawaii and seismic hazard analysis of the East Rift of Kilauea

    SciTech Connect

    Klein, F.W.

    1994-03-28

    This bibliography is divided into the following four sections: Seismicity of Hawaii and Kilauea Volcano; Occurrence, locations and accelerations from large historical Hawaiian earthquakes; Seismic hazards of Hawaii; and Methods of seismic hazard analysis. It contains 62 references, most of which are accompanied by short abstracts.

  5. An Analysis of the Mt. Meron Seismic Array

    SciTech Connect

    Pasyanos, M E; Ryall, F

    2008-01-10

    We have performed a quick analysis of the Mt. Meron seismic array to monitor regional seismic events in the Middle East. The Meron array is the only current array in the Levant and Arabian Peninsula and, as such, might be useful in contributing to event location, identification, and other analysis. Here, we provide a brief description of the array and a review of the travel time and array analysis done to assess its performance.

  6. Quantitative risk analysis of oil storage facilities in seismic areas.

    PubMed

    Fabbrocino, Giovanni; Iervolino, Iunio; Orlando, Francesca; Salzano, Ernesto

    2005-08-31

    Quantitative risk analysis (QRA) of industrial facilities has to take into account multiple hazards threatening critical equipment. Nevertheless, engineering procedures able to evaluate quantitatively the effect of seismic action are not well established. Indeed, relevant industrial accidents may be triggered by loss of containment following ground shaking or other relevant natural hazards, either directly or through cascade effects ('domino effects'). The issue of integrating structural seismic risk into quantitative probabilistic seismic risk analysis (QpsRA) is addressed in this paper by a representative case study of an oil storage plant with a number of atmospheric steel tanks containing flammable substances. Empirical seismic fragility curves and probit functions, properly defined for both building-like and non-building-like industrial components, have been crossed with the outcomes of probabilistic seismic hazard analysis (PSHA) for a test site located in southern Italy. Once the seismic failure probabilities have been quantified, consequence analysis has been performed for those events which may be triggered by loss of containment following seismic action. Results are combined by means of a specifically developed code in terms of local risk contour plots, i.e. the contour line for the probability of fatal injuries at any point (x, y) in the analysed area. Finally, a comparison with the QRA obtained by considering only process-related top events is reported for reference.
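    The core of the seismic part of such a QRA, crossing a fragility curve with a hazard curve to obtain an annual failure rate, can be sketched as follows; the lognormal fragility parameters and the power-law hazard curve are purely illustrative, not the values used in the paper.

```python
import numpy as np
from scipy.stats import lognorm

# Hypothetical lognormal fragility for an atmospheric steel tank:
# P(loss of containment | PGA), with illustrative median and dispersion.
median_g, beta = 0.6, 0.5
pga = np.linspace(0.01, 2.0, 400)                   # PGA grid in g
p_fail = lognorm.cdf(pga, s=beta, scale=median_g)

# Hypothetical PSHA output: annual rate of exceeding each PGA level.
annual_exceedance = 1e-2 * (pga / 0.1) ** -2.0

# Annual failure rate: integrate fragility against the hazard occurrence density.
occurrence_density = -np.gradient(annual_exceedance, pga)
annual_failure_rate = float(np.sum(p_fail * occurrence_density) * (pga[1] - pga[0]))
print(f"annual rate of seismic loss of containment ~ {annual_failure_rate:.2e}")
```

    The resulting rate would then feed the consequence analysis for the loss-of-containment top events.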

  7. Stochastic seismic analysis in the Messina strait area

    SciTech Connect

    Cacciola, P.; Maugeri, N.; Muscolino, G.

    2008-07-08

    After the 1908 Messina earthquake, significant progress was made in the field of earthquake engineering. Usually, seismic action is represented via the so-called elastic response spectrum, or alternatively by time histories of ground motion acceleration. Due to the random nature of seismic action, alternative representations model the seismic action as a zero-mean Gaussian process fully defined by the so-called power spectral density function. The aim of this paper is a comparative study of the response of linearly behaving structures under the above representations of seismic action, using recorded earthquakes in the Messina strait area. In this regard, a handy method for determining the power spectral density function of recorded earthquakes is proposed. Numerical examples conducted on an existing space truss located in Torre Faro (Messina) show the effectiveness of the stochastic approach for coping with the seismic analysis of structures.
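    The paper's own estimation method is not reproduced here, but a standard way to obtain a power spectral density from a recorded accelerogram is Welch's method, sketched below on a synthetic record; the sampling rate, envelope, and segment length are all assumptions for illustration.

```python
import numpy as np
from scipy import signal

rng = np.random.default_rng(0)

# Synthetic stand-in for a recorded accelerogram: 40 s sampled at 100 Hz,
# modulated noise mimicking a strong-motion envelope.
fs = 100.0
t = np.arange(0.0, 40.0, 1.0 / fs)
envelope = np.exp(-((t - 10.0) ** 2) / 20.0)
accel = envelope * rng.normal(size=t.size)

# Welch estimate of the one-sided power spectral density.
freqs, psd = signal.welch(accel, fs=fs, nperseg=1024)

# A zero-mean Gaussian model of the seismic action is fully defined by this PSD.
print(f"peak power at {freqs[np.argmax(psd)]:.2f} Hz")
```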


  9. A Preliminary Tsunami vulnerability analysis for Bakirkoy district in Istanbul

    NASA Astrophysics Data System (ADS)

    Tufekci, Duygu; Lutfi Suzen, M.; Cevdet Yalciner, Ahmet; Zaytsev, Andrey

    2016-04-01

    The resilience of coastal utilities after earthquakes and tsunamis has major importance for efficient and proper rescue and recovery operations soon after the disasters. Vulnerability assessment of coastal areas under extreme events has major importance for preparedness and for the development of mitigation strategies. The Sea of Marmara has experienced numerous earthquakes as well as associated tsunamis. There is a variety of coastal facilities such as ports, small craft harbors, terminals for maritime transportation, waterfront roads, and business centers, mainly on the north coast of the Sea of Marmara in the megacity of Istanbul. A detailed vulnerability analysis for the Yenikapi region and a detailed resilience analysis for the Haydarpasa port in Istanbul have been studied previously by Cankaya et al. (2015) and Aytore et al. (2015) in the SATREPS project. In this study, the methodology of vulnerability analysis under tsunami attack given in Cankaya et al. (2015) is modified and applied to the Bakirkoy district of Istanbul. The Bakirkoy district is located in the western part of Istanbul and faces the north coast of the Sea of Marmara from 28.77°E to 28.89°E. The high-resolution spatial dataset of the Istanbul Metropolitan Municipality (IMM) is used and analyzed. The bathymetry and topography database and the spatial dataset containing all buildings/structures/infrastructures in the district are collated and utilized for tsunami numerical modeling and the following vulnerability analysis. The tsunami parameters from deterministically defined worst-case scenarios are computed from simulations using the tsunami numerical model NAMI DANCE. The vulnerability assessment parameters in the district are defined according to vulnerability and resilience, and scored by implementation of a GIS-based TVA with appropriate MCDA methods. The risk level is computed using tsunami intensity (level of flow depth from simulations) and TVA results at every location in the Bakirkoy district. The preliminary results are presented and discussed.

  10. Multidimensional seismic data reconstruction using tensor analysis

    NASA Astrophysics Data System (ADS)

    Kreimer, Nadia

    Exploration seismology utilizes the seismic wavefield for prospecting for oil and gas. The seismic reflection experiment consists of deploying sources and receivers on the surface of an area of interest. When the sources are activated, the receivers measure the wavefield that is reflected from different subsurface interfaces and store the information as time series called traces or seismograms. The seismic data depend on two source coordinates, two receiver coordinates and time (a 5D volume). Obstacles in the field and logistical and economic factors constrain seismic data acquisition. Therefore, the wavefield sampling is incomplete in the four spatial dimensions. Seismic data undergo different processes. In particular, the reconstruction process is responsible for correcting sampling irregularities of the seismic wavefield. This thesis focuses on the development of new methodologies for the reconstruction of multidimensional seismic data. It examines techniques based on tensor algebra and proposes three methods that exploit the tensor nature of the seismic data. The fully sampled volume is low-rank in the frequency-space domain; the rank increases when there are missing traces and/or noise. The proposed methods perform rank reduction on frequency slices of the 4D spatial volume. The first method employs the higher-order singular value decomposition (HOSVD) immersed in an iterative algorithm that reinserts weighted observations. The second method uses a sequential truncated SVD on the unfoldings of the tensor slices (SEQ-SVD). The third method formulates the rank reduction problem as a convex optimization problem: the measure of the rank is replaced by the nuclear norm of the tensor, and the alternating direction method of multipliers (ADMM) minimizes the cost function. All three methods have the interesting property that they are robust to curvature of the reflections, unlike many reconstruction methods. Finally, a comparison between the methods is presented.
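    The HOSVD-based rank reduction at the heart of the first method can be sketched on a small synthetic frequency slice; a third-order tensor is used here for brevity (the thesis works with 4D spatial volumes), and the iterative reinsertion of weighted observations is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "frequency slice": a rank-1 third-order tensor plus noise.
a, b, c = (rng.normal(size=10) for _ in range(3))
clean = np.einsum("i,j,k->ijk", a, b, c)
noisy = clean + 0.1 * rng.normal(size=clean.shape)

def unfold(T, mode):
    # Matricize the tensor along one mode.
    return np.moveaxis(T, mode, 0).reshape(T.shape[mode], -1)

def hosvd_denoise(T, rank):
    # Truncated HOSVD: leading singular vectors of each mode unfolding...
    Us = []
    for mode in range(T.ndim):
        U, _, _ = np.linalg.svd(unfold(T, mode), full_matrices=False)
        Us.append(U[:, :rank])
    # ...project onto the core, then expand back to the full tensor.
    core = np.einsum("ijk,ia,jb,kc->abc", T, *Us)
    return np.einsum("abc,ia,jb,kc->ijk", core, *Us)

denoised = hosvd_denoise(noisy, rank=1)
err = lambda X: np.linalg.norm(X - clean) / np.linalg.norm(clean)
print(f"relative error: noisy {err(noisy):.3f} -> denoised {err(denoised):.3f}")
```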

  11. Analysis of Brazilian data for seismic hazard analysis

    NASA Astrophysics Data System (ADS)

    Drouet, S.; Assumpção, M.

    2013-05-01

    Seismic hazard analysis in Brazil is going to be re-assessed in the framework of the Global Earthquake Model (GEM) project. Since the last worldwide Global Seismic Hazard Analysis Project (GSHAP), there has been no specific study in this field in Brazil. Brazil is a stable continental region characterized by low seismic activity. In this particular type of region, seismic hazard assessment is a very hard task due to the limited amount of data available regarding the seismic sources, earthquake catalogue, or ground-motion amplitudes, and the associated uncertainties are very large. This study focuses on data recorded in southeast Brazil, where broadband stations are installed, belonging to two networks: the network managed by the seismology group at IAG-USP in São Paulo, which has existed for about 20 years, and the network managed by the Observatorio Nacional in Rio de Janeiro, which has just been set up. The two networks are now integrated into the national network RSB (Rede Sismográfica Brasileira), which will also include stations in the rest of Brazil currently being installed by the Universities of Brasilia and Natal. There are a couple of events with magnitude greater than 3 recorded at these very sensitive stations, usually at rather large distances. At first sight these data may appear meaningless in the context of seismic hazard, but they can help to improve different parts of the process. The analysis of S-wave Fourier spectra can help to better resolve source, path and site effects in Brazil. For instance, moment magnitudes can be computed from the flat part of the Fourier spectra. These magnitudes are of utmost importance in order to build a homogeneous catalogue in terms of moment magnitude. At the moment, only body-wave magnitudes (or some equivalent scale) are determined routinely for events in Brazil. Attenuation and site effects, especially the high-frequency attenuation known as the kappa effect, will also help to

  12. Temperature-based Instanton Analysis: Identifying Vulnerability in Transmission Networks

    SciTech Connect

    Kersulis, Jonas; Hiskens, Ian; Chertkov, Michael; Backhaus, Scott N.; Bienstock, Daniel

    2015-04-08

    A time-coupled instanton method for characterizing transmission network vulnerability to wind generation fluctuation is presented. To extend prior instanton work to multiple-time-step analysis, line constraints are specified in terms of temperature rather than current. An optimization formulation is developed to express the minimum wind forecast deviation such that at least one line is driven to its thermal limit. Results are shown for an IEEE RTS-96 system with several wind farms.

  13. Detecting seismic activity with a covariance matrix analysis of data recorded on seismic arrays

    NASA Astrophysics Data System (ADS)

    Seydoux, L.; Shapiro, N. M.; de Rosny, J.; Brenguier, F.; Landès, M.

    2016-03-01

    Modern seismic networks record the ground motion continuously at the Earth's surface, providing dense spatial samples of the seismic wavefield. The aim of our study is to analyse these records with statistical array-based approaches to identify coherent time series as a function of time and frequency. Using ideas mainly brought from random matrix theory, we analyse the spatial coherence of the seismic wavefield from the width of the covariance matrix eigenvalue distribution. We propose a robust detection method that can be used for the analysis of weak and emergent signals embedded in background noise, such as volcanic or tectonic tremors and local microseismicity, without any prior knowledge about the studied wavefields. We apply our algorithm to the records of the seismic monitoring network of the Piton de la Fournaise volcano, located on La Réunion Island and composed of 21 receivers with an aperture of ~15 km. This array recorded many teleseismic earthquakes as well as seismovolcanic events during the year 2010. We show that the analysis of the wavefield at frequencies below ~0.1 Hz results in detection of the majority of teleseismic events from the Global Centroid Moment Tensor database. The seismic activity related to the Piton de la Fournaise volcano is well detected at frequencies above 1 Hz.
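    The detection statistic can be sketched as follows: the covariance matrix is estimated across receivers, and the width of its normalized eigenvalue distribution shrinks when a spatially coherent signal dominates. The array size matches the 21 receivers mentioned above, but the signals are synthetic stand-ins and the width definition is one common choice, not necessarily the authors' exact estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

n_sta, n_samp = 21, 600  # 21 receivers, 600 samples per analysis window

def spectral_width(records):
    # Covariance matrix across receivers and its sorted, normalized eigenvalues.
    cov = records @ records.T / records.shape[1]
    eig = np.sort(np.linalg.eigvalsh(cov))[::-1]
    eig = eig / eig.sum()
    # Width of the eigenvalue distribution: a small width means one dominant
    # eigenvalue, i.e. a spatially coherent wavefield (a detection).
    return float(np.sum(eig * np.arange(len(eig))))

noise = rng.normal(size=(n_sta, n_samp))
coherent = np.outer(np.ones(n_sta),
                    np.sin(2 * np.pi * 0.02 * np.arange(n_samp)))

print("noise only     :", round(spectral_width(noise), 2))
print("noise + signal :", round(spectral_width(noise + 3.0 * coherent), 2))
```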

  14. Investigation of techniques for the development of seismic design basis using the probabilistic seismic hazard analysis

    SciTech Connect

    Bernreuter, D.L.; Boissonnade, A.C.; Short, C.M.

    1998-04-01

    The Nuclear Regulatory Commission asked Lawrence Livermore National Laboratory to form a group of experts to assist them in revising the seismic and geologic siting criteria for nuclear power plants, Appendix A to 10 CFR Part 100. This document describes a deterministic approach for determining a Safe Shutdown Earthquake (SSE) Ground Motion for a nuclear power plant site. One disadvantage of this approach is the difficulty of integrating differences of opinions and differing interpretations into seismic hazard characterization. In answer to this, probabilistic seismic hazard assessment methodologies incorporate differences of opinion and interpretations among earth science experts. For this reason, probabilistic hazard methods were selected for determining SSEs for the revised regulation, 10 CFR Part 100.23. However, because these methodologies provide a composite analysis of all possible earthquakes that may occur, they do not provide the familiar link between seismic design loading requirements and engineering design practice. Therefore, approaches used to characterize seismic events (magnitude and distance) which best represent the ground motion level determined with the probabilistic hazard analysis were investigated. This report summarizes investigations conducted at 69 nuclear reactor sites in the central and eastern U.S. for determining SSEs using probabilistic analyses. Alternative techniques are presented along with justification for key choices. 16 refs., 32 figs., 60 tabs.

  15. HANFORD DOUBLE SHELL TANK THERMAL AND SEISMIC PROJECT SEISMIC ANALYSIS OF HANFORD DOUBLE SHELL TANKS

    SciTech Connect

    MACKEY TC; RINKER MW; CARPENTER BG; HENDRIX C; ABATT FG

    2009-01-15

    M&D Professional Services, Inc. (M&D) is under subcontract to Pacific Northwest National Laboratory (PNNL) to perform seismic analysis of the Hanford Site Double-Shell Tanks (DSTs) in support of a project entitled Double-Shell Tank (DST) Integrity Project - DST Thermal and Seismic Analyses. The original scope of the project was to complete an up-to-date comprehensive analysis of record of the DST System at Hanford in support of Tri-Party Agreement Milestone M-48-14. The work described herein was performed in support of the seismic analysis of the DSTs. The thermal and operating loads analysis of the DSTs is documented in Rinker et al. (2004). Although Milestone M-48-14 has been met, Revision 1 is being issued to address external review comments, with emphasis on changes in the modeling of the anchor bolts connecting the concrete dome and the steel primary tank. The work statement provided to M&D (PNNL 2003) required that a nonlinear soil-structure interaction (SSI) analysis be performed on the DSTs. The analysis is required to include the effects of sliding interfaces and fluid sloshing (fluid-structure interaction). SSI analysis has traditionally been treated by frequency-domain computer codes such as SHAKE (Schnabel et al. 1972) and SASSI (Lysmer et al. 1999a). Such frequency-domain programs are limited to the analysis of linear systems. Because of the contact surfaces, the response of the DSTs to a seismic event is inherently nonlinear and consequently outside the range of applicability of the linear frequency-domain programs; that is, the nonlinear response of the DSTs to seismic excitation requires the use of a time-domain code. The capabilities and limitations of the commercial time-domain codes ANSYS® and MSC Dytran® for performing seismic SSI analysis of the DSTs, and the methodology required to perform the detailed seismic analysis of the DSTs, have been addressed in Rinker et al. (2006a). On the basis of the results reported in Rinker et al

  16. A vibration-based health monitoring program for a large and seismically vulnerable masonry dome

    NASA Astrophysics Data System (ADS)

    Pecorelli, M. L.; Ceravolo, R.; De Lucia, G.; Epicoco, R.

    2017-05-01

    Vibration-based health monitoring of monumental structures must rely on efficient and, as far as possible, automatic modal analysis procedures. Relatively low excitation energy provided by traffic, wind and other sources is usually sufficient to detect structural changes, as those produced by earthquakes and extreme events. Above all, in-operation modal analysis is a non-invasive diagnostic technique that can support optimal strategies for the preservation of architectural heritage, especially if complemented by model-driven procedures. In this paper, the preliminary steps towards a fully automated vibration-based monitoring of the world’s largest masonry oval dome (internal axes of 37.23 by 24.89 m) are presented. More specifically, the paper reports on signal treatment operations conducted to set up the permanent dynamic monitoring system of the dome and to realise a robust automatic identification procedure. Preliminary considerations on the effects of temperature on dynamic parameters are finally reported.

  17. Seismic Hazard Analysis as a Controlling Technique of Induced Seismicity in Geothermal Systems

    NASA Astrophysics Data System (ADS)

    Convertito, V.; Sharma, N.; Maercklin, N.; Emolo, A.; Zollo, A.

    2011-12-01

    The effect of induced seismicity in geothermal systems during stimulation and fluid circulation can cover a wide range, from light and unfelt to severe and damaging. If a modern geothermal system is to achieve the greatest efficiency while remaining acceptable from the social point of view, it must be possible to manage the system so as to reduce its possible impact in advance. In this framework, automatic control of the seismic response of the stimulated reservoir is nowadays mandatory, particularly in proximity to densely populated areas. Recently, techniques have been proposed for this purpose, mainly based on the concept of the traffic light. This system provides a tool to decide the level of stimulation rate based on real-time analysis of the induced seismicity and the ongoing ground motion values. However, in some cases the induced effect can be delayed with respect to the time when the reservoir is stimulated. Thus, a controlling technique able to estimate the ground motion levels at different time scales can help to better control the geothermal system. Here we present an adaptation of classical probabilistic seismic hazard analysis to the case where the seismicity rate as well as the propagation medium properties are not constant in time. For modeling purposes we use a non-homogeneous seismicity model, in which the seismicity rate and the b-value of the recurrence relationship change with time. Additionally, as a further controlling procedure, we propose a moving-time-window analysis of the recorded peak ground-motion values aimed at monitoring changes in the propagation medium. In fact, for the same set of magnitude values recorded at the same stations, we expect that on average peak ground motion values attenuate in the same way. As a consequence, the residual differences can reasonably be ascribed to changes in medium properties. These changes can be modeled and directly introduced into the hazard integral. We applied the proposed

  18. Seismic refraction analysis: the path forward

    USGS Publications Warehouse

    Haines, Seth S.; Zelt, Colin; Doll, William

    2012-01-01

    Seismic Refraction Methods: Unleashing the Potential and Understanding the Limitations; Tucson, Arizona, 29 March 2012. A workshop focused on seismic refraction methods took place on 29 March 2012, associated with the 2012 Symposium on the Application of Geophysics to Engineering and Environmental Problems. This workshop was convened to assess the current state of the science and discuss paths forward, with a primary focus on near-surface problems but with an eye on all applications. The agenda included talks on these topics from a number of experts, interspersed with discussion, and a dedicated discussion period to finish the day. Discussion proved lively at times, and workshop participants delved into many topics central to seismic refraction work.

  19. Probabilistic Seismic Hazard Disaggregation Analysis for the South of Portugal

    NASA Astrophysics Data System (ADS)

    Rodrigues, I.; Sousa, M.; Teves-Costa, P.

    2010-12-01

    Probabilistic seismic hazard disaggregation analysis was performed and seismic scenarios were identified for southern mainland Portugal. This region's seismicity is characterized by small and moderate magnitude events and by the sporadic occurrence of large earthquakes (e.g. the 1755 Lisbon earthquake). Thus, the Portuguese Civil Protection Agency (ANPC) sponsored a collaborative research project for the study of the seismic and tsunami risks in the Algarve (project ERSTA). In the framework of this project, a series of new developments were obtained, namely the revision of the seismic catalogue (IM, 2008), the delineation of new seismogenic zones affecting the Algarve region, which reflects the growing knowledge of this region's seismotectonic context, the derivation of new spectral attenuation laws (Carvalho and Campos Costa, 2008) and the revision of the probabilistic seismic hazard (Sousa et al. 2008). Seismic hazard was disaggregated considering different spaces of random variables, namely bivariate conditional hazard distributions of X-Y (seismic source latitude and longitude) and multivariate 4D conditional hazard distributions of M-(X-Y)-ɛ (ɛ being the deviation of ground motion from the median value predicted by an attenuation model). These procedures were performed for peak ground acceleration (PGA) and for the 5% damped 1.0 and 2.5 Hz spectral acceleration levels for three return periods: 95, 475 and 975 years. The seismic scenarios controlling the hazard at a given ground motion level were identified as the modal values of the 4D disaggregation analysis for each of the 84 parishes of the Algarve region. Those scenarios, based on a probabilistic analysis, are meant to be used in emergency planning as a complement to the historical scenarios that severely affected this region. Seismic scenarios share a small number of geographical locations across all return periods. Moreover, the seismic hazard of most of the Algarve's parishes is dominated by the seismicity located

  20. A transferable approach towards rapid inventory data capturing for seismic vulnerability assessment using open-source geospatial technologies

    NASA Astrophysics Data System (ADS)

    Wieland, M.; Pittore, M.; Parolai, S.; Zschau, J.

    2012-04-01

    Geospatial technologies are increasingly being used in pre-disaster vulnerability assessment and post-disaster impact assessment for different types of hazards. The use of remote sensing data in particular has been strongly promoted in recent years due to its capability of providing up-to-date information over large areas at comparatively low cost, with increasingly high spatial, temporal and spectral resolution. Despite its clear potential, a purely remote-sensing-based approach has its limitations, in that it is only capable of providing a bird's-eye view of the objects of interest. The additional use of omnidirectional imaging can provide the necessary street view, which furthermore allows a rapid visual screening of a building's façade. In this context, we propose an integrated approach to rapid inventory data capturing for the assessment of the structural vulnerability of buildings in case of an earthquake. Globally available low-cost data sources are preferred, and the tools are developed on an open-source basis to allow for a high degree of transferability and usability. On a neighbourhood scale, satellite images of medium spatial but high temporal and spectral resolution are analysed to outline areas of homogeneous urban structure. Following a proportional allocation scheme, representative sample areas for each urban structure type are selected for a more detailed analysis of the building stock with high-resolution image data. On a building-by-building scale, a ground-based rapid visual survey is performed using an omnidirectional imaging system driven around by car inside the identified sample areas. Processing of the acquired images allows for an extraction of vulnerability-related features of single buildings (e.g. building height, detection of soft storeys). An analysis of high-resolution satellite images provides further inventory features (e.g. footprint area, shape irregularity). Since we are dealing with information coming from

  1. The SeIsmic monitoring and vulneraBilitY framework for civiL protection (SIBYL) Project: An overview and preliminary results

    NASA Astrophysics Data System (ADS)

    Fleming, Kevin; Parolai, Stefano; Iervolino, Iunio; Pitilakis, Kyriazis; Petryna, Yuriy

    2016-04-01

    The SIBYL project is setting out to contribute to enhancing the capacity of Civil Protection (CP) authorities to rapidly and cost-effectively assess the seismic vulnerability of the built environment. The reason for this arises from the occurrence of seismic swarms or foreshocks, which leads to the requirement that CP authorities must rapidly assess the threatened area's vulnerability. This is especially important for those regions where there is a dearth of up-to-date and reliable information. The result will be a multi-faceted framework, made up of methodologies and software tools, that provides information to advise decision makers as to the most appropriate preventative actions to be taken. It will cover cases where there is a need for short-notice vulnerability assessment in a pre-event situation, and the monitoring of the built environment's dynamic vulnerability during a seismic sequence. Coupled with this will be the ability to stimulate long-term management plans, independent of the hazard or disaster of concern. The monitoring itself will involve low-cost sensing units which may be easily installed in critical infrastructures. The framework will be flexible enough to be employed over multiple spatial scales, and it will be developed with a modular structure which will ease its applicability to other natural hazard types. Likewise, it will be able to be adapted to the needs of CP authorities in different countries within their own hazard context. This presentation therefore provides an overview of the aims and expected outcomes of SIBYL, while explaining the tools currently being developed and refined, as well as preliminary results of several field campaigns.

  2. Seismic analysis for translational failure of landfills with retaining walls.

    PubMed

    Feng, Shi-Jin; Gao, Li-Ya

    2010-11-01

    In seismic impact zones, seismic force can be a major triggering mechanism for translational failures of landfills. The scope of this paper is to develop a three-part wedge method for the seismic analysis of translational failures of landfills with retaining walls, from which an approximate solution for the factor of safety can be calculated. Unlike previous conventional limit equilibrium methods, the new method is capable of revealing the effects of both the solid waste shear strength and the retaining wall on translational failures of landfills during earthquakes. Parameter studies of the developed method show that the factor of safety decreases with increasing seismic coefficient, while it increases quickly with the minimum friction angle beneath the waste mass for various horizontal seismic coefficients. Increasing the minimum friction angle beneath the waste mass appears to be more effective than any other parameter for increasing the factor of safety under the considered conditions. Thus, selecting liner materials with a higher friction angle will considerably reduce the potential for translational failures of landfills during earthquakes. The factor of safety gradually increases with the height of the retaining wall for various horizontal seismic coefficients; a higher retaining wall is beneficial to the seismic stability of the landfill, and simply ignoring the retaining wall leads to serious underestimation of the factor of safety. In addition, an approximate solution for the yield acceleration coefficient of the landfill is presented based on the proposed method.
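    A single-wedge pseudo-static sketch illustrates the roles of the horizontal seismic coefficient and the basal friction angle; the paper's three-part wedge method with a retaining wall is not reproduced, and all values below are hypothetical.

```python
import numpy as np

# Pseudo-static factor of safety for one sliding wedge on a planar base.
def factor_of_safety(weight, slope_deg, friction_deg, k_h):
    beta = np.radians(slope_deg)     # inclination of the sliding surface
    phi = np.radians(friction_deg)   # minimum friction angle beneath the waste
    normal = weight * (np.cos(beta) - k_h * np.sin(beta))   # normal force
    driving = weight * (np.sin(beta) + k_h * np.cos(beta))  # driving force
    return normal * np.tan(phi) / driving

for k_h in (0.0, 0.1, 0.2, 0.3):  # horizontal seismic coefficient
    print(f"k_h={k_h:.1f}  FS={factor_of_safety(1.0, 15, 22, k_h):.2f}")
```

    Consistent with the parameter study above, the factor of safety drops as the seismic coefficient grows and rises quickly with the friction angle beneath the waste mass.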

  3. Joint analysis of the seismic data and velocity gravity model

    NASA Astrophysics Data System (ADS)

    Belyakov, A. S.; Lavrov, V. S.; Muchamedov, V. A.; Nikolaev, A. V.

    2016-03-01

    We performed a joint analysis of the seismic noise recorded at the Japanese Ogasawara station, located on Titijima Island in the Philippine Sea, using the STS-2 seismograph at the OSW station in the winter period of January 1-15, 2015, against the background of a velocity gravity model. The graphs prove the existence of a cause-and-effect relation between the seismic noise and gravity and allow us to consider it as a desired signal.

  4. Grandiose and vulnerable narcissism: a nomological network analysis.

    PubMed

    Miller, Joshua D; Hoffman, Brian J; Gaughan, Eric T; Gentile, Brittany; Maples, Jessica; Keith Campbell, W

    2011-10-01

    Evidence has accrued to suggest that there are 2 distinct dimensions of narcissism, which are often labeled grandiose and vulnerable narcissism. Although individuals high on either of these dimensions interact with others in an antagonistic manner, they differ on other central constructs (e.g., Neuroticism, Extraversion). In the current study, we conducted an exploratory factor analysis of 3 prominent self-report measures of narcissism (N=858) to examine the convergent and discriminant validity of the resultant factors. A 2-factor structure was found, which supported the notion that these scales include content consistent with 2 relatively distinct constructs: grandiose and vulnerable narcissism. We then compared the similarity of the nomological networks of these dimensions in relation to indices of personality, interpersonal behavior, and psychopathology in a sample of undergraduates (n=238). Overall, the nomological networks of vulnerable and grandiose narcissism were unrelated. The current results support the need for a more explicit parsing of the narcissism construct at the level of conceptualization and assessment.

  5. The Algerian Seismic Network: Performance from data quality analysis

    NASA Astrophysics Data System (ADS)

    Yelles, Abdelkarim; Allili, Toufik; Alili, Azouaou

    2013-04-01

    Seismic monitoring in Algeria underwent a great change after the Boumerdes earthquake of May 21st, 2003. Indeed, the installation of a new digital seismic network (ADSN) drastically upgraded the previous analog telemetry network. During the last four years, the number of stations in operation has greatly increased to 66 stations, with 15 broadband, 2 very broadband and 47 short-period sensors, plus 21 accelerometers, connected in real time using various modes of transmission (VSAT, ADSL, GSM, ...) and managed by Antelope software. The spatial distribution of these stations covers most of northern Algeria from east to west. Since the network became operational, a significant number of local, regional and teleseismic events have been located by automatic processing, revised and archived in databases. This new dataset is characterized by accurate automatic locations of local seismicity and the ability to determine focal mechanisms. Periodically, recorded data including earthquakes, calibration pulses and cultural noise are checked using power spectral density (PSD) analysis to determine the noise level. The data quality of ADSN broadband stations is controlled in quasi-real time using the PQLX software by computing PDFs and PSDs of the recordings. Other tools and programs allow monitoring and maintenance of the entire electronic system, for example checking the power state of the system, the mass positions of the sensors and the environmental conditions (temperature, humidity, air pressure) inside the vaults. The new design of the network supports many aspects of real-time seismology: seismic monitoring, rapid earthquake determination, alert messages, moment tensor estimation, seismic source determination, shakemap calculation, etc. Adherence to international standards permits contributions to regional seismic monitoring and the Mediterranean warning system. The next two years, with the acquisition of new seismic equipment to reach 50 new BB stations, led to

  6. Kernel Smoothing Methods for Non-Poissonian Seismic Hazard Analysis

    NASA Astrophysics Data System (ADS)

    Woo, Gordon

    2017-04-01

    For almost fifty years, the mainstay of probabilistic seismic hazard analysis has been the methodology developed by Cornell, which assumes that earthquake occurrence is a Poisson process and that the spatial distribution of epicentres can be represented by a set of polygonal source zones within which seismicity is uniform. Building on Vere-Jones' use of kernel smoothing methods for earthquake forecasting, these methods were adapted in 1994 by the author for application to probabilistic seismic hazard analysis. There is then no need for ambiguous boundaries of polygonal source zones, nor for the hypothesis of time independence of earthquake sequences. In Europe, there are many regions where seismotectonic zones are not well delineated and where there is a dynamic stress interaction between events, so that they cannot be described as independent. Starting with the Amatrice earthquake of 24 August 2016, the damaging earthquakes that followed in Central Italy over subsequent months were not independent events. Removing foreshocks and aftershocks is not only an ill-defined task; it has a material effect on seismic hazard computation. Because of the spatial dispersion of epicentres, and the clustering of magnitudes of the largest events in a sequence, which might all be around magnitude 6, the specific event causing the highest ground motion can vary from one site location to another. Where significant active faults have been clearly identified geologically, they should be modelled as individual seismic sources. The remaining background seismicity should be modelled as non-Poissonian using statistical kernel smoothing methods. This approach was first applied for seismic hazard analysis at a UK nuclear power plant two decades ago, and should be included within logic trees for future probabilistic seismic hazard analysis at critical installations within Europe. In this paper, various salient European applications are given.
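    The idea of replacing polygonal source zones with a smoothed activity rate can be sketched with a standard Gaussian kernel density estimate over a synthetic epicentre catalogue; the author's specific kernel and bandwidth choices are not reproduced, and the coordinates are invented.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(5)

# Hypothetical epicentre catalogue: two diffuse clusters (lon, lat in degrees).
cluster_a = rng.normal([13.2, 42.7], 0.15, size=(120, 2))
cluster_b = rng.normal([13.6, 43.1], 0.25, size=(60, 2))
epicentres = np.vstack([cluster_a, cluster_b]).T   # shape (2, N)

# Kernel-smoothed activity-rate density: no polygonal source zones needed.
kde = gaussian_kde(epicentres)

# Evaluate the smoothed rate on a grid, ready for use in the hazard integral.
lon, lat = np.meshgrid(np.linspace(12.8, 14.0, 50), np.linspace(42.3, 43.5, 50))
density = kde(np.vstack([lon.ravel(), lat.ravel()])).reshape(lon.shape)
idx = density.argmax()
print("density peaks near lon/lat:", lon.ravel()[idx], lat.ravel()[idx])
```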

  7. A vulnerability analysis for a drought vulnerable catchment in South-Eastern Austria

    NASA Astrophysics Data System (ADS)

    Hohmann, Clara; Kirchengast, Gottfried; Birk, Steffen

    2016-04-01

    To detect uncertainties and thresholds in a drought-vulnerable region, we focus on a typical river catchment of the Austrian south-eastern Alpine forelands with good data availability, the Raab valley. This mid-latitude region in the south-east of the Austrian state of Styria (~47° N, ~16° E) exhibits a strong temperature increase over the last decades. Especially the mean summer temperatures (June to August) show a strong increase (~0.7 °C per decade) over the last decades (1971-2015) (Kabas et al., Meteorol. Z. 20, 277-289, 2011; pers. comm., 2015). The Styrian Raab valley, with a catchment size of 986 km2, has already struggled with drought periods (e.g., the summers of 1992, 2001 and 2003). Thus, it is important to know what happens if warm and dry periods occur more frequently. We therefore analyze which sensitivities and related uncertainties exist, which thresholds might be crossed, and what the effects on the different components of the water balance equation are, in particular on runoff, soil moisture, groundwater recharge, and evapotranspiration. We use the mainly physics-based hydrological Water Flow and Balance Simulation Model (WaSiM), developed at ETH Zurich (Schulla, Diss., ETH Zurich, CH, 1997). The model is well established and widely used for hydrological modeling at a diversity of spatial and temporal resolutions. We choose a model setup which is as simple as possible but as complex as necessary to perform sensitivity studies on uncertainties and thresholds in the context of climate change. In order to assess the model performance under a wide range of conditions, the calibration and validation are performed with a split sample for dry and wet periods. With the calibrated and validated model we perform a low-flow vulnerability analysis ("stress test") with a focus on drought-related conditions. To this end we simulate changes in weather and climate (e.g., 20% and 50% less precipitation, 2 °C and 5 °C higher temperature), changes in land use and

  8. Seismic Hazard analysis of Adjaria Region in Georgia

    NASA Astrophysics Data System (ADS)

    Jorjiashvili, Nato; Elashvili, Mikheil

    2014-05-01

    The most commonly used approach to determining seismic-design loads for engineering projects is probabilistic seismic-hazard analysis (PSHA). The primary output from a PSHA is a hazard curve showing the variation of a selected ground-motion parameter, such as peak ground acceleration (PGA) or spectral acceleration (SA), against the annual frequency of exceedance (or its reciprocal, the return period). The design value is the ground-motion level that corresponds to a preselected design return period. For many engineering projects, such as standard buildings and typical bridges, the seismic loading is taken from the appropriate seismic-design code, the basis of which is usually a PSHA. For more important engineering projects, where the consequences of failure are more serious, such as dams and chemical plants, it is more usual to obtain the seismic-design loads from a site-specific PSHA, in general using much longer return periods than those governing code-based design. The calculation of probabilistic seismic hazard was performed using the software CRISIS2007 (Ordaz, M., Aguilar, A., and Arboleda, J., Instituto de Ingeniería, UNAM, Mexico). CRISIS implements a classical probabilistic seismic hazard methodology where seismic sources can be modelled as points, lines and areas. In the case of area sources, the software offers an integration procedure that takes advantage of a triangulation algorithm used for seismic source discretization. This solution improves calculation efficiency while maintaining a reliable description of source geometry and seismicity. Additionally, supplementary filters (e.g. fixing a site-source distance that excludes from the calculation sources at great distance) allow the program to balance precision and efficiency during hazard calculation. Earthquake temporal occurrence is assumed to follow a Poisson process, and the code facilitates two types of MFDs: a truncated exponential Gutenberg-Richter [1944] magnitude distribution and a characteristic magnitude
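    One of the two magnitude-frequency distributions mentioned, the truncated exponential Gutenberg-Richter model, can be sketched as follows; the a- and b-values and magnitude bounds are illustrative, not those of the Adjaria source model.

```python
import numpy as np

# Truncated exponential Gutenberg-Richter recurrence, a common MFD in PSHA.
a, b = 4.0, 1.0            # log10 N(M >= 0) per year, and the b-value
m_min, m_max = 4.5, 7.5    # magnitude range contributing to hazard

def annual_rate_exceeding(m):
    """Annual rate of events with magnitude >= m, truncated at m_max."""
    beta = b * np.log(10)
    rate_min = 10 ** (a - b * m_min)   # rate of all events above m_min
    num = np.exp(-beta * (m - m_min)) - np.exp(-beta * (m_max - m_min))
    den = 1.0 - np.exp(-beta * (m_max - m_min))
    return rate_min * np.clip(num / den, 0.0, None)

for m in (5.0, 6.0, 7.0):
    rate = annual_rate_exceeding(m)
    print(f"M>={m}: rate={rate:.4f}/yr, return period={1 / rate:.0f} yr")
```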

  9. A new passive seismic method based on seismic interferometry and multichannel analysis of surface waves

    NASA Astrophysics Data System (ADS)

    Cheng, Feng; Xia, Jianghai; Xu, Yixian; Xu, Zongbo; Pan, Yudi

    2015-06-01

    We propose a new passive seismic method (PSM) based on seismic interferometry and multichannel analysis of surface waves (MASW) to meet the demand for increasing investigation depth by acquiring surface-wave data in a low-frequency range (1 Hz ≤ f ≤ 10 Hz). We utilize seismic interferometry to sort common virtual source gathers (CVSGs) from ambient noise and analyze the obtained CVSGs to construct a 2D shear-wave velocity (Vs) map using MASW. Standard ambient noise processing procedures were applied to the computation of cross-correlations. To enhance the signal-to-noise ratio (SNR) of the empirical Green's functions, a new weighted stacking method was implemented. In addition, we propose a bidirectional shot mode based on the virtual source method to sort CVSGs repeatedly. The PSM was applied to two field data examples. For the test along the Han River levee, the results of the PSM were compared with the improved roadside passive MASW and the spatial autocorrelation method (SPAC). For the test in the Western Junggar Basin, the PSM was applied to a 70 km long linear survey array with a prominent directional urban noise source, and a 60 km long Vs profile reaching 1.5 km in depth was mapped. Further, a comparison of dispersion measurements was made between the PSM and the frequency-time analysis (FTAN) technique to assess the accuracy of the PSM. These examples and comparisons demonstrate that the new method is efficient, flexible, and capable of studying near-surface velocity structures from seismic ambient noise.
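    The interferometric step, retrieving an inter-receiver travel time by cross-correlating ambient noise, can be sketched as follows; the stacking, preprocessing, and CVSG sorting of the actual workflow are omitted, and all numbers are synthetic.

```python
import numpy as np

rng = np.random.default_rng(0)

fs, n = 100.0, 6000        # one minute of ambient noise at 100 Hz (synthetic)
delay = 150                # true inter-receiver delay in samples (1.5 s)

source = rng.normal(size=n)                 # diffuse ambient-noise wavefield
rec_a = source
rec_b = np.roll(source, delay) + 0.5 * rng.normal(size=n)  # delayed + local noise

# Cross-correlation approximates the empirical Green's function between receivers.
xcorr = np.correlate(rec_b, rec_a, mode="full")
lags = np.arange(-n + 1, n)
travel_time = lags[np.argmax(xcorr)] / fs
print(f"recovered travel time: {travel_time:.2f} s (true {delay / fs:.2f} s)")
```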

  10. Probabilistic properties of injection induced seismicity - implications for the seismic hazard analysis

    NASA Astrophysics Data System (ADS)

    Lasocki, Stanislaw; Urban, Pawel; Kwiatek, Grzegorz; Martinez-Garzón, Particia

    2017-04-01

    Injection induced seismicity (IIS) is an undesired dynamic rockmass response to massive fluid injections, including reactions to hydro-fracturing for shale gas exploitation. The complexity and changeability of the technological factors that induce IIS may result in significant deviations of the observed distributions of seismic process parameters from the models that perform well for natural, tectonic seismic processes. Classic formulations of probabilistic seismic hazard analysis for natural seismicity assume the seismic marked point process to be a stationary Poisson process whose marks, the magnitudes, are governed by an exponential distribution arising from the Gutenberg-Richter relation. It is well known that the use of an inappropriate earthquake occurrence model and/or an inappropriate magnitude distribution model leads to significant systematic errors in hazard estimates. It is therefore of paramount importance to check whether these assumptions on the seismic process, commonly used for natural seismicity, can be safely used in IIS hazard problems or not. Seismicity accompanying shale gas operations is widely studied in the framework of the project "Shale Gas Exploration and Exploitation Induced Risks" (SHEER). Here we present results of SHEER project investigations of such seismicity from Oklahoma and of a proxy of such seismicity: IIS data from The Geysers geothermal field. We attempt to answer the following questions: • Do IIS earthquakes follow the Gutenberg-Richter distribution law, so that the magnitude distribution can be modelled by an exponential distribution? • Is the occurrence process of IIS earthquakes Poissonian? Is it segmentally Poissonian? If yes, how are these segments linked to cycles of technological operations? Statistical tests indicate that the exponential magnitude distribution model arising from the Gutenberg-Richter relation is, in general, inappropriate. The magnitude distribution can be complex, multimodal, with no ready

  11. HANFORD DOUBLE SHELL TANK (DST) THERMAL & SEISMIC PROJECT SEISMIC ANALYSIS OF HANFORD DOUBLE SHELL TANKS

    SciTech Connect

    MACKEY, T.C.

    2006-03-17

    M&D Professional Services, Inc. (M&D) is under subcontract to Pacific Northwest National Laboratory (PNNL) to perform seismic analysis of the Hanford Site double-shell tanks (DSTs) in support of a project entitled ''Double-Shell Tank (DST) Integrity Project - DST Thermal and Seismic Analyses''. The overall scope of the project is to complete an up-to-date comprehensive analysis of record of the DST system at Hanford in support of Tri-Party Agreement Milestone M-48-14. The work described herein was performed in support of the seismic analysis of the DSTs. The thermal and operating loads analysis of the DSTs is documented in Rinker et al. (2004). The work statement provided to M&D (PNNL 2003) required that the seismic analysis of the DSTs assess the impacts of potentially non-conservative assumptions in previous analyses and account for the additional soil mass due to the as-found soil density increase, the effects of material degradation, additional thermal profiles applied to the full structure including the soil-structure response with the footings, the non-rigid (low-frequency) response of the tank roof, the asymmetric seismic-induced soil loading, the structural discontinuity between the concrete tank wall and the support footing, and the sloshing of the tank waste. The seismic analysis considers the interaction of the tank with the surrounding soil and the effects of the primary tank contents. The DSTs and the surrounding soil are modeled as a system of finite elements. The depth and width of the soil incorporated into the analysis model are sufficient to obtain appropriately accurate analytical results. The analyses required to support the work statement differ from previous analyses of the DSTs in that the soil-structure interaction (SSI) model includes several (nonlinear) contact surfaces in the tank structure, and the contained waste must be modeled explicitly in order to capture the fluid-structure interaction behavior between the primary tank and contained

  12. Weighted network analysis of earthquake seismic data

    NASA Astrophysics Data System (ADS)

    Chakraborty, Abhijit; Mukherjee, G.; Manna, S. S.

    2015-09-01

    Three different earthquake seismic data sets are used to construct earthquake networks following the prescription of Abe and Suzuki (2004). It has been observed that different links of these networks appear with very different strengths. This prompted us to extend the study of earthquake networks by considering them as weighted networks. Several properties of such weighted networks are found to be quite different from those of their unweighted counterparts.

  13. The Uncertainty in the Local Seismic Response Analysis

    SciTech Connect

    Pasculli, A.; Pugliese, A.; Romeo, R. W.; Sano, T.

    2008-07-08

    This paper shows the influence exerted on local seismic response analysis by considering dispersion and uncertainty in the seismic input as well as in the dynamic properties of soils. In a first attempt, a 1D numerical model is developed that accounts for both the aleatory nature of the input motion and the stochastic variability of the dynamic properties of soils. The seismic input is introduced in a non-conventional way, through a power spectral density, for which an elastic response spectrum, derived for instance from a conventional seismic hazard analysis, is required with an appropriate level of reliability. The uncertainty in the geotechnical properties of soils is instead investigated through a well-known simulation technique (the Monte Carlo method) for the construction of statistical ensembles. The result of a conventional local seismic response analysis, given by a deterministic elastic response spectrum, is replaced in our approach by a set of statistical elastic response spectra, each one characterized by an appropriate level of probability of being reached or exceeded. The analyses have been carried out for a well-documented real case study. Lastly, we anticipate a 2D numerical analysis to investigate also the spatial variability of soil properties.

  14. Seismic hazards in Thailand: a compilation and updated probabilistic analysis

    NASA Astrophysics Data System (ADS)

    Pailoplee, Santi; Charusiri, Punya

    2016-06-01

    A probabilistic seismic hazard analysis (PSHA) for Thailand was performed and compared to those of previous works. This PSHA was based upon (1) the most up-to-date paleoseismological data (slip rates), (2) the seismic source zones, (3) the seismicity parameters (a and b values), and (4) the strong ground-motion attenuation models suggested as being suitable for Thailand. For the PSHA mapping, both the ground shaking and the probability of exceedance (POE) were analyzed and mapped using various methods of presentation. In addition, site-specific PSHAs were demonstrated for ten major provinces within Thailand. For instance, ground shaking of 0.1-0.4 g and 0.1-0.2 g was found for a 2% and 10% POE in the next 50 years, respectively, in western Thailand, defining this area as the most earthquake-prone region evaluated in Thailand. In a comparison between the ten selected provinces, the Kanchanaburi and Tak provinces had comparatively high seismic hazards, and therefore effective mitigation plans for these areas should be made. Although Bangkok was defined as lying within a low seismic hazard zone in this PSHA, a further study of seismic wave amplification due to the soft soil beneath Bangkok is required.
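
    The POE levels quoted above map to mean return periods through the usual Poisson relation POE = 1 - exp(-t/T). The snippet below is a generic illustration (not the authors' code) recovering the familiar ~2,475- and ~475-year return periods for 2% and 10% POE in 50 years.

      # Return period T implied by a probability of exceedance over t years.
      import math

      def return_period(poe, t=50.0):
          return -t / math.log(1.0 - poe)

      print(return_period(0.02))   # ~2475 years
      print(return_period(0.10))   # ~475 years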

  15. A robust polynomial principal component analysis for seismic noise attenuation

    NASA Astrophysics Data System (ADS)

    Wang, Yuchen; Lu, Wenkai; Wang, Benfeng; Liu, Lei

    2016-12-01

    Random and coherent noise attenuation is a significant aspect of seismic data processing, especially for pre-stack seismic data flattened by normal moveout correction or migration. Signal extraction is widely used for pre-stack seismic noise attenuation. Principal component analysis (PCA), one of the multi-channel filters, is a common tool to extract seismic signals, and can be realized by singular value decomposition (SVD). However, when applying the traditional PCA filter to seismic signal extraction, the result is unsatisfactory, with some artifacts, when the seismic data are contaminated by random and coherent noise. In order to directly extract the desired signal and fix those artifacts at the same time, we take into consideration the amplitude variation with offset (AVO) property and thus propose a robust polynomial PCA algorithm. In this algorithm, a polynomial constraint is used to optimize the coefficient matrix. In order to simplify this complicated problem, a series of sub-optimal problems are designed and solved iteratively. After that, the random and coherent noise can be effectively attenuated simultaneously. Applications on synthetic and real data sets show that our proposed algorithm can better suppress random and coherent noise and better preserve the desired signals, compared with local polynomial fitting, conventional PCA and an L1-norm based PCA method.
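
    For reference, the conventional PCA filter that the proposed robust polynomial variant improves upon can be sketched in a few lines via the SVD: keep the leading principal component(s) of the flattened gather as the signal estimate. The data matrix below is a random stand-in, not seismic data.

      # Conventional (baseline) PCA/SVD filtering of a 2-D gather.
      import numpy as np

      def pca_filter(gather, rank=1):
          # keep the leading `rank` principal components (low-rank signal model)
          U, s, Vt = np.linalg.svd(gather, full_matrices=False)
          return (U[:, :rank] * s[:rank]) @ Vt[:rank, :]

      noisy = np.random.default_rng(1).normal(size=(60, 500))  # stand-in traces
      signal_estimate = pca_filter(noisy, rank=1)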

  16. State of art of seismic design and seismic hazard analysis for oil and gas pipeline system

    NASA Astrophysics Data System (ADS)

    Liu, Aiwen; Chen, Kun; Wu, Jian

    2010-06-01

    The purpose of this paper is to adopt the uniform confidence method in both water pipeline design and oil-gas pipeline design. Based on the importance of a pipeline and the consequences of its failure, oil and gas pipelines can be classified into three pipe classes, with exceedance probabilities over 50 years of 2%, 5% and 10%, respectively. Performance-based design requires more information about ground motion, which should be obtained by evaluating seismic safety for the pipeline engineering site. Different from a city's water pipeline network, a long-distance oil and gas pipeline system is a spatially, linearly distributed system. For uniform confidence in seismic safety, a long-distance oil and gas pipeline formed of pump stations and different-class pipe segments should be considered as a whole system when analyzing seismic risk. Considering the uncertainty of earthquake magnitude, design-basis fault displacements corresponding to the different pipeline classes are proposed to improve deterministic seismic hazard analysis (DSHA). A new empirical relationship between the maximum fault displacement and the surface-wave magnitude is obtained with the supplemented earthquake data in East Asia. The estimation of fault displacement for a refined-oil pipeline in the Wenchuan MS 8.0 earthquake is introduced as an example in this paper.
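
    Empirical relationships of this kind typically take the log-linear form log10(Dmax) = a + b * Ms. The sketch below shows how a design-basis displacement would be evaluated from such a regression; the coefficients are generic placeholders of the Wells-and-Coppersmith type, not the values fitted to the East Asia data in the paper.

      # Log-linear magnitude-displacement regression (placeholder coefficients).
      def max_fault_displacement(ms, a=-5.46, b=0.82):
          # returns maximum fault displacement in metres for magnitude ms
          return 10.0 ** (a + b * ms)

      print(max_fault_displacement(8.0))  # ~13 m with these placeholder values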

  17. Rethinking vulnerability analysis and governance with emphasis on a participatory approach.

    PubMed

    Rossignol, Nicolas; Delvenne, Pierre; Turcanu, Catrinel

    2015-01-01

    This article draws on vulnerability analysis as it emerged as a complement to classical risk analysis, and it aims to explore its capacity to nurture risk and vulnerability governance actions. An analysis of the literature on vulnerability analysis allows us to formulate a threefold critique. First, vulnerability analysis has been treated separately in the natural and technological hazards fields. This separation prevents vulnerability analysis from unleashing the full range of its potential, as it constrains appraisals into artificial categories and thus closes down the outcomes of the analysis from the start. Second, vulnerability analysis has focused on assessment tools that are mainly quantitative, whereas qualitative appraisal is key to assessing vulnerability in a comprehensive way and to informing policy making. Third, a systematic literature review of case studies reporting on participatory approaches to vulnerability analysis allows us to argue that participation has been important in addressing the above, but it remains too closed down in its approach and would benefit from embracing a more open, encompassing perspective. Therefore, we suggest rethinking vulnerability analysis as one part of a dynamic process between opening-up and closing-down strategies, in order to support a vulnerability governance framework. © 2014 Society for Risk Analysis.

  18. Structural Identification And Seismic Analysis Of An Existing Masonry Building

    SciTech Connect

    Del Monte, Emanuele; Galano, Luciano; Ortolani, Barbara; Vignoli, Andrea

    2008-07-08

    The paper presents the diagnostic investigation and the seismic analysis performed on an ancient masonry building in Florence. The building has historical interest and is subject to conservation restrictions. The investigation involved a preliminary phase of research into the historical documents and a second phase of in situ and laboratory tests to determine the mechanical characteristics of the masonry. This investigation was conceived in order to reach the 'LC2 Knowledge Level' and to perform a non-linear pushover analysis according to the new Italian standards for the seismic upgrading of existing masonry buildings.

  19. Analysis of the seismicity preceding large earthquakes

    NASA Astrophysics Data System (ADS)

    Stallone, Angela; Marzocchi, Warner

    2017-04-01

    The most common earthquake forecasting models assume that the magnitude of the next earthquake is independent of the past. This feature is probably one of the most severe limitations on the capability to forecast large earthquakes. In this work, we investigate this specific aspect empirically, exploring whether variations in seismicity in the space-time-magnitude domain encode some information on the size of future earthquakes. For this purpose, and to verify the stability of the findings, we consider seismic catalogs covering quite different space-time-magnitude windows, such as the Alto Tiberina Near Fault Observatory (TABOO) catalogue and the California and Japanese seismic catalogs. Our method is inspired by the statistical methodology proposed by Baiesi & Paczuski (2004) and elaborated by Zaliapin et al. (2008) to distinguish between triggered and background earthquakes, based on a pairwise nearest-neighbor metric defined by properly rescaled temporal and spatial distances. We generalize the method to a metric based on the k-nearest-neighbors that allows us to consider the overall space-time-magnitude distribution of the k earthquakes that are the strongly correlated ancestors of a target event. Finally, we analyze the statistical properties of the clusters composed of the target event and its k-nearest-neighbors. In essence, the main goal of this study is to verify whether different classes of target event magnitudes are characterized by distinctive "k-foreshocks" distributions. The final step is to show how the findings of this work may (or may not) improve the skill of existing earthquake forecasting models.
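
    The pairwise metric underlying this family of methods rescales the inter-event time by the epicentral distance (raised to a fractal dimension d_f) and the parent magnitude, eta_ij = t_ij * r_ij^d_f * 10^(-b*m_i); the parent of an event is the earlier event minimizing eta. The sketch below is a generic illustration with assumed d_f and b values, not the authors' implementation.

      # Baiesi-Paczuski / Zaliapin-style nearest-neighbour distance (illustrative).
      import numpy as np

      def nn_distance(t_parent, t_child, r_km, m_parent, d_f=1.6, b=1.0):
          dt = t_child - t_parent              # inter-event time, must be > 0
          if dt <= 0:
              return np.inf
          return dt * (r_km ** d_f) * 10.0 ** (-b * m_parent)

      # parent of event j = argmin over earlier events i of nn_distance(i, j)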

  20. Information Assurance Technology AnaLysis Center. Information Assurance Tools Report. Vulnerability Analysis

    DTIC Science & Technology

    1998-01-01

    Information Assurance Tools Report on Vulnerability Analysis, prepared by IATAC (funding number SPO700-97-R-0603). The report covers tool collection, tool classification, tool sources, database structure, and tool selection criteria, and summarizes available vulnerability analysis tools, including Ballista by Secure Networks Inc. (http://www.secnet.com/).

  1. A preliminary analysis of quantifying computer security vulnerability data in "the wild"

    NASA Astrophysics Data System (ADS)

    Farris, Katheryn A.; McNamara, Sean R.; Goldstein, Adam; Cybenko, George

    2016-05-01

    A system of computers, networks and software has some level of vulnerability exposure that puts it at risk from criminal hackers. Presently, most vulnerability research uses data from software vendors and the National Vulnerability Database (NVD). We propose an alternative path forward by grounding our analysis in data from the operational information security community, i.e. vulnerability data from "the wild". In this paper, we propose a vulnerability data parsing algorithm and an in-depth univariate and multivariate analysis of the vulnerability arrival and deletion process (also referred to as the vulnerability birth-death process). We find that vulnerability arrivals are best characterized by the log-normal distribution and vulnerability deletions are best characterized by the exponential distribution. These distributions can serve as prior probabilities for future Bayesian analysis. We also find that over 22% of the deleted vulnerability data have a rate of zero, while the arrival vulnerability data are always greater than zero. Finally, we quantify and visualize the dependencies between vulnerability arrivals and deletions through a bivariate scatterplot and statistical observations.
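
    A minimal sketch of the distribution-fitting step, assuming the parsed arrival and deletion counts are available as plain arrays (the stand-in data below are synthetic): SciPy's maximum-likelihood fitters return the log-normal and exponential parameters that could then serve as Bayesian priors.

      # Fitting log-normal arrivals and exponential deletions (synthetic data).
      import numpy as np
      from scipy import stats

      arrivals = np.random.default_rng(2).lognormal(1.0, 0.8, 500)  # stand-in
      deletions = np.random.default_rng(3).exponential(2.0, 500)    # stand-in

      shape, loc, scale = stats.lognorm.fit(arrivals, floc=0)   # log-normal fit
      loc_e, scale_e = stats.expon.fit(deletions, floc=0)       # exponential fit
      print(shape, scale, scale_e)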

  2. CORSSA: The Community Online Resource for Statistical Seismicity Analysis

    USGS Publications Warehouse

    Michael, Andrew J.; Wiemer, Stefan

    2010-01-01

    Statistical seismology is the application of rigorous statistical methods to earthquake science with the goal of improving our knowledge of how the earth works. Within statistical seismology there is a strong emphasis on the analysis of seismicity data in order to improve our scientific understanding of earthquakes and to improve the evaluation and testing of earthquake forecasts, earthquake early warning, and seismic hazards assessments. Given the societal importance of these applications, statistical seismology must be done well. Unfortunately, a lack of educational resources and available software tools makes it difficult for students and new practitioners to learn about this discipline. The goal of the Community Online Resource for Statistical Seismicity Analysis (CORSSA) is to promote excellence in statistical seismology by providing the knowledge and resources necessary to understand and implement the best practices, so that the reader can apply these methods to their own research. This introduction describes the motivation for and vision of CORSSA. It also describes its structure and contents.

  3. A study aid for seismic data interpretation and analysis

    NASA Astrophysics Data System (ADS)

    Seok, R.; Lee, Y.; Lee, B.; Lee, G.

    2011-12-01

    We present the workflow for 3-D seismic data interpretation and analysis that is routinely performed throughout the exploration phase in the industry. The workflow is used as a study aid for first-year graduate students in the Department of Energy Resources Engineering at Pukyong National University, Busan, Korea. The data used in this work consist of 3-D seismic and well-log data from the Sooner field, Colorado, USA, and 2-D and 3-D seismic data from the Penobscot surveys carried out on the Scotian Shelf, Canada. The Sooner field data are part of the tutorial data sets of Kingdom Suite, which was used for data interpretation and mapping. The Penobscot data are available from the OpendTect website. OpendTect was used for seismic attribute generation and Hampson-Russell was used for amplitude variation with offset (AVO) analysis and inversion. The workflow includes: (1) structural interpretation, mapping and 3-D visualization; (2) time-depth conversion; (3) (sequence) stratigraphic analysis; (4) attribute analysis and 3-D visualization; (5) quantitative analysis (e.g., AVO, inversion); and (6) volumetric calculations.

  4. Principal Component Analysis for pattern recognition in volcano seismic spectra

    NASA Astrophysics Data System (ADS)

    Unglert, Katharina; Jellinek, A. Mark

    2016-04-01

    Variations in the spectral content of volcano seismicity can relate to changes in volcanic activity. Low-frequency seismic signals often precede or accompany volcanic eruptions. However, they are commonly manually identified in spectra or spectrograms, and their definition in spectral space differs from one volcanic setting to the next. Increasingly long time series of monitoring data at volcano observatories require automated tools to facilitate rapid processing and aid with pattern identification related to impending eruptions. Furthermore, knowledge transfer between volcanic settings is difficult if the methods to identify and analyze the characteristics of seismic signals differ. To address these challenges we have developed a pattern recognition technique based on a combination of Principal Component Analysis and hierarchical clustering applied to volcano seismic spectra. This technique can be used to characterize the dominant spectral components of volcano seismicity without the need for any a priori knowledge of different signal classes. Preliminary results from applying our method to volcanic tremor from a range of volcanoes including Kīlauea, Okmok, Pavlof, and Redoubt suggest that spectral patterns from Kīlauea and Okmok are similar, whereas at Pavlof and Redoubt spectra have their own, distinct patterns.
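
    A compact sketch of the two-step pattern recognition described above, using off-the-shelf tools on a stand-in spectra matrix (rows: time windows, columns: frequency bins). The component count and cluster number are illustrative, not the authors' settings.

      # PCA of seismic spectra followed by hierarchical clustering of the scores.
      import numpy as np
      from sklearn.decomposition import PCA
      from scipy.cluster.hierarchy import linkage, fcluster

      spectra = np.abs(np.random.default_rng(4).normal(size=(200, 128)))  # stand-in
      scores = PCA(n_components=3).fit_transform(spectra)  # dominant spectral modes
      labels = fcluster(linkage(scores, method='ward'), t=4, criterion='maxclust')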

  5. Elastic structure and seismicity of Donegal (Ireland): insights from passive seismic analysis

    NASA Astrophysics Data System (ADS)

    Piana Agostinetti, Nicola

    2016-04-01

    Ireland's crust is the result of a complex geological history, which began in the Palaeozoic with the oblique closure of the Iapetus Ocean and is probably still ongoing. In the northwestern portion of the island, the geology of Donegal was the subject of detailed geological investigation by many workers in the last century. The most widely represented rock types in Donegal are metasediments of Dalradian and Moinian age, invaded by several granites of Caledonian age (the so-called Donegal granite). Smaller, separate intrusions are also present (e.g. Fanad Head). In contrast, it is widely accepted that the deep crustal structure of the northern portion of Ireland has been re-worked in more recent times. The several phases of lithospheric stretching associated with the opening of the Atlantic Ocean affected this portion of Ireland, with the extrusion of flood basalts. Moreover, the presence of a hot, low-density asthenospheric plume spreading from Iceland has been suggested, with the formation of a thick high-velocity layer of magmatic underplated material at the base of the crust. Oddly, at present, Donegal is the only seismically active area in Ireland, with an average rate of one Mw=2-3 event every 3-4 years. In the last three years, passive seismic data have been recorded at 12 seismic stations deployed across the most seismically active area in Co. Donegal, with the aim of reconstructing the seismic structure down to upper-mantle depth and of locating the microseismic activity within the investigated volume. Both local and teleseismic events were recorded, giving the opportunity to integrate results from different techniques for seismic data analysis and to interpret them jointly together with surface geology and mapped fault traces. Local events have been used to constrain faulting volumes and focal mechanisms and to reconstruct low-resolution 3D Vp and Vp/Vs velocity models. Teleseismic events have been used to compute receiver function data

  6. Seismic Response Analysis and Design of Structure with Base Isolation

    SciTech Connect

    Rosko, Peter

    2010-05-21

    The paper reports a study on the seismic response and energy distribution of a multi-story civil structure. The nonlinear analysis used the 2003 Bam earthquake acceleration record as the excitation input to the structural model. The displacement response was analyzed in the time domain and in the frequency domain. The displacement and its derivatives yield the energy components. The energy distribution in each story provides useful information for structural upgrading with the help of added devices. The objective is the minimization of the structural displacement response. The application of the structural seismic response research is presented in a base-isolation example.

  7. New Foundations for Tank Vulnerability Analysis (with 1991 Appendix)

    DTIC Science & Technology

    1991-05-01

    New Foundations for Tank Vulnerability Analysis, a report of the US Army Ballistic Research Laboratory (BRL). Keywords: armored vehicle vulnerability; damage; vulnerability simulation. The author argues that a few small groups of armor experts and vulnerability analysts should not be given the implicit responsibility for decisions of tactical significance.

  8. Appalachian Play Fairway Analysis Seismic Hazards Supporting Data

    SciTech Connect

    Frank Horowitz

    2016-07-20

    These are the data used in estimating the seismic hazards (both natural and induced) for candidate direct-use geothermal locations in the Appalachian Basin Play Fairway Analysis by Jordan et al. (2015). Bounding box: xMin, yMin = -83.1407, 36.7461; xMax, yMax = -71.5175, 45.1729.

  9. A concept analysis of women's vulnerability during pregnancy, birth and the postnatal period.

    PubMed

    Briscoe, Lesley; Lavender, Tina; McGowan, Linda

    2016-10-01

    To report an analysis of the concept of vulnerability associated with pregnancy, birth and the postnatal period. The concept of vulnerability during childbirth is complex, and the term 'to be vulnerable' is frequently applied vaguely. Analysis of vulnerability is needed to guide policy, practice, education and research. Clarity around the concept has the potential to improve outcomes for women. Concept analysis. Searches were conducted in CINAHL, EMBASE, PubMed, Psychinfo, MEDLINE, MIDIRS and ASSIA, limited to January 2000 - June 2014. Data were collected over 12 months during 2014. This concept analysis drew on Morse's qualitative methods. Vulnerability during pregnancy, birth and the postnatal period can be defined by three main attributes: (a) Threat; (b) Barrier; and (c) Repair. Key attributes have the potential to influence outcomes for women. Inseparable sub-attributes, such as mother-and-baby attachment and the woman's free will and choice, add a level of complexity to the concept. This concept analysis has clarified how the term vulnerability is currently understood and used in relation to pregnancy, birth and the postnatal period. Vulnerability should be viewed as a complex phenomenon rather than a singular concept. A 'vulnerability journey plan' has the potential to identify how reparative interventions may develop the woman's capacity for resilience and influence the degree of vulnerability experienced. Methodology based on complexity theory should be explored in future work about vulnerability. © 2016 John Wiley & Sons Ltd.

  10. Challenges to Seismic Hazard Analysis of Critical Infrastructures

    NASA Astrophysics Data System (ADS)

    Klügel, J.

    2005-12-01

    Based on the background of the review of a large-scale probabilistic seismic hazard analysis (PSHA) performed in Switzerland for the sites of the Swiss nuclear power plants - the PEGASOS project (2000-2004) - challenges to seismic hazard analysis of critical infrastructures are discussed from the perspective of a professional safety analyst. The PEGASOS study was performed to provide a meaningful input for the update of the plant-specific PRAs (Probabilistic Risk Assessments) of Swiss nuclear power plants. Earlier experience had shown that the results of these studies are to a large extent driven by the results of the seismic hazard analysis. The PEGASOS study was performed in full compliance with the procedures developed by the Senior Seismic Hazard Analysis Committee (SSHAC) of the U.S.A. (SSHAC, 1997) for the treatment of uncertainties by use of a structured expert elicitation process. The preliminary results derived from the project showed an unexpected amount of uncertainty and were regarded as not suitable for direct application. A detailed review of the SSHAC methodology revealed a number of critical issues with respect to the treatment of uncertainties and the mathematical models applied, which will be presented in the paper. The most important issues to be discussed are: * The ambiguous solution of PSHA logic trees * The inadequate mathematical treatment of the results of expert elicitations based on the assumption of bias-free expert estimates * The problems associated with the "think model" of the separation of epistemic and aleatory uncertainties * The consequences of the ergodic assumption used to justify the transfer of attenuation equations from other regions to the region of interest. Based on these observations, methodological questions with respect to the development of a risk-consistent design basis for new nuclear power plants, as required by the U.S. NRC RG 1.165, will be evaluated. As a principal alternative for the development of a

  11. Subsalt risk reduction using seismic sequence stratigraphic analysis

    SciTech Connect

    Wornardt, W.W. Jr.

    1994-12-31

    Several recent projects involving detailed seismic sequence stratigraphic analysis of existing wells near subsalt prospects in the south additions of the offshore Louisiana area in the Gulf of Mexico have demonstrated the utility of using seismic sequence stratigraphic analysis to reduce risk when drilling subsalt plays. First, the thick section of sedimentary rocks that was thought to be above and below the salt was penetrated in the area away from the salt. These sedimentary rocks were accurately dated using maximum flooding surfaces, first downhole occurrences of important bioevents, condensed sections, abundance and diversity histograms, and high-resolution biostratigraphy while the wells were being drilled. Potential reservoir sandstones within specific Vail sequences in these wells were projected using seismic data up to the subsalt and non-subsalt sediment interface. The systems tracts above and below the maximum flooding surface, and the type of reservoir sandstones to be encountered, were predictable based on the paleobathymetry, the increase and decrease of fauna and flora, the recognition of the bottom-set turbidite, slope fan and basin floor fan condensed sections, and the superpositional relationship of the Vail sequences and systems tracts, providing a detailed sequence stratigraphic analysis of the well. Subsequently, wells drilled through the salt could be accurately correlated with Vail sequences and systems tracts in wells that were previously correlated away from the salt layer with seismic reflection profiles.

  12. Subsalt risk reduction using seismic sequence-stratigraphic analysis

    SciTech Connect

    Wornardt, W.W. Jr.

    1994-09-01

    Several recent projects involving detailed seismic-sequence stratigraphic analysis of existing wells near subsalt prospects in the south additions of the offshore Louisiana area in the Gulf of Mexico have demonstrated the utility of using seismic sequence-stratigraphic analysis to reduce risk when drilling subsalt plays. First, the thick section of sediments that was thought to be above and below the salt was penetrated in the area away from the salt. These sediments were accurately dated using maximum flooding surfaces, first downhole occurrences of important bioevents, condensed sections, abundance and diversity histograms, and high-resolution biostratigraphy while the wells were being drilled. Potential reservoir sands within specific Vail sequences in these wells were projected on seismic data up to the subsalt and non-subsalt sediment interface. The systems tracts above and below the maximum flooding surface, and the type of reservoir sands to be encountered, were predictable based on the paleobathymetry, the increase and decrease of fauna and flora abundance, the recognition of the bottom-set turbidite, slope fan and basin floor fan condensed sections, and the superpositional relationship of the Vail sequences and systems tracts, providing a detailed sequence-stratigraphic analysis of the well in question. Subsequently, the wells drilled through the salt could be accurately correlated with the Vail sequences and systems tracts in wells that were previously correlated with seismic reflection profiles away from the salt layer.

  13. Development and implementation of a GIS-based tool for spatial modeling of seismic vulnerability of Tehran

    NASA Astrophysics Data System (ADS)

    Hashemi, M.; Alesheikh, A. A.

    2012-12-01

    Achieving sustainable development in countries prone to earthquakes is possible by taking effective measures to reduce vulnerability to earthquakes. In this context, damage assessment for hypothetical earthquakes and planning for disaster management are important issues. Having a computer tool capable of estimating structural and human losses from earthquakes in a specific region may facilitate the decision-making process before and during disasters. Interoperability of this tool with widespread spatial analysis frameworks will expedite the data transfer process. In this study, the earthquake damage assessment (EDA) software tool is developed as an embedded extension within a GIS (geographic information system) environment for the city of Tehran, Iran. This GIS-based extension provides users with a familiar environment in which to estimate and observe the probable damage and fatalities of a deterministic earthquake scenario. The utility of this tool is then demonstrated for the southern Karoon parish, Region 10, Tehran. Three case studies for three active faults in the area, and a comparison of the results with other research, substantiated the reliability of this tool for additional earthquake scenarios.

  14. Probabilistic Seismic Hazard Analysis of Injection-Induced Seismicity Utilizing Physics-Based Simulation

    NASA Astrophysics Data System (ADS)

    Johnson, S.; Foxall, W.; Savy, J. B.; Hutchings, L. J.

    2012-12-01

    Risk associated with induced seismicity is a significant factor in the design, permitting and operation of enhanced geothermal, geological CO2 sequestration, wastewater disposal, and other fluid injection projects. The conventional probabilistic seismic hazard analysis (PSHA) approach provides a framework for estimation of induced seismicity hazard but requires adaptation to address the particular occurrence characteristics of induced earthquakes and to estimation of the ground motions they generate. The assumption often made in conventional PSHA of Poissonian earthquake occurrence in both space and time is clearly violated by seismicity induced by an evolving pore pressure field. Our project focuses on analyzing hazard at the pre-injection design and permitting stage, before an induced earthquake catalog can be recorded. In order to accommodate the commensurate lack of pre-existing data, we have adopted a numerical physics-based approach to synthesizing and estimating earthquake frequency-magnitude distributions. Induced earthquake sequences are generated using the program RSQSIM (Dieterich and Richards-Dinger, PAGEOPH, 2010) augmented to simulate pressure-induced shear failure on faults and fractures embedded in a 3D geological structure under steady-state tectonic shear loading. The model uses available site-specific data on rock properties and in-situ stress, and generic values of frictional properties appropriate to the shallow reservoir depths at which induced events usually occur. The space- and time-evolving pore pressure field is coupled into the simulation from a multi-phase flow model. In addition to potentially damaging ground motions, induced seismicity poses a risk of perceived nuisance in nearby communities caused by relatively frequent, low magnitude earthquakes. Including these shallow local earthquakes in the hazard analysis requires extending the magnitude range considered to as low as M2 and the frequency band to include the short
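
    Once a synthetic catalog has been generated, its frequency-magnitude distribution can be summarized in the usual Gutenberg-Richter form. The sketch below applies the Aki maximum-likelihood b-value estimator to an invented catalog extended down to M2, as discussed above; the catalog itself is a stand-in, not RSQSIM output.

      # Gutenberg-Richter b-value of a synthetic catalog (invented data).
      import numpy as np

      rng = np.random.default_rng(6)
      m_min = 2.0
      # stand-in catalog: magnitudes above m_min, exponential <=> GR with b ~ 1
      mags = m_min + rng.exponential(1.0 / np.log(10), size=5000)
      b = np.log10(np.e) / (mags.mean() - m_min)   # Aki (1965) ML estimator
      print(b)   # ~1.0 for this stand-in catalog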

  15. Seismic Stability Analysis of a Himalayan Rock Slope

    NASA Astrophysics Data System (ADS)

    Latha, Gali Madhavi; Garaga, Arunakumari

    2010-11-01

    The seismic slope stability analysis of the right abutment of a railway bridge, proposed at about 350 m above ground level, crossing a river and connecting two huge hillocks in the Himalayas, India, is presented in this paper. The rock slopes are composed of highly jointed rock mass, and the joint spacing and orientation vary at different locations. Seismic slope stability analysis of the slope under consideration is carried out using both a pseudo-static approach and a time-response approach, as the site is located in seismic zone V as per the earthquake zonation maps of India. Stability of the slope is studied numerically using the program FLAC. The results obtained from the pseudo-static analysis are presented in the form of a Factor of Safety (FOS), and the results obtained from the time-response analysis of the slope are presented in terms of horizontal and vertical displacements along the slope. The results obtained from both analyses confirmed the global stability of the slope, as the FOS from the pseudo-static analysis is above 1.0 and the displacements observed in the time-response analysis are within permissible limits. This paper also presents the results of a parametric study performed within the time-response analysis in order to understand the effect of individual parameters on the overall stability of the slope.
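
    For the pseudo-static part, the factor of safety for a simple planar failure surface under a horizontal seismic coefficient kh takes the classical limit-equilibrium form sketched below. The block weight, geometry and strength values are invented for illustration and are far simpler than the jointed rock mass analyzed with FLAC.

      # Pseudo-static FOS for planar sliding (illustrative parameters).
      import math

      def pseudo_static_fos(W, alpha_deg, c, L, phi_deg, kh):
          a, p = math.radians(alpha_deg), math.radians(phi_deg)
          resisting = c * L + (W * math.cos(a) - kh * W * math.sin(a)) * math.tan(p)
          driving = W * math.sin(a) + kh * W * math.cos(a)
          return resisting / driving

      # invented block: W in kN, c in kPa, L in m, kh ~ fraction of g
      print(pseudo_static_fos(W=5000.0, alpha_deg=25, c=25.0, L=12.0,
                              phi_deg=40, kh=0.18))   # ~1.29, nominally stable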

  16. A graph-based network-vulnerability analysis system

    SciTech Connect

    Swiler, L.P.; Phillips, C.; Gaylor, T.

    1998-01-01

    This report presents a graph-based approach to network vulnerability analysis. The method is flexible, allowing analysis of attacks from both outside and inside the network. It can analyze risks to a specific network asset, or examine the universe of possible consequences following a successful attack. The analysis system requires as input a database of common attacks, broken into atomic steps, specific network configuration and topology information, and an attacker profile. The attack information is matched with the network configuration information and an attacker profile to create a superset attack graph. Nodes identify a stage of attack, for example the class of machines the attacker has accessed and the user privilege level he or she has compromised. The arcs in the attack graph represent attacks or stages of attacks. By assigning probabilities of success on the arcs or costs representing level-of-effort for the attacker, various graph algorithms such as shortest-path algorithms can identify the attack paths with the highest probability of success.

  17. A graph-based system for network-vulnerability analysis

    SciTech Connect

    Swiler, L.P.; Phillips, C.

    1998-06-01

    This paper presents a graph-based approach to network vulnerability analysis. The method is flexible, allowing analysis of attacks from both outside and inside the network. It can analyze risks to a specific network asset, or examine the universe of possible consequences following a successful attack. The graph-based tool can identify the set of attack paths that have a high probability of success (or a low effort cost) for the attacker. The system could be used to test the effectiveness of making configuration changes, implementing an intrusion detection system, etc. The analysis system requires as input a database of common attacks, broken into atomic steps, specific network configuration and topology information, and an attacker profile. The attack information is matched with the network configuration information and an attacker profile to create a superset attack graph. Nodes identify a stage of attack, for example the class of machines the attacker has accessed and the user privilege level he or she has compromised. The arcs in the attack graph represent attacks or stages of attacks. By assigning probabilities of success on the arcs or costs representing level-of-effort for the attacker, various graph algorithms such as shortest-path algorithms can identify the attack paths with the highest probability of success.

  18. A graph-based network-vulnerability analysis system

    SciTech Connect

    Swiler, L.P.; Phillips, C.; Gaylor, T.

    1998-05-03

    This paper presents a graph based approach to network vulnerability analysis. The method is flexible, allowing analysis of attacks from both outside and inside the network. It can analyze risks to a specific network asset, or examine the universe of possible consequences following a successful attack. The analysis system requires as input a database of common attacks, broken into atomic steps, specific network configuration and topology information, and an attacker profile. The attack information is matched with the network configuration information and an attacker profile to create a superset attack graph. Nodes identify a stage of attack, for example the class of machines the attacker has accessed and the user privilege level he or she has compromised. The arcs in the attack graph represent attacks or stages of attacks. By assigning probabilities of success on the arcs or costs representing level of effort for the attacker, various graph algorithms such as shortest path algorithms can identify the attack paths with the highest probability of success.
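
    The path analysis these three records describe can be sketched as follows: converting per-arc success probabilities p to additive weights -log(p) makes the most probable attack path the weighted shortest path. The toy graph, node names and probabilities below are invented, not drawn from the reports.

      # Highest-probability attack path via Dijkstra on -log(p) weights.
      import math
      import networkx as nx

      G = nx.DiGraph()
      G.add_edge('outside', 'webserver', p=0.8)
      G.add_edge('webserver', 'database', p=0.5)
      G.add_edge('outside', 'vpn', p=0.3)
      G.add_edge('vpn', 'database', p=0.9)
      for u, v, d in G.edges(data=True):
          d['w'] = -math.log(d['p'])   # multiplicative prob -> additive weight

      path = nx.shortest_path(G, 'outside', 'database', weight='w')
      prob = math.exp(-nx.shortest_path_length(G, 'outside', 'database', weight='w'))
      print(path, prob)   # ['outside', 'webserver', 'database'], p = 0.40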

  19. A situational analysis of priority disaster hazards in Uganda: findings from a hazard and vulnerability analysis.

    PubMed

    Mayega, R W; Wafula, M R; Musenero, M; Omale, A; Kiguli, J; Orach, G C; Kabagambe, G; Bazeyo, W

    2013-06-01

    Most countries in sub-Saharan Africa have not conducted a disaster risk analysis. Hazard and vulnerability analyses provide vital information that can be used for the development of risk reduction and disaster response plans. The purpose of this study was to rank disaster hazards for Uganda, as a basis for identifying the priority hazards to guide disaster management planning. The study was conducted in Uganda as part of a multi-country assessment. A hazard, vulnerability and capacity analysis was conducted in a focus group discussion of 7 experts representing key stakeholder agencies in disaster management in Uganda. A simple ranking method was used to rank the probability of occurrence of the 11 top hazards, their potential impact and the level of vulnerability of people and infrastructure. In terms of likelihood of occurrence and potential impact, the top-ranked disaster hazards in Uganda are: 1) epidemics of infectious diseases, 2) drought/famine, and 3) conflict and environmental degradation, in that order. In terms of vulnerability, the top priority hazards to which people and infrastructure were vulnerable were: 1) conflicts, 2) epidemics, 3) drought/famine and 4) environmental degradation, in that order. Poverty, gender, lack of information, and lack of resilience measures were some of the factors promoting vulnerability to disasters. As Uganda develops a disaster risk reduction and response plan, it ought to prioritize epidemics of infectious diseases, drought/famine, conflicts and environmental degradation as the priority disaster hazards.

  20. Identifying Shallow Gas Reservoir Using 2D Seismic data and Seismic Attribute Analysis over Shahbazpur Structure, Bhola, Southern Bangladesh.

    NASA Astrophysics Data System (ADS)

    Rahman, M.; Imam, B.; Kabir, S. M. M.; Mustaque, S.; Gazi, M. Y.

    2016-12-01

    The Shahbazpur structure is a subsurface anticlinal structure situated in the middle of Bhola Island on the northern margin of the Hatia trough of the Bengal Foredeep. The Bangladesh petroleum exploration and production company Ltd. (BAPEX) discovered the Shahbazpur gas field with its exploration well Shahbazpur-1, in which commercial gas pools were tested positively over a depth range of 3154 m to 3212 m below the surface. A method is established to delineate the structural mapping precisely by interpreting eight 2D seismic lines acquired over the Shahbazpur structure. Moreover, attributes related to direct hydrocarbon indicators (DHI) are analyzed to further confirm the presence of hydrocarbons at shallow to moderate depth. To do this, synthetic seismogram generation, seismic-to-well tie, velocity modelling and depth conversion are performed. A limited number of seismic attribute functions available in an academic version of the Petrel software are applied to analyze attributes. The seismic attribute analyses used in this interpretation are mainly associated with bright-spot detection. The RMS amplitude and Envelope attribute maps from the seismic attribute analysis show the presence of bright spots, or high-amplitude anomalies, above the reservoir zone of the Shahbazpur structure, a seismic indication of gas accumulation in the 2D seismic lines. This signature will play a very important role in planning the next well on the same structure to test the shallow accumulation of hydrocarbon. For a better understanding of this shallow reserve, it is suggested to acquire 3D seismic data over the Shahbazpur structure, which will help to evaluate the hydrocarbon accumulation and to identify gas migration pathways.
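
    The RMS-amplitude attribute used above for bright-spot screening is simply the root-mean-square of trace samples in a sliding time window. The sketch below is a generic illustration; the window length and the random stand-in trace are placeholders.

      # Sliding-window RMS amplitude attribute for a single trace.
      import numpy as np

      def rms_amplitude(trace, win=25):
          pad = win // 2
          sq = np.pad(trace.astype(float) ** 2, pad, mode='edge')
          kernel = np.ones(win) / win
          return np.sqrt(np.convolve(sq, kernel, mode='valid'))

      trace = np.random.default_rng(5).normal(size=1000)   # stand-in trace
      attr = rms_amplitude(trace)                          # same length as input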

  1. Seismic Earth: Array Analysis of Broadband Seismograms

    NASA Astrophysics Data System (ADS)

    Levander, Alan; Nolet, Guust

    Seismology is one of the few means available to Earth scientists for probing the mechanical structure of the Earth's interior. The advent of modern seismic instrumentation at the end of the 19th century and its installation across the globe was shortly followed by mankind's first general understanding of the Earth's interior: The Croatian seismologist Andrija Mohorovičić discovered the crust-mantle boundary in central Europe in 1909, the German Beno Gutenberg determined the radius of the Earth's core in 1913, Great Britain's Sir Harold Jeffreys established its fluid character by 1926, and the Dane Inge Lehmann discovered the solid inner core in 1936. It is notable that seismology, even in its earliest days, was an international science. Unlike much of the Earth sciences, seismology has its roots in physics, notably optics (many university seismology programs are, or initially were, attached to meteorology, astronomy, or physics departments), and draws from the literatures of imaging systems and statistical communications theory developed by, or employed in, astronomy, electrical engineering, medicine, ocean acoustics, and nondestructive materials testing. Seismology has close ties to petro-physics and mineral physics, the measurements of the disciplines being compared to infer the chemical and physical structure of the Earth's interior.

  2. Variability and Uncertainty in Probabilistic Seismic Hazard Analysis for the Island of Montreal

    NASA Astrophysics Data System (ADS)

    Elkady, Ahmed Mohamed Ahmed

    The current seismic design process for structures in Montreal is based on the 2005 edition of the National Building Code of Canada (NBCC 2005), which is based on a hazard level corresponding to a probability of exceedance of 2% in 50 years. The code is based on the Uniform Hazard Spectrum (UHS) and deaggregation values obtained by the Geological Survey of Canada (GSC) with a modified version of the F-RISK software, through a process that did not formally consider epistemic uncertainty. Epistemic uncertainty is related to the uncertainty in model formulation. A seismological model consists of seismic sources (source geometry, source location, recurrence rate, magnitude distribution, and maximum magnitude) and a Ground-Motion Prediction Equation (GMPE). In general, and particularly in Montreal, GMPEs are the main source of epistemic uncertainty relative to the other variables of the seismological model. The objective of this thesis is to use the CRISIS software to investigate the effect of epistemic uncertainty on probabilistic seismic hazard analysis (PSHA) products such as the UHS and deaggregation values by incorporating different new GMPEs. The epsilon (ε) parameter, which represents the departure of the target ground motion from that predicted by the GMPE, is also discussed, as it is not very well documented in Eastern Canada. A method is proposed to calculate epsilon values for Montreal relative to a given GMPE and to calculate robust weighted modal epsilon values when epistemic uncertainty is considered. Epsilon values are commonly used in seismic performance evaluations for identifying design events and selecting ground motion records for vulnerability and liquefaction studies. A brief overview of record epsilons, which account for the spectral shape of the ground motion time history, is also presented.
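
    The epsilon parameter discussed here is the normalized residual of the target spectral acceleration with respect to a GMPE prediction, epsilon = (ln Sa_target - mu_lnSa) / sigma_lnSa. The sketch below evaluates it for invented GMPE mean and dispersion values, purely to show the computation.

      # Epsilon: normalized ln-residual relative to a GMPE prediction.
      import math

      def epsilon(sa_target, mu_ln_sa, sigma_ln_sa):
          return (math.log(sa_target) - mu_ln_sa) / sigma_ln_sa

      # invented numbers: target Sa = 0.30 g, GMPE median 0.18 g, sigma_ln = 0.6
      print(epsilon(0.30, math.log(0.18), 0.6))   # ~ +0.85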

  3. The hazard in using probabilistic seismic hazard analysis

    SciTech Connect

    Krinitzsky, E.L. . Geotechnical Lab.)

    1993-11-01

    Earthquake experts rely on probabilistic seismic hazard analysis for everything from emergency-response planning to the development of building codes. Unfortunately, says the author, the analysis is defective for the large earthquakes that pose the greatest risks. Structures have short lifetimes, and the distances over which earthquakes cause damage are relatively small. Exceptions serve to prove the rule. To be useful in engineering, earthquake hazard assessment must focus narrowly in both time and space.

  4. An integrated analysis of controlled- and passive source seismic data

    NASA Astrophysics Data System (ADS)

    Rumpfhuber, Eva-Maria

    This dissertation consists of two parts: a study using passive-source seismic data, and one using the dataset from a large-scale refraction/wide-angle reflection seismic experiment as the basis for an integrated analysis. The goal of the dissertation is the integration of the two different datasets and a combined interpretation of the results of the "Continental Dynamics of the Rocky Mountains" (CD-ROM) 1999 seismic experiment. I have determined the crustal structure using four different receiver function methods applied to data collected from the northern transect of the CD-ROM passive seismic experiment. The resulting migrated image and crustal thickness determinations confirm and refine prior crustal thickness measurements based on the CD-ROM and Deep Probe datasets. The new results show a very strong lower crustal layer (LCL) with variable thickness beneath the Wyoming Province. In addition, I was able to show that it terminates at 42° latitude and to provide a seismic tie between the CD-ROM and Deep Probe seismic experiments, so that they represent a continuous N-S transect extending from New Mexico into Alberta, Canada. This new tie is particularly important because it occurs close to a major tectonic boundary, the Cheyenne belt, between an Archean craton and a Proterozoic terrane. The controlled-source seismic dataset was analyzed with the aid of forward modeling and inversion to establish a two-dimensional velocity and interface model of the area. I have developed a picking strategy that helps identify the seismic phases and improves the quality and quantity of the picks. In addition, I was able to pick and identify S-wave phases, which allowed me to establish an independent S-wave model, and hence the Poisson's and Vp/Vs ratios. The final velocity and interface model was compared to prior results, and the results were jointly interpreted with the receiver function results. Thanks to the integration of the controlled-source and receiver function

  5. The application of seismic risk-benefit analysis to land use planning in Taipei City.

    PubMed

    Hung, Hung-Chih; Chen, Liang-Chun

    2007-09-01

    In the developing countries of Asia, local authorities rarely use risk analysis instruments as a decision-making support mechanism during planning and development procedures. The main purpose of this paper is to provide a methodology to enable planners to undertake such analyses. We illustrate a case study of seismic risk-benefit analysis for the city of Taipei, Taiwan, using available land use maps and surveys as well as a new tool developed by the National Science Council in Taiwan--the HAZ-Taiwan earthquake loss estimation system. We use three hypothetical earthquakes to estimate casualties and total and annualised direct economic losses, and to show their spatial distribution. We also characterise the distribution of vulnerability over the study area using cluster analysis. A risk-benefit ratio is calculated to express the levels of seismic risk attached to alternative land use plans. This paper suggests ways to perform earthquake risk evaluations, and the authors intend it to assist city planners in evaluating the appropriateness of their planning decisions.

  6. A Case Study of Geologic Hazards Affecting School Buildings: Evaluating Seismic Structural Vulnerability and Landslide Hazards at Schools in Aizawl, India

    NASA Astrophysics Data System (ADS)

    Perley, M. M.; Guo, J.

    2016-12-01

    India's National School Safety Program (NSSP) aims to assess all government schools in earthquake-prone regions of the country. To supplement the Mizoram State Government's recent survey of 141 government schools, we screened an additional 16 private and 4 government schools for structural vulnerabilities due to earthquakes, as well as landslide hazards, in Mizoram's capital of Aizawl. We developed a geomorphologically derived landslide susceptibility matrix, which was cross-checked with the Aizawl Municipal Corporation's landslide hazard map (provided by Lettis Consultants International), to determine the geologic hazards at each school. Our research indicates that only 7% of the 22 assessed school buildings are located within low landslide hazard zones; 64% of the school buildings, with approximately 9,500 students, are located within very high or high landslide hazard zones. Rapid Visual Screening (RVS) was used to determine the structural earthquake vulnerability of each school building. RVS is an initial vulnerability assessment procedure used to inventory and rank buildings that may be hazardous during an earthquake. Our study indicates that all of the 22 assessed school buildings have a damageability rating of Grade 3 or higher on the 5-grade EMS scale, suggesting significant vulnerability and potential for damage, ranging from widespread cracking of columns and beam-column joints to collapse. Additionally, 86% of the schools we visited had reinforced concrete buildings constructed before Aizawl's building regulations were passed in 2007, which can be assumed to lack appropriate seismic reinforcement. Using our findings, we will make recommendations to the Government of Mizoram to prevent unnecessary loss of life by minimizing each school's landslide risk and ensuring schools are earthquake-resistant.

  7. Physics-based Probabilistic Seismic Hazard Analysis for Seismicity Induced by Fluid Injection

    NASA Astrophysics Data System (ADS)

    Foxall, W.; Hutchings, L. J.; Johnson, S.; Savy, J. B.

    2011-12-01

    Risk associated with induced seismicity (IS) is a significant factor in the design, permitting and operation of enhanced geothermal, geological CO2 sequestration and other fluid injection projects. Whereas conventional probabilistic seismic hazard and risk analysis (PSHA, PSRA) methods provide an overall framework, they require adaptation to address the specific characteristics of induced earthquake occurrence and ground motion estimation, and the nature of the resulting risk. The first problem is to predict the earthquake frequency-magnitude distribution of induced events for PSHA required at the design and permitting stage before the start of injection, when an appropriate earthquake catalog clearly does not exist. Furthermore, observations and theory show that the occurrence of earthquakes induced by an evolving pore-pressure field is time-dependent, and hence does not conform to the assumption of Poissonian behavior in conventional PSHA. We present an approach to this problem based on generation of an induced seismicity catalog using numerical simulation of pressure-induced shear failure in a model of the geologic structure and stress regime in and surrounding the reservoir. The model is based on available measurements of site-specific in-situ properties as well as generic earthquake source parameters. We also discuss semi-empirical analysis to sequentially update hazard and risk estimates for input to management and mitigation strategies using earthquake data recorded during and after injection. The second important difference from conventional PSRA is that, in addition to potentially damaging ground motions, a significant risk associated with induced seismicity in general is the perceived nuisance caused in nearby communities by small, local felt earthquakes, which in general occur relatively frequently. Including these small, usually shallow earthquakes in the hazard analysis requires extending the ground motion frequency band considered to include the high

  8. Seismic performance analysis of Tendaho earth fill dam, Ethiopia.

    NASA Astrophysics Data System (ADS)

    Berhe, T.; Wu, W.

    2009-04-01

    The Tendaho dam is located in the Afar regional state, in the north-eastern part of Ethiopia. It lies within an area known as the 'Tendaho Graben', which forms the center of the Afar triangle, a low-lying area of land where the East African, Red Sea and Gulf of Aden rift systems converge. The dam is an earthfill dam with a volume of about 4 million cubic meters and a mixed clay core. The geological setting of the dam site, the geotechnical properties of the dam materials and the seismicity of the region are reviewed. Based on this review, the foundation materials and dam body include some liquefiable granular soils. Moreover, the active East African Rift Valley fault, which can generate an earthquake of magnitude greater than 6, passes through the dam body. This valley is the primary seismic source contributing to the hazard at the Tendaho dam site. The presence of liquefiable materials beneath and within the dam body, and of the active fault crossing the dam site, demands a thorough seismic analysis of the dam. The peak ground acceleration (PGA) is selected as a measure of ground motion severity. The PGA was selected according to the guidelines of the International Commission on Large Dams (ICOLD). Based on the criteria set by ICOLD, the dam is analyzed for two different earthquake magnitudes: the Maximum Credible Earthquake (MCE) and the Operating Basis Earthquake (OBE). Numerical codes are useful tools to investigate the safety of dams in seismically active areas. In this paper, the FLAC3D numerical tool is used to investigate the seismic performance of the dam under dynamic loading.

  9. A Bayesian Seismic Hazard Analysis for the city of Naples

    NASA Astrophysics Data System (ADS)

    Faenza, Licia; Pierdominici, Simona; Hainzl, Sebastian; Cinti, Francesca R.; Sandri, Laura; Selva, Jacopo; Tonini, Roberto; Perfetti, Paolo

    2016-04-01

    In recent years, many studies have focused on the determination and definition of the seismic, volcanic and tsunamigenic hazard in the city of Naples. The reason is that Naples and its neighboring area are among the most densely populated places in Italy. In addition, the risk is increased by the type and condition of buildings and monuments in the city. It is crucial, therefore, to assess which active faults in Naples and the surrounding area could trigger an earthquake able to shake and damage the urban area. We collect data from the most reliable and complete databases of macroseismic intensity records (from 79 AD to present). For each seismic event, an active tectonic structure has been associated. Furthermore, a set of active faults located around the study area, well known from geological investigations and capable of shaking the city but not associated with any recorded earthquake, has been taken into account in our studies. This geological framework is the starting point for our Bayesian seismic hazard analysis for the city of Naples. We show the feasibility of formulating the hazard assessment procedure so as to include the information of past earthquakes in the probabilistic seismic hazard analysis. This strategy allows us, on the one hand, to enlarge the information used in the evaluation of the hazard, from alternative models for the earthquake generation process to past shaking, and, on the other hand, to explicitly account for all kinds of information and their uncertainties. The Bayesian scheme we propose is applied to evaluate the seismic hazard of Naples. We implement five different spatio-temporal models to parameterize the occurrence of earthquakes potentially dangerous for Naples. Subsequently, we combine these hazard curves with ShakeMaps of past earthquakes that have been felt in Naples. The results are posterior hazard assessments for three exposure times, e.g., 50, 10 and 5 years, on a dense grid that covers the municipality of Naples, considering bedrock soil
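
    A minimal sketch of the Bayesian combination described above, with everything reduced to a discrete prior over the five competing occurrence models updated by the likelihood of the shaking observed in past events; all numbers below are invented for illustration.

      # Discrete Bayesian update of model weights (invented numbers).
      import numpy as np

      prior = np.full(5, 0.2)                           # five competing models
      likelihood = np.array([0.8, 0.3, 0.5, 0.1, 0.6])  # P(observed shaking | model)
      posterior = prior * likelihood
      posterior /= posterior.sum()
      print(posterior)   # weights used to mix the five hazard curves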

  10. A power flow based model for the analysis of vulnerability in power networks

    NASA Astrophysics Data System (ADS)

    Wang, Zhuoyang; Chen, Guo; Hill, David J.; Dong, Zhao Yang

    2016-10-01

    An innovative model that considers power flow, one of the most important characteristics of a power system, is proposed for the analysis of power grid vulnerability. Moreover, based on complex network theory and the Max-Flow theorem, a new vulnerability index is presented to identify the vulnerable lines in a power grid. In addition, comparative simulations between the power-flow-based model and existing models are carried out on the IEEE 118-bus system. The simulation results demonstrate that the proposed model and index are more effective for power grid vulnerability analysis.
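
    One simple reading of a Max-Flow-based line ranking, sketched on an invented toy network rather than the IEEE 118-bus system used in the paper: remove each line in turn and measure the loss of deliverable flow between a generator and a load bus; larger losses flag more vulnerable lines.

      # Max-Flow based line ranking on a toy network (invented capacities).
      import networkx as nx

      G = nx.DiGraph()
      G.add_edge('gen', 'bus1', capacity=100)
      G.add_edge('gen', 'bus2', capacity=60)
      G.add_edge('bus1', 'load', capacity=80)
      G.add_edge('bus2', 'load', capacity=70)

      base, _ = nx.maximum_flow(G, 'gen', 'load')
      for u, v in list(G.edges):
          cap = G[u][v]['capacity']
          G.remove_edge(u, v)
          flow, _ = nx.maximum_flow(G, 'gen', 'load')
          print(f"{u}->{v}: flow drop {base - flow}")  # larger drop = more vulnerable
          G.add_edge(u, v, capacity=cap)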

  11. Probabilistic seismic demand analysis using advanced ground motion intensity measures

    USGS Publications Warehouse

    Tothong, P.; Luco, N.

    2007-01-01

    One of the objectives in performance-based earthquake engineering is to quantify the seismic reliability of a structure at a site. For that purpose, probabilistic seismic demand analysis (PSDA) is used as a tool to estimate the mean annual frequency of exceeding a specified value of a structural demand parameter (e.g. interstorey drift). This paper compares and contrasts the use, in PSDA, of certain advanced scalar versus vector and conventional scalar ground motion intensity measures (IMs). One of the benefits of using a well-chosen IM is that more accurate evaluations of seismic performance are achieved without the need to perform detailed ground motion record selection for the nonlinear dynamic structural analyses involved in PSDA (e.g. record selection with respect to seismic parameters such as earthquake magnitude, source-to-site distance, and ground motion epsilon). For structural demands that are dominated by a first mode of vibration, using inelastic spectral displacement (Sdi) can be advantageous relative to the conventionally used elastic spectral acceleration (Sa) and the vector IM consisting of Sa and epsilon (ε). This paper demonstrates that this is true for ordinary and for near-source pulse-like earthquake records. The latter ground motions cannot be adequately characterized by either Sa alone or the vector of Sa and ε. For structural demands with significant higher-mode contributions (under either of the two types of ground motions), even Sdi (alone) is not sufficient, so an advanced scalar IM that additionally incorporates higher modes is used.
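
    The quantity PSDA estimates, the mean annual frequency of the demand parameter EDP exceeding a value y, is obtained by convolving the structural response given IM with the ground motion hazard; in the standard notation (not reproduced verbatim from the paper):

      \lambda_{EDP}(y) = \int_{0}^{\infty} P\left[ EDP > y \mid IM = x \right] \, \left| d\lambda_{IM}(x) \right|

    A well-chosen IM (e.g. Sdi) makes the conditional term nearly independent of other record properties, which is what removes the need for detailed record selection in practice.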

  12. A Decision Analysis Tool for Climate Impacts, Adaptations, and Vulnerabilities

    SciTech Connect

    Omitaomu, Olufemi A; Parish, Esther S; Nugent, Philip J

    2016-01-01

    Climate change related extreme events (such as flooding, storms, and drought) are already impacting millions of people globally at a cost of billions of dollars annually. Hence, there is an urgent need for urban areas to develop adaptation strategies that will alleviate the impacts of these extreme events. However, a lack of appropriate decision support tools that match local applications is limiting local planning efforts. In this paper, we present a quantitative analysis and optimization system with customized decision support modules built on a geographic information system (GIS) platform to bridge this gap. This platform is called the Urban Climate Adaptation Tool (Urban-CAT). For all Urban-CAT models, we divide a city into a grid with tens of thousands of cells; we then compute a list of metrics for each cell from the GIS data. These metrics are used as independent variables to predict climate impacts, compute vulnerability scores, and evaluate adaptation options. Overall, the Urban-CAT system has three layers: a data layer (containing spatial data, socio-economic and environmental data, and analytic data), a middle layer (handling data processing, model management, and GIS operations), and an application layer (providing climate impact forecasts, adaptation optimization, and site evaluation). The Urban-CAT platform can guide city and county governments in identifying and planning for effective climate change adaptation strategies.

  13. Analysis and Defense of Vulnerabilities in Binary Code

    DTIC Science & Technology

    2008-09-29

    additional web browser vulnerabilities. The client-end web-browser vulnerabilities come from the Month of Browser Bugs (MOBB) website [7]. These bugs...program written as part of a web page. Modern web browsers allow for various scripting languages, such as JScript, JavaScript, and VBScript. Scripting...languages extend basic HTML with the ability to call native methods on the browser's computer, e.g., ActiveX controls. When the web browser renders a

  14. Noise analysis of the seismic system employed in the northern and southern California seismic nets

    USGS Publications Warehouse

    Eaton, J.P.

    1984-01-01

    The seismic networks have been designed and operated to support recording on Develocorders (less than 40 db dynamic range) and analog magnetic tape (about 50 db dynamic range). The principal analysis of the records has been based on Develocorder films, and background earth noise levels have been adjusted to be about 1 to 2 mm p-p on the film readers. Since the traces are separated by only 10 to 12 mm on the reader screen, they become hopelessly tangled when signal amplitudes on several adjacent traces exceed 10 to 20 mm p-p. Thus, the background noise level is hardly more than 20 db below the level of the largest readable signals. The situation is somewhat better on tape playbacks, but the high level of background noise set to accommodate processing from film records effectively limits the range of maximum-signal to background-earth-noise on high gain channels to a little more than 30 db. Introduction of the PDP 11/44 seismic data acquisition system has increased the potential dynamic range of recorded network signals to more than 60 db. To make use of this increased dynamic range we must evaluate the characteristics and performance of the seismic system. In particular, we must determine whether the electronic noise in the system is or can be made sufficiently low so that background earth noise levels can be lowered significantly to take advantage of the increased dynamic range of the digital recording system. To come to grips with the complex problem of system noise, we have carried out a number of measurements and experiments to evaluate critical components of the system as well as to determine the noise characteristics of the system as a whole.

  15. Advanced computational tools for 3-D seismic analysis

    SciTech Connect

    Barhen, J.; Glover, C.W.; Protopopescu, V.A.

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis, and to test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 1993-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the development of innovative computational techniques that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  16. Seismic and hydroacoustic analysis relevant to MH370

    SciTech Connect

    Stead, Richard J.

    2014-07-03

    The vicinity of the Indian Ocean is searched for open and readily available seismic and/or hydroacoustic stations that might have recorded a possible impact of MH370 with the ocean surface. Only three stations are identified: the IMS hydrophone arrays H01 and H08, and the Geoscope seismic station AIS. Analysis of the data from these stations shows an interesting arrival on H01 that has some interference from an Antarctic ice event, large-amplitude repeating signals at H08 that obscure any possible arrivals, and large-amplitude chaotic noise at AIS that precludes any analysis at the higher frequencies of interest. The results are therefore rather inconclusive but may point to a more southerly impact location within the overall Indian Ocean search region. The results would be more useful if they could be combined with other data that are not readily available.

  17. CORSSA: Community Online Resource for Statistical Seismicity Analysis

    NASA Astrophysics Data System (ADS)

    Zechar, J. D.; Hardebeck, J. L.; Michael, A. J.; Naylor, M.; Steacy, S.; Wiemer, S.; Zhuang, J.

    2011-12-01

    Statistical seismology is critical to the understanding of seismicity, the evaluation of proposed earthquake prediction and forecasting methods, and the assessment of seismic hazard. Unfortunately, despite its importance to seismology - especially to those aspects with great impact on public policy - statistical seismology is mostly ignored in the education of seismologists, and there is no central repository for the existing open-source software tools. To remedy these deficiencies, and with the broader goal to enhance the quality of statistical seismology research, we have begun building the Community Online Resource for Statistical Seismicity Analysis (CORSSA, www.corssa.org). We anticipate that the users of CORSSA will range from beginning graduate students to experienced researchers. More than 20 scientists from around the world met for a week in Zurich in May 2010 to kick-start the creation of CORSSA: the format and initial table of contents were defined; a governing structure was organized; and workshop participants began drafting articles. CORSSA materials are organized with respect to six themes, each of which will contain between four and eight articles. CORSSA now includes seven articles, with an additional six in draft form, along with forums for discussion, a glossary, and news about upcoming meetings, special issues, and recent papers. Each article is peer-reviewed and presents a balanced discussion, including illustrative examples and code snippets. Topics in the initial set of articles include: introductions to both CORSSA and statistical seismology; basic statistical tests and their role in seismology; understanding seismicity catalogs and their problems; basic techniques for modeling seismicity; and methods for testing earthquake predictability hypotheses. We have also begun curating a collection of statistical seismology software packages.

  18. Building the Community Online Resource for Statistical Seismicity Analysis (CORSSA)

    NASA Astrophysics Data System (ADS)

    Michael, A. J.; Wiemer, S.; Zechar, J. D.; Hardebeck, J. L.; Naylor, M.; Zhuang, J.; Steacy, S.; Corssa Executive Committee

    2010-12-01

    Statistical seismology is critical to the understanding of seismicity, the testing of proposed earthquake prediction and forecasting methods, and the assessment of seismic hazard. Unfortunately, despite its importance to seismology - especially to those aspects with great impact on public policy - statistical seismology is mostly ignored in the education of seismologists, and there is no central repository for the existing open-source software tools. To remedy these deficiencies, and with the broader goal to enhance the quality of statistical seismology research, we have begun building the Community Online Resource for Statistical Seismicity Analysis (CORSSA). CORSSA is a web-based educational platform that is authoritative, up-to-date, prominent, and user-friendly. We anticipate that the users of CORSSA will range from beginning graduate students to experienced researchers. More than 20 scientists from around the world met for a week in Zurich in May 2010 to kick-start the creation of CORSSA: the format and initial table of contents were defined; a governing structure was organized; and workshop participants began drafting articles. CORSSA materials are organized with respect to six themes, each containing between four and eight articles. The CORSSA web page, www.corssa.org, officially unveiled on September 6, 2010, debuts with an initial set of approximately 10 to 15 articles available online for viewing and commenting with additional articles to be added over the coming months. Each article will be peer-reviewed and will present a balanced discussion, including illustrative examples and code snippets. Topics in the initial set of articles will include: introductions to both CORSSA and statistical seismology, basic statistical tests and their role in seismology; understanding seismicity catalogs and their problems; basic techniques for modeling seismicity; and methods for testing earthquake predictability hypotheses. A special article will compare and review

  19. Four-dimensional seismic analysis of the Hibernia oil field, Grand Banks, Canada

    NASA Astrophysics Data System (ADS)

    Wright, Richard James

    2004-12-01

    The seismic reflection method, traditionally a geologic structural imaging tool, is increasingly being utilized for petroleum reservoir monitoring purposes. Time-lapse, or four-dimensional (4D), seismic reservoir monitoring is the process by which repeated 3D seismic surveys are acquired over a common area during the production of a petroleum reservoir in an effort to spatially image production-related changes. While this method, if successful, can have a significant impact on an oil field's development plan, the sometimes subtle nature of 4D seismic signals restricts the universal application of 4D seismic methods in all reservoirs and operating environments. To examine the potential use of 4D seismic on Canada's Grand Banks, this thesis conducts a 4D seismic analysis of the Hibernia oil field---the first example of 4D seismic technology on the Grand Banks. Due to a challenging environment (seismic and reservoir) at Hibernia for 4D seismic success, rock physics modeling predicts a subtle 4D seismic response for areas of both water and gas injection. To equalize the 4D seismic datasets, specialized poststack cross equalization including a volume event warping process is applied to two 3D poststack seismic datasets from the Hibernia oil field, a pre-production "legacy" survey acquired in 1991, and a 2001 survey. The cross equalization processing improves the repeatability of non-reservoir events fieldwide and enhances reservoir anomalies in some areas of the field. While the data contain a fair degree of noise, 4D seismic anomalies above the noise level can be imaged in areas of both water and gas injection. Through interpretation, some of these anomalies are shown to be consistent with modeled responses to water and gas injection. In addition, there is evidence that some of the seismic anomalies may be due to pore pressure changes in the reservoir. The results of the Hibernia 4D seismic analysis are then used as background for a feasibility analysis for

  20. EBR-2 (Experimental Breeder Reactor-2) containment seismic analysis

    SciTech Connect

    Gale, J.G.; Lehto, W.K.

    1990-01-01

    The Experimental Breeder Reactor-2 (EBR-2) is a liquid metal reactor located at the Argonne National Laboratory near Idaho Falls, Idaho. At the time the EBR-2 was designed and constructed, there were no engineering society or federal guidelines specifically directed toward the seismic design of reactor containment structures; hence, static analysis techniques were used in the design. With the increased focus on safety of reactor and fuel reprocessing facilities, Argonne has initiated a program to analyze its existing facilities for seismic integrity using current Department of Energy guidelines and industry consensus standards. A seismic analysis of the EBR-2 containment building has been performed using finite-element analysis techniques. The containment building is essentially a vertical right cylindrical steel shell with heads on both ends. The structure is unique in that the interior of the steel shell is lined with reinforced concrete. The actual containment function of the building is served by the steel shell, whereas the function of the concrete liner is to serve as a missile shield and a thermal insulating shield to protect the steel containment shell from internally generated missiles and fires. Model development and structural evaluation of the EBR-2 containment building are discussed in this paper. 7 refs., 8 figs.

  1. Seismic Noise Analysis and Reduction through Utilization of Collocated Seismic and Atmospheric Sensors at the GRO Chile Seismic Network

    NASA Astrophysics Data System (ADS)

    Farrell, M. E.; Russo, R. M.

    2013-12-01

    The installation of Earthscope Transportable Array-style geophysical observatories in Chile expands open data seismic recording capabilities in the southern hemisphere by nearly 30%, and has nearly tripled the number of seismic stations providing freely-available data in southern South America. Through the use of collocated seismic and atmospheric sensors at these stations we are able to analyze how local atmospheric conditions generate seismic noise, which can degrade data in seismic frequency bands at stations in the 'roaring forties' (S latitudes). Seismic vaults that are climate-controlled and insulated from the local environment are now employed throughout the world in an attempt to isolate seismometers from as many noise sources as possible. However, this is an expensive solution that is neither practical nor possible for all seismic deployments; and also, the increasing number and scope of temporary seismic deployments has resulted in the collection and archiving of terabytes of seismic data that is affected to some degree by natural seismic noise sources such as wind and atmospheric pressure changes. Changing air pressure can result in a depression and subsequent rebound of Earth's surface - which generates low frequency noise in seismic frequency bands - and even moderate winds can apply enough force to ground-coupled structures or to the surface above the seismometers themselves, resulting in significant noise. The 10 stations of the permanent Geophysical Reporting Observatories (GRO Chile), jointly installed during 2011-12 by IRIS and the Chilean Servicio Sismológico, include instrumentation in addition to the standard three seismic components. These stations, spaced approximately 300 km apart along the length of the country, continuously record a variety of atmospheric data including infrasound, air pressure, wind speed, and wind direction. The collocated seismic and atmospheric sensors at each station allow us to analyze both datasets together, to

  2. The concept of 'vulnerability' in research ethics: an in-depth analysis of policies and guidelines.

    PubMed

    Bracken-Roche, Dearbhail; Bell, Emily; Macdonald, Mary Ellen; Racine, Eric

    2017-02-07

    The concept of vulnerability has held a central place in research ethics guidance since its introduction in the United States Belmont Report in 1979. It signals to researchers and research ethics boards the possibility that some participants may be at higher risk of harm or wrong. Despite its important intended purpose and widespread use, there is considerable disagreement in the scholarly literature about the meaning and delineation of vulnerability, stemming from a perceived lack of guidance within research ethics standards. The aim of this study was to assess the concept of vulnerability as it is employed in major national and international research ethics policies and guidelines. We conducted an in-depth analysis of 11 (five national and six international) research ethics policies and guidelines, exploring their discussions of the definition, application, normative justification and implications of vulnerability. Few policies and guidelines explicitly defined vulnerability, instead relying on implicit assumptions and the delineation of vulnerable groups and sources of vulnerability. On the whole, we found considerable richness in the content on vulnerability across policies, but note that this relies heavily on the structure imposed on the data through our analysis. Our results underscore a need for policymakers to revisit the guidance on vulnerability in research ethics, and we propose that a process of stakeholder engagement would well support this effort.

  3. Multidimensional analysis and probabilistic model of volcanic and seismic activities

    NASA Astrophysics Data System (ADS)

    Fedorov, V.

    2009-04-01

    .I. Gushchenko, 1979) and seismological (database of USGS/NEIC Significant Worldwide Earthquakes, 2150 B.C.-1994 A.D.) information which displays the dynamics of endogenic relief-forming processes over the period 1900 to 1994. In the course of the analysis, a substitution of the calendar variable by a corresponding astronomical one has been performed and the epoch superposition method was applied. In essence, the method consists in differentiating the bodies of information on volcanic eruptions (over the period 1900 to 1977) and seismic events (1900-1994) with respect to the values of the astronomical parameters corresponding to the calendar dates of the known eruptions and earthquakes, regardless of the calendar year. The obtained spectra of the distribution of volcanic eruptions and violent earthquakes in the fields of the Earth's orbital movement parameters were used as a basis for calculating frequency spectra and the diurnal probability of volcanic and seismic activity. The objective of the proposed investigations is the development of a probabilistic model of volcanic and seismic events, as well as the design of a GIS for monitoring and forecasting volcanic and seismic activity. In accordance with the stated objective, three probability parameters have been found in the course of preliminary studies; they form the basis for GIS monitoring and forecast development. 1. A multidimensional analysis of volcanic eruptions and earthquakes (of magnitude 7) has been performed in terms of the Earth's orbital movement. Probability characteristics of volcanism and seismicity have been defined for the Earth as a whole. Time intervals have been identified with a diurnal probability twice as great as the mean value. The diurnal probability of volcanic and seismic events has been calculated up to 2020. 2. A regularity in the duration of dormant (repose) periods has been established: a relationship has been found between the distribution of the repose-period probability density and the duration of the period. 3

  4. SeismicWaveTool: Continuous and discrete wavelet analysis and filtering for multichannel seismic data

    NASA Astrophysics Data System (ADS)

    Galiana-Merino, J. J.; Rosa-Herranz, J. L.; Rosa-Cintas, S.; Martinez-Espla, J. J.

    2013-01-01

    A MATLAB-based computer code has been developed for the simultaneous wavelet analysis and filtering of multichannel seismic data. The considered time-frequency transforms include the continuous wavelet transform, the discrete wavelet transform and the discrete wavelet packet transform. The developed approaches provide a fast and precise time-frequency examination of the seismograms at different frequency bands. Moreover, filtering methods for noise, transients or even baseline removal are implemented. The primary motivation is to support seismologists with a user-friendly and fast program for wavelet analysis, providing practical and understandable results.
    Program summary:
    Program title: SeismicWaveTool
    Catalogue identifier: AENG_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AENG_v1_0.html
    Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
    Licensing provisions: Standard CPC license, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 611072
    No. of bytes in distributed program, including test data, etc.: 14688355
    Distribution format: tar.gz
    Programming language: MATLAB (MathWorks Inc.) version 7.8.0.347 (R2009a) or higher. Wavelet Toolbox is required.
    Computer: Developed on a MacBook Pro. Tested on Mac and PC. No computer-specific optimization was performed.
    Operating system: Any supporting MATLAB (MathWorks Inc.) v7.8.0.347 (R2009a) or higher. Tested on Mac OS X 10.6.8, Windows XP and Vista.
    Classification: 13.
    Nature of problem: Numerous research works have developed a great number of free or commercial wavelet-based software packages, which provide specific solutions for the analysis of seismic data. On the other hand, standard toolboxes, packages or libraries, such as the MathWorks' Wavelet Toolbox for MATLAB, offer command-line functions and interfaces for the wavelet analysis of one-component signals. Thus, software is usually focused on very specific problems
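
    SeismicWaveTool itself is MATLAB code; as a rough, language-independent illustration of the DWT filtering idea it implements, the sketch below soft-thresholds the wavelet detail coefficients of a synthetic single-channel trace using PyWavelets. The wavelet choice, threshold rule and test signal are assumptions, not the program's defaults.

```python
import numpy as np
import pywt

# Hedged sketch of DWT-based filtering in the spirit of tools like
# SeismicWaveTool, using PyWavelets instead of MATLAB: decompose the channel,
# soft-threshold the detail coefficients, and reconstruct a denoised trace.

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 2000)
clean = np.sin(2 * np.pi * 1.5 * t) * np.exp(-0.3 * t)   # toy "seismic" channel
noisy = clean + 0.3 * rng.standard_normal(t.size)

def dwt_denoise(x, wavelet="db4", level=5):
    coeffs = pywt.wavedec(x, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745        # MAD noise estimate
    thr = sigma * np.sqrt(2 * np.log(x.size))             # universal threshold
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: x.size]

denoised = dwt_denoise(noisy)
print("RMS error before/after:",
      np.sqrt(np.mean((noisy - clean) ** 2)),
      np.sqrt(np.mean((denoised - clean) ** 2)))
```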

  5. Mapping Upper Mantle Seismic Discontinuities Using Singular Spectrum Analysis

    NASA Astrophysics Data System (ADS)

    Gu, Y. J.; Dokht, R.; Sacchi, M. D.

    2015-12-01

    Seismic discontinuities are fundamental to the understanding of mantle composition and dynamics. Their depth and impedance are generally determined using secondary seismic phases, most commonly SS precursors and P-to-S converted waves. However, the analysis and interpretation using these approaches often suffer from incomplete data coverage, high noise levels and interfering seismic phases, especially near tectonically complex regions such as subduction zones and continental margins. To overcome these pitfalls, we apply Singular Spectrum Analysis (SSA) to remove random noise, reconstruct missing traces and enhance the robustness of SS precursors and P-to-S conversions from seismic discontinuities. Our method takes advantage of the predictability of time series in the frequency-space domain and performs a rank reduction using a singular value decomposition of the trajectory matrix. We apply SSA to synthetic record sections as well as observations of 1) SS precursors beneath the northwestern Pacific subduction zones, and 2) P-to-S converted waves from the Western Canada Sedimentary Basin (WCSB). In comparison with raw or interpolated data, the SSA-enhanced reflectivity maps show a greater resolution and a stronger negative correlation between the depths of the 410 and 660 km discontinuities. These effects can be attributed to the suppression of incoherent noise, which tends to reduce the signal amplitude during normal averaging procedures, through rank reduction and the emphasis of principal singular values. Our new results suggest a more laterally coherent 520 km reflection in the western Pacific regions. Similar improvements in data imaging are achieved in western Canada, where strong lateral variations in discontinuity topography are observed in the craton-Cordillera boundary zone. Improvements from SSA relative to conventional approaches are most notable in under-sampled regions.
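
    As a minimal illustration of the rank-reduction step described above, the following single-channel SSA sketch embeds a series in a Hankel trajectory matrix, keeps the leading singular values, and averages anti-diagonals back into a time series; the window length and rank are arbitrary illustrative choices.

```python
import numpy as np

# Minimal single-channel SSA sketch: embed the series in a Hankel (trajectory)
# matrix, keep the leading singular values (rank reduction), and average the
# anti-diagonals to recover a denoised series.

def ssa_denoise(x, window=50, rank=3):
    n = x.size
    k = n - window + 1
    traj = np.column_stack([x[i:i + window] for i in range(k)])  # Hankel matrix
    u, s, vt = np.linalg.svd(traj, full_matrices=False)
    low = (u[:, :rank] * s[:rank]) @ vt[:rank]                   # rank reduction
    # Diagonal (Hankel) averaging back to a 1-D series
    out = np.zeros(n)
    cnt = np.zeros(n)
    for j in range(k):
        out[j:j + window] += low[:, j]
        cnt[j:j + window] += 1
    return out / cnt

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 400)
signal = np.sin(2 * np.pi * 5 * t)
noisy = signal + 0.5 * rng.standard_normal(t.size)
print("residual std after SSA:", np.std(ssa_denoise(noisy) - signal))
```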

  6. Understanding North Texas Seismicity: A Joint Analysis of Seismic Data and 3D Pore Pressure Modeling

    NASA Astrophysics Data System (ADS)

    DeShon, H. R.; Hornbach, M. J.; Ellsworth, W. L.; Oldham, H. R.; Hayward, C.; Stump, B. W.; Frohlich, C.; Olson, J. E.; Luetgert, J. H.

    2014-12-01

    In November 2013, a series of earthquakes began along a mapped ancient fault system near Azle, Texas. The Azle events are the third felt earthquake sequence in the Fort Worth (Barnett Shale) Basin since 2008, and several production and injection wells in the area are drilled to depths near the recent seismic activity. Understanding if and/or how injection and removal of fluids in the crystalline crust reactivate faults has important implications for seismology, the energy industry, and society. We assessed whether the Azle earthquakes were induced using a joint analysis of the earthquake data, subsurface geology and fault structure, and 3D pore pressure modeling. Using a 12-station temporary seismic deployment, we have recorded and located >300 events large enough to be recorded on multiple stations and thousands of events during periods of swarm activity. High-resolution locations and focal mechanisms indicate that events occurred on NE-SW trending, steeply dipping normal faults associated with the southern end of the Newark East Fault Zone, with hypocenters between 2 and 8 km depth. We considered multiple causes that might have changed stress along this system. Earthquakes resulting from natural processes, though perhaps unlikely in this historically inactive region, can be neither ruled out nor confirmed due to lack of information on the natural stress state of these faults. Analysis of lake and groundwater variations near Azle showed that no significant stress changes occurred prior to or during the earthquake sequence. In contrast, analysis of pore-pressure models shows that the combination of formation water production and wastewater injection near the fault could have caused pressure increases that induced earthquakes on near-critically stressed faults.

  7. Real Option Cost Vulnerability Analysis of Electrical Infrastructure

    NASA Astrophysics Data System (ADS)

    Prime, Thomas; Knight, Phil

    2015-04-01

    Critical infrastructure such as electricity substations are vulnerable to various geo-hazards that arise from climate change. These geo-hazards range from increased vegetation growth to increased temperatures and flood inundation. Of all the identified geo-hazards, coastal flooding has the greatest impact but has to date had a low probability of occurring. However, in the face of climate change, coastal flooding is likely to occur more often as extreme water levels are experienced more frequently due to sea-level rise (SLR). Knowing what impact coastal flooding will have, now and in the future, on critical infrastructure such as electrical substations is important for long-term management. Using a flood inundation model, present-day and future flood events have been simulated, from 1-in-1-year events up to 1-in-10,000-year events. The modelling makes an integrated assessment of impact by using sea level and surge to simulate a storm tide. The geographical area the model covers is part of the northwest UK coastline, with a range of urban and rural areas. The ensemble of flood maps generated allows the identification of critical infrastructure exposed to coastal flooding. Vulnerability has been assessed using an Estimated Annual Damage (EAD) value. Sampling SLR annual probability distributions produces a projected "pathway" for SLR up to 2100. EAD is then calculated using a relationship derived from the flood model. Repeating the sampling process allows a distribution of EAD up to 2100 to be produced. These values are discounted to present-day values using an appropriate discount rate. If the cost of building and maintaining defences is also subtracted, a Net Present Value (NPV) of building the defences can be calculated. This distribution of NPV can be used as part of a cost-modelling process involving Real Options. A real option is the right, but not the obligation, to undertake investment decisions. In terms of investment in critical infrastructure resilience this
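
    The abstract's Monte Carlo chain can be sketched as follows, with every parameter invented for illustration (the paper's SLR distributions, EAD relationship, discount rate and defence costs are not given here): sample SLR pathways, map them to EAD, discount, and subtract defence costs to obtain an NPV distribution.

```python
import numpy as np

# Hedged sketch of the sample -> EAD -> discount -> NPV chain with made-up
# parameters: an assumed SLR pathway model, an assumed EAD(SLR) relationship,
# and an assumed defence cost, none of which come from the paper.

rng = np.random.default_rng(42)
years = np.arange(2025, 2101)
n_runs = 10_000
r = 0.035                                    # illustrative discount rate

# Assumed SLR pathways: linear trend scaled by lognormal pathway uncertainty
trend = 0.008 * (years - years[0])           # metres of SLR
slr = trend * rng.lognormal(0.0, 0.25, size=(n_runs, 1))

ead = 2.0e6 * np.exp(2.5 * slr)              # assumed EAD(SLR) relationship, GBP/yr
disc = (1 + r) ** -(years - years[0])        # discount factors to present day

avoided = (ead * disc).sum(axis=1)           # PV of damages avoided by defences
defence_cost = 60e6                          # assumed build + maintenance PV
npv = avoided - defence_cost                 # distribution of NPV across runs

print(f"P(NPV > 0) = {np.mean(npv > 0):.2f}, "
      f"median NPV = {np.median(npv) / 1e6:.1f} M")
```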

  8. Cluster Computing For Real Time Seismic Array Analysis.

    NASA Astrophysics Data System (ADS)

    Martini, M.; Giudicepietro, F.

    A seismic array is an instrument composed of a dense distribution of seismic sensors that allows measurement of the directional properties of the wavefield (slowness or wavenumber vector) radiated by a seismic source. Over the last years, arrays have been widely used in different fields of seismological research. In particular, they are applied in the investigation of seismic sources on volcanoes, where they can be successfully used for studying the volcanic microtremor and long-period events which are critical for getting information on the evolution of volcanic systems. For this reason arrays could be usefully employed for volcano monitoring; however, the huge amount of data produced by this type of instrument and the quite time-consuming processing techniques have limited their potential for this application. In order to favor a direct application of array techniques to continuous volcano monitoring, we designed and built a small PC cluster able to compute in near real time the kinematic properties of the wavefield (slowness or wavenumber vector) produced by local seismic sources. The cluster is composed of 8 dual-processor Intel Pentium-III PCs working at 550 MHz and has 4 gigabytes of RAM. It runs under the Linux operating system. The developed analysis software package is based on the Multiple SIgnal Classification (MUSIC) algorithm and is written in Fortran. The message-passing part is based upon the LAM programming environment package, an open-source implementation of the Message Passing Interface (MPI). The developed software system includes modules devoted to receiving data via the Internet and graphical applications for the continuous display of the processing results. The system has been tested with a data set collected during a seismic experiment conducted on Etna in 1999, when two dense seismic arrays were deployed on the northeast and southeast flanks of the volcano. A real-time continuous acquisition system has been simulated by
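
    For readers unfamiliar with the MUSIC step, here is a hedged toy version in Python rather than the authors' Fortran: the slowness vector of a monochromatic plane wave crossing a small array is recovered from the noise subspace of the sensor covariance matrix. The array geometry, frequency, noise level and grid are all invented.

```python
import numpy as np

# Hedged toy MUSIC estimate of a plane wave's slowness vector across a small
# array: eigendecompose the sensor covariance, take the noise subspace, and
# scan a slowness grid for the peak of the MUSIC pseudo-spectrum.

rng = np.random.default_rng(0)
f = 2.0                                        # Hz
xy = rng.uniform(-500, 500, size=(8, 2))       # sensor coordinates, metres
s_true = np.array([0.4e-3, -0.2e-3])           # slowness, s/m (~2.2 km/s)

nt = 2000
t = np.arange(nt) / 100.0
delays = xy @ s_true                           # plane-wave delay per sensor
data = np.exp(2j * np.pi * f * (t[None, :] - delays[:, None]))
data += 0.5 * (rng.standard_normal(data.shape)
               + 1j * rng.standard_normal(data.shape))

cov = data @ data.conj().T / nt
w, v = np.linalg.eigh(cov)                     # eigenvalues ascending
noise = v[:, :-1]                              # noise subspace (1 source assumed)

grid = np.linspace(-1e-3, 1e-3, 201)
best, s_hat = -np.inf, None
for a in grid:
    for b in grid:
        steer = np.exp(-2j * np.pi * f * (xy @ np.array([a, b])))
        p = 1.0 / np.linalg.norm(noise.conj().T @ steer) ** 2  # MUSIC spectrum
        if p > best:
            best, s_hat = p, (a, b)
print("estimated slowness (s/m):", s_hat, "true:", s_true)
```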

  9. Source-Type Identification Analysis Using Regional Seismic Moment Tensors

    NASA Astrophysics Data System (ADS)

    Chiang, A.; Dreger, D. S.; Ford, S. R.; Walter, W. R.

    2012-12-01

    Waveform inversion to determine the seismic moment tensor is a standard approach for determining the source mechanism of natural and manmade seismicity, and may be used to identify, or discriminate, different types of seismic sources. The successful applications of the regional moment tensor method at the Nevada Test Site (NTS) and to the 2006 and 2009 North Korean nuclear tests (Ford et al., 2009a, 2009b, 2010) show that the method is robust and capable of source-type discrimination at regional distances. The well-separated populations of explosions, earthquakes and collapses on a Hudson et al. (1989) source-type diagram enable source-type discrimination; however, the question remains whether the separation of events is universal in other regions, where we have limited station coverage and knowledge of Earth structure. Ford et al. (2012) have shown that combining regional waveform data and P-wave first motions removes the CLVD-isotropic tradeoff and uniquely discriminates the 2009 North Korean test as an explosion. Therefore, including additional constraints from regional and teleseismic P-wave first motions enables source-type discrimination in regions with limited station coverage. We present moment tensor analysis of earthquakes and explosions (M6) from the Lop Nor and Semipalatinsk test sites for station paths crossing Kazakhstan and Western China. We also present analyses of smaller events from industrial sites. In these sparse-coverage situations we combine regional long-period waveforms and high-frequency P-wave polarity from the same stations, as well as from teleseismic arrays, to constrain the source type. Discrimination capability with respect to velocity model and station coverage is examined, and additionally we investigate the velocity-model dependence of vanishing free-surface-traction effects on seismic moment tensor inversion of shallow sources and recovery of the explosive scalar moment. Our synthetic data tests indicate that biases in scalar

  10. Seismic fragility analysis of highway bridges considering multi-dimensional performance limit state

    NASA Astrophysics Data System (ADS)

    Wang, Qi'ang; Wu, Ziyan; Liu, Shukui

    2012-03-01

    Fragility analysis for highway bridges has become increasingly important in the risk assessment of highway transportation networks exposed to seismic hazards. This study introduces a methodology to calculate fragility that considers multi-dimensional performance limit state parameters and makes a first attempt to develop fragility curves for a multispan continuous (MSC) concrete girder bridge considering two performance limit state parameters: column ductility and transverse deformation in the abutments. The main purpose of this paper is to show that the performance limit states, which are compared with the seismic response parameters in the calculation of fragility, should be properly modeled as randomly interdependent variables instead of deterministic quantities. The sensitivity of fragility curves is also investigated when the dependency between the limit states is different. The results indicate that the proposed method can be used to describe the vulnerable behavior of bridges which are sensitive to multiple response parameters and that the fragility information generated by this method will be more reliable and likely to be implemented into transportation network loss estimation.
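
    The paper's core point - that limit states should be random, interdependent variables rather than deterministic thresholds - can be illustrated with a small Monte Carlo sketch. The demand models, limit-state medians, dispersions and correlation below are invented numbers, not the paper's.

```python
import numpy as np

# Hedged sketch: compare a bridge fragility curve computed with deterministic
# limit states against one where the two limit states (column ductility and
# abutment deformation) are random and correlated. All numbers are invented.

rng = np.random.default_rng(7)
n = 20_000
pga = np.linspace(0.05, 1.5, 30)               # intensity measure grid, g

def fragility(random_limits):
    probs = []
    for im in pga:
        # Lognormal demands given IM (assumed demand models)
        duct = np.exp(np.log(2.0 * im) + 0.45 * rng.standard_normal(n))
        disp = np.exp(np.log(40.0 * im) + 0.50 * rng.standard_normal(n))  # mm
        if random_limits:
            # Correlated lognormal limit states (rho = 0.5, assumed)
            z1 = rng.standard_normal(n)
            z2 = 0.5 * z1 + np.sqrt(1 - 0.5**2) * rng.standard_normal(n)
            lim_d = np.exp(np.log(3.0) + 0.3 * z1)
            lim_x = np.exp(np.log(75.0) + 0.3 * z2)
        else:
            lim_d, lim_x = 3.0, 75.0
        probs.append(np.mean((duct > lim_d) | (disp > lim_x)))  # series system
    return np.array(probs)

for flag in (False, True):
    p = fragility(flag)
    print("random limits" if flag else "fixed limits ", np.round(p[::6], 3))
```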

  11. Vulnerability analysis and critical areas identification of the power systems under terrorist attacks

    NASA Astrophysics Data System (ADS)

    Wang, Shuliang; Zhang, Jianhua; Zhao, Mingwei; Min, Xu

    2017-05-01

    This paper takes the central China power grid (CCPG) as an example and analyzes the vulnerability of power systems under terrorist attacks. To simulate the intelligence of terrorist attacks, a method of critical attack area identification based on community structures is introduced. Meanwhile, three types of vulnerability models and the corresponding vulnerability metrics are given for comparative analysis. On this basis, the influence of terrorist attacks on different critical areas is studied and the vulnerability of the different critical areas is identified. At the same time, the vulnerabilities of critical areas under different tolerance parameters and different vulnerability models are obtained and compared. Results show that only a small number of vertex disruptions may cause some critical areas to collapse completely and can generate great performance losses for the whole system. Furthermore, the variation of vulnerability values under different scenarios is very large. Critical areas which can cause greater damage under terrorist attacks should be given priority for protection to reduce vulnerability. The proposed method can be applied to analyze the vulnerability of other infrastructure systems and can help decision makers search for mitigation actions and optimum protection strategies.
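
    A hedged sketch of the community-based critical-area idea follows: a toy graph is partitioned into communities, and each community is scored by the connectivity loss its removal causes. The graph and the loss metric are illustrative stand-ins for the paper's power-grid model and vulnerability metrics.

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Hedged sketch of critical-attack-area identification: partition a toy graph
# into communities, then score each community by the residual connectivity
# after its removal (an illustrative metric, not the paper's exact model).

g = nx.random_geometric_graph(60, 0.25, seed=3)     # toy grid topology
communities = list(greedy_modularity_communities(g))

def largest_cc_fraction(h):
    # Fraction of original nodes in the largest surviving connected component
    if h.number_of_nodes() == 0:
        return 0.0
    return max(len(c) for c in nx.connected_components(h)) / g.number_of_nodes()

for i, com in enumerate(communities):
    h = g.copy()
    h.remove_nodes_from(com)                        # simulate attack on the area
    print(f"community {i} (size {len(com)}): "
          f"residual largest component = {largest_cc_fraction(h):.2f}")
```

    Communities whose removal leaves the smallest residual component would be flagged as the critical areas to protect first.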

  12. Classification of aquifer vulnerability using K-means cluster analysis

    NASA Astrophysics Data System (ADS)

    Javadi, S.; Hashemy, S. M.; Mohammadi, K.; Howard, K. W. F.; Neshat, A.

    2017-06-01

    Groundwater is one of the main sources of drinking and agricultural water in arid and semi-arid regions but is becoming increasingly threatened by contamination. Vulnerability mapping has been used for many years as an effective tool for assessing the potential for aquifer pollution, and the most common method of intrinsic vulnerability assessment is DRASTIC (Depth to water table, net Recharge, Aquifer media, Soil media, Topography, Impact of vadose zone and hydraulic Conductivity). An underlying problem with the DRASTIC approach relates to the subjectivity involved in selecting relative weightings for each of the DRASTIC factors and assigning rating values to ranges or media types within each factor. In this study, a clustering technique is introduced that removes some of the subjectivity associated with the indexing method. It creates a vulnerability map that does not rely on fixed weights and ratings and thereby provides a more objective representation of the system's physical characteristics. This methodology was applied to an aquifer in Iran and compared with the standard DRASTIC approach using the water quality parameters nitrate, chloride and total dissolved solids (TDS) as surrogate indicators of aquifer vulnerability. The proposed method required only four of DRASTIC's seven factors - depth to groundwater, hydraulic conductivity, recharge value and the nature of the vadose zone - to produce a superior result. For nitrate, chloride, and TDS, respectively, the clustering approach delivered Pearson correlation coefficients that were 15, 22 and 5 percentage points higher than those obtained for the DRASTIC method.
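
    A minimal sketch of the clustering alternative described above, using scikit-learn on synthetic values of the four retained factors; the factor distributions and the vulnerability-ranking heuristic are assumptions for illustration only.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hedged sketch of the K-means alternative to DRASTIC: cluster cells on four
# (synthetic) aquifer factors, then rank clusters with a simple heuristic
# instead of fixed DRASTIC weights and ratings. All values are invented.

rng = np.random.default_rng(11)
n = 500
X = np.column_stack([
    rng.uniform(2, 60, n),       # depth to groundwater (m)
    rng.lognormal(1.0, 0.8, n),  # hydraulic conductivity (m/day)
    rng.uniform(20, 300, n),     # recharge (mm/yr)
    rng.integers(1, 6, n),       # vadose-zone rating (ordinal)
])

km = KMeans(n_clusters=4, n_init=10, random_state=0)
labels = km.fit_predict(StandardScaler().fit_transform(X))

# Heuristic ranking: shallower water and higher conductivity/recharge imply
# higher pollution potential (illustrative, not the paper's validation).
centers = np.array([X[labels == k].mean(axis=0) for k in range(4)])
score = -centers[:, 0] + centers[:, 1] + 0.1 * centers[:, 2] + centers[:, 3]
print("clusters ordered least -> most vulnerable:", np.argsort(score))
```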

  13. Application of Visual Attention in Seismic Attribute Analysis

    NASA Astrophysics Data System (ADS)

    He, M.; Gu, H.; Wang, F.

    2016-12-01

    It has been proved that seismic attributes can be used to predict reservoir properties. The combination of multi-attribute analysis with geostatistics, data mining and artificial intelligence has further promoted the development of seismic attribute analysis. However, the existing methods tend to have multiple solutions and insufficient generalization ability, which is mainly due to the complex relationship between seismic data and geological information, and partly due to the methods applied. Visual attention is a mechanism model of the human visual system that can concentrate on a few significant visual objects rapidly, even in a cluttered scene, and it offers good target detection and recognition ability. In our study, the targets to be predicted are treated as visual objects, and an object representation based on well data is built in the attribute dimensions. Then, in the same attribute space, the representation serves as a criterion for searching for potential targets away from the wells. This method does not predict properties by building a complicated relation between attributes and reservoir properties, but by reference to the previously determined standard. It therefore has good generalization ability, and the problem of multiple solutions can be mitigated by defining a similarity threshold.

  14. Seismic vertical array analysis for phase decomposition

    NASA Astrophysics Data System (ADS)

    Yoshida, Kunikazu; Sasatani, Tsutomu

    2008-08-01

    We propose a vertical array analysis method that decomposes complex seismograms into body and surface wave time histories by using a velocity structure at the vertical array site. We assume that the vertical array records are the sum of vertically incident plane P and S waves, and laterally incident Love and Rayleigh waves. Each phase at the surface is related to that at a certain depth by the transfer function in the frequency domain; the transfer function is obtained by Haskell's matrix method, assuming a 1-D velocity structure. Decomposed P, S and surface waves at the surface are estimated from the vertical array records and the transfer functions by using a least-squares method in the frequency domain; their time histories are obtained by the inverse Fourier transform. We carried out numerical tests of this method based on synthetic vertical array records consisting of vertically incident plane P and S waves and laterally incident plane Love and Rayleigh waves. Perfect results of the decomposed P, S, Love and Rayleigh waves were obtained for synthetic records without noise. A test of the synthetic records in which a small amount of white noise was added yielded a reasonable result for the decomposed P, S and surface waves. We applied this method to real vertical array records from the Ashigara valley, a moderate-sized sedimentary valley. The array records from two earthquakes occurring at depths of 123 and 148 km near the array (epicentral distance of about 31 km) exhibited long-duration later phases. The analysis showed that duration of the decomposed S waves was a few seconds and that the decomposed surface waves appeared a few seconds after the direct S-wave arrival and had very long duration. This result indicated that the long-duration later phases were generated not by multireflected S waves, but by basin-induced surface waves.

  15. Spectrum analysis techniques for personnel detection using seismic sensors

    NASA Astrophysics Data System (ADS)

    Houston, Kenneth M.; McGaffigan, Daniel P.

    2003-09-01

    There is a general need for improved detection range and false alarm performance for seismic sensors used for personnel detection. In this paper we describe a novel footstep detection algorithm which was developed and run on seismic footstep data collected at the Aberdeen Proving Ground in December 2000. The initial focus was an assessment of achievable detection range. The conventional approach to footstep detection is to detect transients corresponding to individual footfalls. We feel this is an error-prone approach. Because many real-world signals unrelated to human locomotion look like transients, transient-based footstep detection will inevitably either suffer from high false alarm rates or will be insensitive. Instead, we examined the use of spectrum analysis on envelope-detected seismic signals and have found the general method to be quite promising, not only for detection, but also for discrimination against other types of seismic sources. In particular, gait patterns and their corresponding signatures may help discriminate between human intruders and animals. In the APG data set, mean detection ranges of 64 meters (at PD=50%) were observed for normal walking, significantly improving on ranges previously reported. For running, mean detection ranges of 84 meters were observed. However, stealthy walking (creeping) remains a considerable problem. Even at short ranges (10 meters), in some cases the detection rate was less than 50%. In future efforts, additional data sets for a range of geologic and environmental conditions should be acquired and analyzed. Improvements to the detection algorithms are possible, including estimation of direction of travel and the number of intruders.
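
    The envelope-spectrum idea can be sketched compactly: instead of detecting individual footfall transients, look for a spectral peak near walking cadence in the envelope of the trace. The synthetic footstep wavelet and band limits below are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np
from scipy.signal import hilbert

# Hedged sketch of envelope-spectrum footstep detection: detect a periodic
# gait as a peak near walking cadence (~2 Hz) in the spectrum of the trace
# envelope, rather than detecting individual footfall transients.

fs = 200.0
t = np.arange(0, 20, 1 / fs)
rng = np.random.default_rng(5)

# Synthetic footsteps: a short damped wavelet every 0.5 s (2 steps/s) in noise
trace = 0.4 * rng.standard_normal(t.size)
for t0 in np.arange(1.0, 19.0, 0.5):
    idx = (t >= t0) & (t < t0 + 0.1)
    trace[idx] += np.sin(2 * np.pi * 40 * (t[idx] - t0)) * np.exp(-60 * (t[idx] - t0))

env = np.abs(hilbert(trace))                 # envelope detection
env -= env.mean()                            # remove DC before the FFT
spec = np.abs(np.fft.rfft(env))
freqs = np.fft.rfftfreq(env.size, 1 / fs)

band = (freqs > 0.5) & (freqs < 4.0)         # plausible cadence band
print(f"cadence peak at {freqs[band][np.argmax(spec[band])]:.2f} Hz")
```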

  16. Comparative seismic evaluation between numerical analysis and Italian guidelines on cultural heritage applied to the case study of a masonry building compound

    NASA Astrophysics Data System (ADS)

    Formisano, Antonio; Chiumiento, Giovanni; Fabbrocino, Francesco; Landolfo, Raffaele

    2017-07-01

    The general objective of this work is to draw attention to the issue of seismic vulnerability analysis of masonry building compounds, which characterise most Italian historic towns. The study is based on the analysis of an aggregated construction located in the town of Arsita (Teramo, Italy) that was damaged by the 2009 L'Aquila earthquake. A comparison between the seismic verifications carried out using the 3Muri commercial software and those deriving from the application of the Italian Guidelines on Cultural Heritage has been performed. The comparison has shown that the Guidelines provide results on the safe side in predicting the seismic behaviour of the building compound under study. Further analyses should be performed with the aim of suggesting modifications of the simplified calculation method used, to better interpret the behaviour of building compounds under earthquake loading.

  17. Seismic analysis of reactor exhaust-air Filter Compartment

    SciTech Connect

    Gong, C.; Funderburk, E.L.; Jerrel, J.W.; Vashi, K.M.

    1991-12-31

    This paper presents the results of a scoping analysis for assessment of the seismic adequacy of a Filter Compartment (FC) that is part of an Airborne Activity Confinement System (AACS) in the K, L, and P Reactors at the Savannah River Site (SRS). For an expeditious assessment and to increase the possibility of showing the adequacy of the FC, the finite element model incorporated certain conceptual reinforcing modifications suggested by a previous study. The model also set the vertical displacements at zero at the interface between the FC and the rail dolly, upon which the FC rests by gravity. In addition, the rail-dolly was assumed to be rigid and rigidly attached to the rails. The analysis was performed using the dynamic modal superposition response spectra capability of the ABAQUS computer code. Certain modelling approximations and linearized representations of boundary conditions were employed for utilization of the code and the selected analysis capability. The analysis results showed that the FC stresses and deformations were within the yield limit and that the structural integrity of the FC and the operability of the filters can be preserved as required for the defined seismic event, consistent with the linearization assumptions, modelling simplifications, and incorporation of the conceptual reinforcing modifications. However, the rail-dolly rigidity and the FC hold-down to the rails must be ensured for this scoping analysis to be valid. 2 refs.

  18. Seismic analysis of reactor exhaust-air Filter Compartment

    SciTech Connect

    Gong, C.; Funderburk, E.L.; Jerrel, J.W.; Vashi, K.M.

    1991-01-01

    This paper presents the results of a scoping analysis for assessment of the seismic adequacy of a Filter Compartment (FC) that is part of an Airborne Activity Confinement System (AACS) in the K, L, and P Reactors at the Savannah River Site (SRS). For an expeditious assessment and to increase the possibility of showing the adequacy of the FC, the finite element model incorporated certain conceptual reinforcing modifications suggested by a previous study. The model also set the vertical displacements at zero at the interface between the FC and the rail dolly, upon which the FC rests by gravity. In addition, the rail-dolly was assumed to be rigid and rigidly attached to the rails. The analysis was performed using the dynamic modal superposition response spectra capability of the ABAQUS computer code. Certain modelling approximations and linearized representations of boundary conditions were employed for utilization of the code and the selected analysis capability. The analysis results showed that the FC stresses and deformations were within the yield limit and that the structural integrity of the FC and the operability of the filters can be preserved as required for the defined seismic event, consistent with the linearization assumptions, modelling simplifications, and incorporation of the conceptual reinforcing modifications. However, the rail-dolly rigidity and the FC hold-down to the rails must be ensured for this scoping analysis to be valid. 2 refs.

  19. ASSESSMENT OF SEISMIC ANALYSIS METHODOLOGIES FOR DEEPLY EMBEDDED NPP STRUCTURES.

    SciTech Connect

    XU, J.; MILLER, C.; COSTANTINO, C.; HOFMAYER, C.; GRAVES, H. .

    2005-07-01

    Several of the new generation nuclear power plant designs have structural configurations which are proposed to be deeply embedded. Since current seismic analysis methodologies have been applied to shallowly embedded structures (e.g., ASCE 4 suggests that simple formulations may be used to model embedment effects when the depth of embedment is less than 30% of the foundation radius), the US Nuclear Regulatory Commission is sponsoring a program at Brookhaven National Laboratory with the objective of investigating the extent to which procedures acceptable for shallow embedment depths are adequate for larger embedment depths. This paper presents the results of a study comparing the response spectra obtained from two of the more popular analysis methods for structural configurations varying from shallow embedment to complete embedment. A typical safety-related structure embedded in a soil profile representative of a typical nuclear power plant site was utilized in the study, and the depths of burial (DOB) considered range from 25% to 100% of the height of the structure. Included in the paper are: (1) the description of a simplified analysis and a detailed approach for the SSI analyses of a structure with various DOBs, (2) the comparison of the analysis results for the different DOBs between the two methods, and (3) the performance assessment of the analysis methodologies for SSI analyses of deeply embedded structures. The assessment resulting from this study indicates that simplified methods may be capable of capturing the seismic response of much more deeply embedded structures than would normally be allowed by standard practice.

  20. Seismic Fragility Analysis of a Degraded Condensate Storage Tank

    SciTech Connect

    Nie, J.; Braverman, J.; Hofmayer, C.; Choun, Y-S.; Kim, M.K.; Choi, I-K.

    2011-05-16

    The Korea Atomic Energy Research Institute (KAERI) and Brookhaven National Laboratory are conducting a collaborative research project to develop seismic capability evaluation technology for degraded structures and components in nuclear power plants (NPPs). One of the goals of this collaborative endeavor is to develop seismic fragility analysis methods that consider the potential effects of age-related degradation of structures, systems, and components (SSCs). The essential part of this collaboration is aimed at achieving a better understanding of the effects of aging on the performance of SSCs and ultimately on the safety of NPPs. A recent search of the degradation occurrences of structures and passive components (SPCs) showed that the rate of aging-related degradation in NPPs was not significantly large but is increasing as the plants get older. The slow but increasing rate of degradation of SPCs can potentially affect the safety of the older plants and become an important factor in decision making in the current trend of extending the operating license period of the plants (e.g., in the U.S. from 40 years to 60 years, and even potentially to 80 years). The condition and performance of major aged NPP structures such as the containment contribute to the life span of a plant. A frequent misconception about such a low degradation rate of SPCs is that the degradation may not pose a significant risk to plant safety. However, under low-probability high-consequence initiating events, such as large earthquakes, SPCs that have slowly degraded over many years could potentially affect plant safety, and these effects need to be better understood. As part of the KAERI-BNL collaboration, a condensate storage tank (CST) was analyzed to estimate its seismic fragility capacities under various postulated degradation scenarios. CSTs were shown to have a significant impact on the seismic core damage frequency of a nuclear power plant. The seismic fragility capacity of the CST was developed

  1. Seismic margin review of the Maine Yankee Atomic Power Station: Fragility analysis

    SciTech Connect

    Ravindra, M. K.; Hardy, G. S.; Hashimoto, P. S.; Griffin, M. J.

    1987-03-01

    This Fragility Analysis is the third of three volumes for the Seismic Margin Review of the Maine Yankee Atomic Power Station. Volume 1 is the Summary Report of the first trial seismic margin review. Volume 2, Systems Analysis, documents the results of the systems screening for the review. The three volumes are part of the Seismic Margins Program initiated in 1984 by the Nuclear Regulatory Commission (NRC) to quantify seismic margins at nuclear power plants. The overall objectives of the trial review are to assess the seismic margins of a particular pressurized water reactor, and to test the adequacy of this review approach, quantification techniques, and guidelines for performing the review. Results from the trial review will be used to revise the seismic margin methodology and guidelines so that the NRC and industry can readily apply them to assess the inherent quantitative seismic capacity of nuclear power plants.

  2. [Human vulnerability under cosmetic surgery. A bioethic analysis].

    PubMed

    Ramos-Rocha de Viesca, Mariablanca

    2012-01-01

    Cosmetic surgery is one of the best examples of current health empowerment. Aesthetic surgical interventions have been criticized because they expose a healthy individual to unnecessary risk. In modern society the body has turned into a beauty repository with a commercial value. In published bioethics papers, analyses of the cosmetic-surgery problem have focused their attention on freedom, autonomy and distributive justice. Mexico ranks fifth in the world in the number of cosmetic surgeries. Vulnerability is an inherent condition of human existence and marks the limit of human dignity. UNESCO agrees that some populations are more prone to vulnerability. The aim of this work is to demonstrate that those who wish to make a physical change have yielded to social coercion and psychological problems.

  3. Vulnerabilities, Influences and Interaction Paths: Failure Data for Integrated System Risk Analysis

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Fleming, Land

    2006-01-01

    We describe graph-based analysis methods for identifying and analyzing cross-subsystem interaction risks from subsystem connectivity information. By discovering external and remote influences that would be otherwise unexpected, these methods can support better communication among subsystem designers at points of potential conflict and support the design of more dependable and diagnosable systems. These methods identify hazard causes that can impact vulnerable functions or entities if propagated across interaction paths from the hazard source to the vulnerable target. The analysis can also assess combined impacts of And-Or trees of disabling influences. The analysis can use ratings of hazards and vulnerabilities to calculate cumulative measures of their severity and importance. Identification of cross-subsystem hazard-vulnerability pairs and propagation paths across subsystems will increase coverage of hazard and risk analysis and can indicate risk control and protection strategies.

  4. Vulnerabilities, Influences and Interaction Paths: Failure Data for Integrated System Risk Analysis

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Fleming, Land

    2006-01-01

    We describe graph-based analysis methods for identifying and analyzing cross-subsystem interaction risks from subsystem connectivity information. By discovering external and remote influences that would be otherwise unexpected, these methods can support better communication among subsystem designers at points of potential conflict and support the design of more dependable and diagnosable systems. These methods identify hazard causes that can impact vulnerable functions or entities if propagated across interaction paths from the hazard source to the vulnerable target. The analysis can also assess combined impacts of And-Or trees of disabling influences. The analysis can use ratings of hazards and vulnerabilities to calculate cumulative measures of their severity and importance. Identification of cross-subsystem hazard-vulnerability pairs and propagation paths across subsystems will increase coverage of hazard and risk analysis and can indicate risk control and protection strategies.

  5. Detailed Analysis of the Interoccurrence Time Statistics in Seismic Activity

    NASA Astrophysics Data System (ADS)

    Tanaka, Hiroki; Aizawa, Yoji

    2017-02-01

    The interoccurrence time statistics of seismicity are studied theoretically as well as numerically by taking into account the conditional probability and the correlations among many earthquakes at different magnitude levels. It is known that the interoccurrence time statistics are well approximated by the Weibull distribution, but more detailed information about the interoccurrence times can be obtained from the analysis of the conditional probability. Firstly, we propose the Embedding Equation Theory (EET), in which the conditional probability is described by two kinds of correlation coefficients: one is the magnitude correlation and the other is the inter-event time correlation. Furthermore, the scaling law of each correlation coefficient is clearly determined from the numerical data analysis carried out with the Preliminary Determination of Epicenters (PDE) Catalog and the Japan Meteorological Agency (JMA) Catalog. Secondly, the EET is examined to derive the magnitude dependence of the interoccurrence time statistics, and the multi-fractal relation is successfully formulated. Theoretically we cannot prove the universality of the multi-fractal relation in seismic activity; nevertheless, the theoretical results reproduce all the numerical data in our analysis well, and several common features or invariant aspects are clearly observed. In particular, in the case of stationary ensembles the multi-fractal relation seems to obey an invariant curve, and in the case of non-stationary (moving-time) ensembles for the aftershock regime the multi-fractal relation seems to satisfy a certain invariant curve at any moving time. It is emphasized that the multi-fractal relation plays an important role in unifying the statistical laws of seismicity: the Gutenberg-Richter law and the Weibull distribution are unified in the multi-fractal relation, and some universality conjectures regarding seismicity are briefly discussed.
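
    As a small, self-contained illustration of the Weibull baseline mentioned above (not of the EET itself), the snippet fits a Weibull distribution to synthetic interoccurrence times; a real analysis would instead read interevent times from a catalog such as PDE or JMA and threshold by magnitude.

```python
import numpy as np
from scipy import stats

# Minimal illustration of the Weibull baseline for interoccurrence times,
# using synthetic data in place of a real PDE or JMA catalog.

rng = np.random.default_rng(2)
dt = rng.weibull(0.8, size=5000) * 10.0        # synthetic interevent times, days

shape, loc, scale = stats.weibull_min.fit(dt, floc=0.0)
print(f"fitted Weibull shape k = {shape:.2f}, scale = {scale:.2f} days")

# Kolmogorov-Smirnov check of the fitted model against the sample
d, p = stats.kstest(dt, "weibull_min", args=(shape, loc, scale))
print(f"KS statistic = {d:.3f}, p-value = {p:.2f}")
```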

  6. Kinematic Seismic Rupture Parameters from a Doppler Analysis

    NASA Astrophysics Data System (ADS)

    Caldeira, Bento; Bezzeghoud, Mourad; Borges, José F.

    2010-05-01

    The radiation emitted from extended seismic sources, mainly when the rupture spreads in preferred directions, presents spectral deviations as a function of the observation location. This aspect, absent for point sources and known as directivity, is manifested by an increase in the frequency and amplitude of seismic waves when the rupture propagates toward the seismic station and a decrease in the frequency and amplitude when it propagates in the opposite direction. The directivity model that supports the method is a Doppler analysis based on a kinematic source model of rupture and wave propagation through a structural medium with spherical symmetry [1]. A unilateral rupture can be viewed as a sequence of shocks produced along certain paths on the fault. According to this model, the seismic record at any point on the Earth's surface contains a signature of the rupture process that originated the recorded waveform. Calculating the rupture direction and velocity by a general Doppler equation (the goal of this work), using a dataset of common time-delays read from waveforms recorded at different distances around the epicenter, requires the normalization of measures to a standard value of slowness. This normalization involves a non-linear inversion that we solve numerically using an iterative least-squares approach. The performance of this technique was evaluated through a set of synthetic and real applications. We present the application of the method to four real case studies, the following earthquakes: Arequipa, Peru (Mw = 8.4, June 23, 2001); Denali, AK, USA (Mw = 7.8, November 3, 2002); Zemmouri-Boumerdes, Algeria (Mw = 6.8, May 21, 2003); and Sumatra, Indonesia (Mw = 9.3, December 26, 2004). The results obtained from the dataset of the four earthquakes agreed, in general, with the values presented by other authors using different methods and data. [1] Caldeira B., Bezzeghoud M, Borges JF, 2009; DIRDOP: a directivity approach to determining
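
    A rough sketch of the inversion idea, not the authors' DIRDOP code: for a unilateral rupture, the apparent duration observed at azimuth az can be modeled as tau = L/vr - (L/c)*cos(az - phi), and an iterative least-squares fit recovers length L, rupture velocity vr and direction phi. The station geometry, phase velocity c and noise level below are assumptions.

    ```python
    # Iterative least-squares recovery of rupture direction and velocity
    # from direction-dependent apparent durations (Doppler/directivity idea).
    import numpy as np
    from scipy.optimize import least_squares

    az = np.deg2rad(np.arange(0, 360, 30))       # station azimuths
    c = 4.0                                      # km/s, assumed phase velocity
    L_true, vr_true, phi_true = 80.0, 2.5, np.deg2rad(40)
    tau_obs = L_true / vr_true - (L_true / c) * np.cos(az - phi_true)
    tau_obs += np.random.default_rng(1).normal(0, 0.5, az.size)  # pick noise

    def residuals(p):
        L, vr, phi = p
        return L / vr - (L / c) * np.cos(az - phi) - tau_obs

    sol = least_squares(residuals, x0=[50.0, 3.0, 0.0])
    L, vr, phi = sol.x
    print(f"L ~ {L:.1f} km, vr ~ {vr:.2f} km/s, "
          f"rupture azimuth ~ {np.degrees(phi):.0f} deg")
    ```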

  7. Network similarity and statistical analysis of earthquake seismic data

    NASA Astrophysics Data System (ADS)

    Deyasi, Krishanu; Chakraborty, Abhijit; Banerjee, Anirban

    2017-09-01

    We study the structural similarity of earthquake networks constructed from seismic catalogs of different geographical regions. A hierarchical clustering of the underlying undirected earthquake networks is shown using the Jensen-Shannon divergence between graph spectra. The directed nature of links indicates that each earthquake network is strongly connected, which motivates us to study the directed version statistically. Our statistical analysis of each earthquake region identifies the hub regions. We calculate the conditional probability of forthcoming occurrences of earthquakes in each region. The conditional probability of each event has been compared with its stationary distribution.
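
    A minimal sketch of the spectral-comparison step under stated assumptions: each (undirected) network is summarized by the eigenvalue histogram of its Laplacian, pairwise Jensen-Shannon distances are computed, and the distance matrix feeds a hierarchical clustering. Random graphs stand in for regional catalogs.

    ```python
    # Jensen-Shannon comparison of graph spectra + hierarchical clustering.
    import numpy as np
    import networkx as nx
    from scipy.spatial.distance import jensenshannon, squareform
    from scipy.cluster.hierarchy import linkage

    def spectral_hist(G, bins):
        # Histogram of Laplacian eigenvalues, normalized to a distribution.
        eig = nx.laplacian_spectrum(G)
        h, _ = np.histogram(eig, bins=bins)
        return h / h.sum()

    # Placeholder networks: two sparse and two dense, so two clusters emerge.
    graphs = [nx.gnp_random_graph(60, p, seed=i)
              for i, p in enumerate([0.05, 0.06, 0.20, 0.22])]
    bins = np.linspace(0.0, 60.0, 30)
    hists = [spectral_hist(G, bins) for G in graphs]

    n = len(hists)
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            D[i, j] = D[j, i] = jensenshannon(hists[i], hists[j])

    Z = linkage(squareform(D), method="average")   # hierarchical clustering
    print(np.round(D, 3))
    ```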

  8. Near real-time seismic analysis using streaming software

    NASA Astrophysics Data System (ADS)

    Kropivnitskaya, Y. Y.; Qin, J.; Tiampo, K. F.; Bauer, M. A.

    2013-12-01

    Over the past several decades we have seen increases in the amount and quality of data available for use in the identification and analysis of natural hazards. Concern for the early warning of natural hazards has also stimulated research and development of new technologies for high-speed processing of this data in real-time or near real-time. Remote sensing data are widely used, including seismic data recorded by regional seismograph networks, GPS data and satellite imagery collected and archived from a number of different sources. For the purposes of warning systems, this data must be considered from the perspective of three dimensions - volume, processing speed and data diversity. A stream processing data-centric programming model, where the data can be viewed as a stream of inputs and analyzed by algorithms structured into data flows and where processing elements are pipelined and parallelized, allows us to take all three dimensions into account. Here we develop innovative algorithm techniques for the near real-time analysis of seismic network data, based on the InfoSphere Streams stream processing environment from IBM running on a cloud computing platform, to produce automated ground-shaking maps for large events. The ground motion parameters selected for display in the ground-shaking maps include peak ground acceleration (PGA), peak velocity (PGV), pseudo acceleration (PSA) amplitude at periods of 0.1s, 0.3s and 1.0s, as well as instrumentally derived felt-intensity. Our work focuses on testing the process of acquiring seismic data from various networks around the world. We use synthetic catalogs that incorporate the quality and statistics of historic catalogs as well as seismic networks both from well-instrumented regions (California, Japan) and from areas with sparser, shorter data sets (Canada). The catalogs have been tested in order to quantify time, to produce estimates of earthquake magnitude and location, and to determine the optimal level of pipelining
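
    The data-flow model described above can be illustrated with a toy generator pipeline (IBM InfoSphere Streams is not used here): acceleration packets stream through demeaning and running-peak stages, yielding a running PGA per station. Station names and packet sizes are invented.

    ```python
    # A generator-based stream pipeline computing running PGA per station.
    import numpy as np

    def packets(n_packets=50, sps=100):
        # Source stage: yields one-second synthetic acceleration packets.
        rng = np.random.default_rng(2)
        for _ in range(n_packets):
            yield "STA1", rng.normal(0.0, 0.02, sps)      # units of g

    def demean(stream):
        # Processing stage: remove the mean of each packet.
        for sta, acc in stream:
            yield sta, acc - acc.mean()

    def running_pga(stream):
        # Measurement stage: track the running peak per station.
        peak = {}
        for sta, acc in stream:
            peak[sta] = max(peak.get(sta, 0.0), float(np.abs(acc).max()))
            yield sta, peak[sta]

    for sta, pga in running_pga(demean(packets())):
        pass                                              # packets stream through
    print(f"{sta}: running PGA = {pga:.4f} g")
    ```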

  9. Detection, Measurement, Visualization, and Analysis of Seismic Crustal Deformation

    NASA Technical Reports Server (NTRS)

    Crippen, R.; Blom, R.

    1995-01-01

    Remote sensing plays a key role in the analysis of seismic crustal deformation. Recently, radar interferometry has been used to measure one dimension of the strain fields of earthquakes at a resolution of centimeters. Optical imagery is useful in measuring the strain fields in both geographic dimensions down to 1/20 of pixel size, and soon will be capable of higher resolution. Fault motion can also be detected by visual observation of imagery from space and of aerial photographs.

  11. Policies on Protecting Vulnerable People During Disasters in Iran: A Document Analysis

    PubMed Central

    Abbasi Dolatabadi, Zahra; Seyedin, Hesam; Aryankhesal, Aidin

    2016-01-01

    Context Developing official protection policies for disasters is a main strategy in protecting vulnerable people. The aim of this study was to analyze official documents concerning policies on protecting vulnerable people during disasters. Evidence Acquisition This study was conducted by the qualitative document analysis method. Documents were gathered by searching websites and referring to the organizations involved in disaster management. The documents were assessed by a researcher-made data collection form. A directed content analysis approach was used to analyze the retrieved documents regarding the protection policies and legislation for vulnerable people. Results A total of 22 documents were included in the final analysis. Most of the documents referred to women, children, elderly people, the poor, and villagers as vulnerable people. Moreover, the documents did not provide information regarding official measures for protecting vulnerable people during different phases of disaster management. Conclusions A clear and comprehensive definition of “vulnerable people” needs to be formulated, along with official policies to protect them. Given the high prevalence of disasters in Iran, policy makers need to develop effective context-based policies to protect vulnerable people during disasters. PMID:27921019

  12. Utilizing Semantic Big Data for realizing a National-scale Infrastructure Vulnerability Analysis System

    SciTech Connect

    Chinthavali, Supriya; Shankar, Mallikarjun

    2016-01-01

    Critical Infrastructure systems (CIs) such as energy, water, transportation and communication are highly interconnected and mutually dependent in complex ways. Robust modeling of CIs interconnections is crucial to identify vulnerabilities in the CIs. We present here a national-scale Infrastructure Vulnerability Analysis System (IVAS) vision leveraging Semantic Big Data (SBD) tools, Big Data, and Geographical Information Systems (GIS) tools. We survey existing approaches on vulnerability analysis of critical infrastructures and discuss relevant systems and tools aligned with our vision. Next, we present a generic system architecture and discuss challenges including: (1) constructing and managing a CI network-of-networks graph, (2) performing analytic operations at scale, and (3) interactive visualization of analytic output to generate meaningful insights. We argue that this architecture acts as a baseline to realize a national-scale network-based vulnerability analysis system.
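
    A minimal sketch of the network-of-networks idea under illustrative assumptions: two infrastructure layers joined by dependency edges, with betweenness centrality as a crude screen for structurally critical nodes. Node names and topology are invented.

    ```python
    # Interdependent critical-infrastructure graph with a simple criticality screen.
    import networkx as nx

    G = nx.Graph()
    G.add_edges_from([("p1", "p2"), ("p2", "p3"), ("p3", "p4")], layer="power")
    G.add_edges_from([("w1", "w2"), ("w2", "w3")], layer="water")
    # Cross-layer dependencies: pumps need power, plants need cooling water.
    G.add_edges_from([("p2", "w1"), ("w3", "p4")], layer="interdependency")

    centrality = nx.betweenness_centrality(G)
    for node, score in sorted(centrality.items(), key=lambda kv: -kv[1])[:3]:
        print(f"{node}: betweenness = {score:.3f}")
    ```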

  13. Seismic fragility analysis of buried steel piping at P, L, and K reactors

    SciTech Connect

    Wingo, H.E.

    1989-10-01

    Analysis of seismic strength of buried cooling water piping in reactor areas is necessary to evaluate the risk of reactor operation because seismic events could damage these buried pipes and cause loss of coolant accidents. This report documents analysis of the ability of this piping to withstand the combined effects of the propagation of seismic waves, the possibility that the piping may not behave in a completely ductile fashion, and the distortions caused by relative displacements of structures connected to the piping.

  14. Spectrum analysis of seismic surface waves and its applications in seismic landmine detection.

    PubMed

    Alam, Mubashir; McClellan, James H; Scott, Waymond R

    2007-03-01

    In geophysics, spectrum analysis of surface waves (SASW) refers to a noninvasive method for soil characterization. However, the term spectrum analysis can be used in a wider sense to mean a method for determining and identifying various modes of seismic surface waves and their properties such as velocity, polarization, etc. Surface waves travel along the free boundary of a medium and can be easily detected with a transducer placed on the free surface of the boundary. A new method based on vector processing of space-time data obtained from an array of triaxial sensors is proposed to produce high-resolution, multimodal spectra from surface waves. Then individual modes can be identified in the spectrum and reconstructed in the space-time domain; also, reflected waves can be separated easily from forward waves in the spectrum domain. This new SASW method can be used for detecting and locating landmines by analyzing the reflected waves for resonance. Processing examples are presented for numerically generated data, experimental data collected in a laboratory setting, and field data.
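
    A compact sketch of the multichannel spectrum idea, simplified to a single non-dispersive mode on a linear array of vertical components rather than full triaxial vector processing: a 2-D FFT maps the record section into the frequency-wavenumber plane, where the mode appears as a ridge whose slope gives the phase velocity. All geometry and wavelet parameters are assumptions.

    ```python
    # Frequency-wavenumber (f-k) spectrum of a synthetic surface-wave mode.
    import numpy as np

    dt, dx, nt, nx_ = 1e-3, 0.05, 1024, 48        # 1 kHz sampling, 5 cm spacing
    t = np.arange(nt) * dt
    x = np.arange(nx_) * dx
    v, f0 = 120.0, 50.0                           # assumed velocity, peak frequency

    # Synthetic single-mode section: a Ricker wavelet sweeping across the array.
    arg = np.pi * f0 * (t[None, :] - 0.05 - x[:, None] / v)
    data = (1.0 - 2.0 * arg**2) * np.exp(-(arg**2))

    spec = np.fft.fftshift(np.abs(np.fft.fft2(data)))   # axes: (wavenumber, freq)
    k = np.fft.fftshift(np.fft.fftfreq(nx_, dx))        # cycles per metre
    f = np.fft.fftshift(np.fft.fftfreq(nt, dt))         # Hz
    ik, if_ = np.unravel_index(np.argmax(spec), spec.shape)
    print(f"ridge at f = {f[if_]:.0f} Hz, k = {k[ik]:.2f} 1/m, "
          f"phase velocity ~ {abs(f[if_] / k[ik]):.0f} m/s")
    ```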

  15. Parental alienation syndrome. A developmental analysis of a vulnerable population.

    PubMed

    Price, J L; Pioske, K S

    1994-11-01

    1. Parental alienation syndrome is the systematic denigration by one parent of the other parent with the intent of alienating the child. 2. Parents who engage in alienating activity have experienced loss, leading to depression, anger, and aggression. The family system experiences loss during divorce and is adversely affected by the alienating activities of one parent. 3. Understanding the dynamics of parental alienation syndrome will position the nurse to recognize it as a symptom of depression and dependence, and bring care to the vulnerable population.

  16. Seismic vulnerability assessment of a steel-girder highway bridge equipped with different SMA wire-based smart elastomeric isolators

    NASA Astrophysics Data System (ADS)

    Hedayati Dezfuli, Farshad; Shahria Alam, M.

    2016-07-01

    Shape memory alloy wire-based rubber bearings (SMA-RBs) possess enhanced energy dissipation capacity and self-centering properties compared to conventional RBs. The performance of different types of SMA-RBs with different wire configurations has been studied in detail. However, their reliability in isolating structures has not been thoroughly investigated. The objective of this study is to analytically explore the effect of SMA-RBs on the seismic fragility of a highway bridge. Steel-reinforced elastomeric isolators are equipped with SMA wires and used to isolate the bridge. Results revealed that SMA wires with a superelastic behavior and re-centering capability can increase the reliability of the bearing and the bridge structure. It was observed that at the collapse level of damage, the bridge isolated by SMA-HDRB has the lowest fragility. Findings also showed that equipping an NRB with SMA wires decreases the possibility of damage in the bridge, while replacing HDRB with SMA-HDRB, or LRB with SMA-LRB, increases the failure probability of the system at slight, moderate, and extensive limit states.
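
    A minimal sketch of the fragility formulation such studies rest on: the probability of reaching a damage state is modeled as a lognormal CDF of the intensity measure, P(DS | IM) = Phi(ln(IM/theta)/beta). The median and dispersion values below are illustrative placeholders, not the paper's fitted parameters.

    ```python
    # Lognormal fragility curves evaluated over a range of PGA values.
    import numpy as np
    from scipy.stats import norm

    def fragility(im, theta, beta):
        """Probability of reaching a damage state given intensity measure im."""
        return norm.cdf(np.log(im / theta) / beta)

    pga = np.linspace(0.05, 2.0, 5)                    # g
    states = {"slight": (0.20, 0.6), "moderate": (0.45, 0.6),
              "extensive": (0.80, 0.6), "collapse": (1.30, 0.6)}
    for name, (theta, beta) in states.items():
        print(name, np.round(fragility(pga, theta, beta), 3))
    ```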

  17. Identifying typical patterns of vulnerability: A 5-step approach based on cluster analysis

    NASA Astrophysics Data System (ADS)

    Sietz, Diana; Lüdeke, Matthias; Kok, Marcel; Lucas, Paul; Walther, Carsten; Janssen, Peter

    2013-04-01

    Specific processes that shape the vulnerability of socio-ecological systems to climate, market and other stresses derive from diverse background conditions. Within the multitude of vulnerability-creating mechanisms, distinct processes recur in various regions, inspiring research on typical patterns of vulnerability. The vulnerability patterns display typical combinations of the natural and socio-economic properties that shape a system's vulnerability to particular stresses. Based on the identification of a limited number of vulnerability patterns, pattern analysis provides an efficient approach to improving our understanding of vulnerability and decision-making for vulnerability reduction. However, current pattern analyses often miss explicit descriptions of their methods and pay insufficient attention to the validity of their groupings. Therefore, the question arises as to how we identify typical vulnerability patterns in order to enhance our understanding of a system's vulnerability to stresses. A cluster-based pattern recognition applied at global and local levels is scrutinised with a focus on an applicable methodology and practicable insights. Taking the example of drylands, this presentation demonstrates the conditions necessary to identify typical vulnerability patterns. They are summarised in five methodological steps comprising the elicitation of relevant cause-effect hypotheses and the quantitative indication of mechanisms as well as an evaluation of robustness, a validation and a ranking of the identified patterns. Reflecting scale-dependent opportunities, a global study is able to support decision-making with insights into the up-scaling of interventions when available funds are limited. In contrast, local investigations encourage an outcome-based validation. This constitutes a crucial step in establishing the credibility of the patterns and hence their suitability for informing extension services and individual decisions. In this respect, working at
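
    A minimal sketch of the clustering step, with the five-step methodology reduced to its core: k-means on standardized vulnerability indicators, with silhouette scores as a basic robustness check on the number of clusters. The indicator matrix is synthetic.

    ```python
    # k-means pattern recognition on standardized vulnerability indicators.
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.metrics import silhouette_score
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(3)
    # Columns could be e.g. water stress, soil degradation, poverty, market access.
    X = np.vstack([rng.normal(m, 0.3, (50, 4)) for m in (0.0, 1.5, 3.0)])
    Xs = StandardScaler().fit_transform(X)

    # Robustness check: silhouette score across candidate cluster counts.
    for k in range(2, 6):
        labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(Xs)
        print(k, round(silhouette_score(Xs, labels), 3))
    ```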

  18. Seismic analysis of wind turbines in the time domain

    NASA Astrophysics Data System (ADS)

    Witcher, D.

    2005-01-01

    The analysis of wind turbine loading associated with earthquakes is clearly important when designing for and assessing the feasibility of wind farms in seismically active regions. The approach taken for such analysis is generally based on codified methods which have been developed for the assessment of seismic loads acting on buildings. These methods are not able to deal properly with the aeroelastic interaction of the dynamic motion of the wind turbine structure with either the wind loading acting on the rotor blades or the response of the turbine controller. This article presents an alternative approach, which is to undertake the calculation in the time domain. In this case a full aeroelastic model of the wind turbine subject to turbulent wind loading is further excited by ground motion corresponding to the earthquake. This capability has been introduced to the GH Bladed wind turbine simulation package. The software can be used to compute the combined wind and earthquake loading of a wind turbine given a definition of the external conditions for an appropriate series of load cases. This article discusses the method and presents example results.

  19. Uncertainty analysis for seismic hazard in Northern and Central Italy

    USGS Publications Warehouse

    Lombardi, A.M.; Akinci, A.; Malagnini, L.; Mueller, C.S.

    2005-01-01

    In this study we examine uncertainty and parametric sensitivity of Peak Ground Acceleration (PGA) and 1-Hz Spectral Acceleration (1-Hz SA) in probabilistic seismic hazard maps (10% probability of exceedance in 50 years) of Northern and Central Italy. The uncertainty in hazard is estimated using a Monte Carlo approach to randomly sample a logic tree that has three input-variable branch points representing alternative values for b-value, maximum magnitude (Mmax) and attenuation relationships. Uncertainty is expressed in terms of the 95% confidence band and the Coefficient Of Variation (COV). The overall variability of ground motions and their sensitivity to each parameter of the logic tree are investigated. The largest values of the overall 95% confidence band are around 0.15 g for PGA in the Friuli and Northern Apennines regions and around 0.35 g for 1-Hz SA in the Central Apennines. The sensitivity analysis shows that the largest contributor to seismic hazard variability is uncertainty in the choice of ground-motion attenuation relationships, especially in the Friuli region (≈0.10 g) for PGA and in the Friuli and Central Apennines regions (≈0.15 g) for 1-Hz SA. This is followed by the variability of the b-value: its main contribution is evident in the Friuli and Central Apennines regions for both 1-Hz SA (≈0.15 g) and PGA (≈0.10 g). We observe that the contribution of Mmax to seismic hazard variability is negligible, at least for the 10% in 50 years hazard. The overall COV map for PGA shows that the uncertainty in the hazard is larger in the Friuli and Northern Apennines regions, around 20-30%, than in the Central Apennines and Northwestern Italy, around 10-20%. The overall uncertainty is larger for the 1-Hz SA map and reaches 50-60% in the Central Apennines and Western Alps.
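
    A minimal sketch of the Monte Carlo logic-tree sampling: each draw selects a b-value, an Mmax and an attenuation model, a hazard value is computed, and the 95% band and COV are read from the sample. The toy hazard function is an illustrative stand-in for a real PSHA computation.

    ```python
    # Monte Carlo sampling of a three-branch-point logic tree.
    import numpy as np

    rng = np.random.default_rng(4)
    b_vals = [0.9, 1.0, 1.1]
    mmax_vals = [6.5, 7.0, 7.5]
    gmpes = {"gmpeA": 1.00, "gmpeB": 1.25, "gmpeC": 0.85}  # scale factors

    def toy_pga(b, mmax, gmpe_scale):
        # Placeholder hazard model: higher Mmax and lower b push hazard up.
        return gmpe_scale * 0.12 * (mmax / 7.0) ** 2 / b

    samples = np.array([
        toy_pga(rng.choice(b_vals), rng.choice(mmax_vals),
                gmpes[rng.choice(list(gmpes))])
        for _ in range(10000)
    ])
    lo, hi = np.percentile(samples, [2.5, 97.5])
    cov = samples.std() / samples.mean()
    print(f"95% band: {lo:.3f}-{hi:.3f} g, COV = {cov:.2f}")
    ```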

  20. MSNoise: A framework for Continuous Seismic Noise Analysis

    NASA Astrophysics Data System (ADS)

    Lecocq, Thomas; Caudron, Corentin; De Plaen, Raphaël; Mordret, Aurélien

    2016-04-01

    MSNoise is an open and free Python package known to be the only complete integrated workflow designed to analyse ambient seismic noise and study relative velocity changes (dv/v) in the crust. It is based on state-of-the-art and well-maintained Python modules, among which ObsPy plays an important role. To our knowledge, it is officially used for continuous monitoring in at least three notable places: the Observatory of the Piton de la Fournaise volcano (OVPF, France), the Auckland Volcanic Field (New Zealand) and on the South Napa earthquake (Berkeley, USA). It is also used by many researchers to process archive data to focus e.g. on fault zones, intraplate Europe, geothermal exploitations or Antarctica. We first present the general working of MSNoise, originally written in 2010 to automatically scan data archives and process seismic data in order to produce dv/v time series. We demonstrate that its modularity provides a new potential to easily test new algorithms for each processing step. For example, one could experiment with new methods of cross-correlation (done by default in the frequency domain), stacking (default is linear stacking, averaging), or dv/v estimation (default is moving-window cross-spectrum "MWCS", so-called "doublet"), etc. We present the last major evolution of MSNoise from a "single workflow: data archive to dv/v" to a framework system that allows plugins and modules to be developed and integrated into the MSNoise ecosystem. Small-scale plugins will be shown as examples, such as "continuous PPSD" (à la McNamara & Buland) or "Seismic Amplitude Ratio Analysis" (Taisne, Caudron). We will also present the new MSNoise-TOMO package, using MSNoise as a "cross-correlation" toolbox and demystifying surface wave tomography! Finally, the poster will be a meeting point for all those using or willing to use MSNoise, to meet the developer, exchange ideas and wishes!
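
    A minimal sketch of the default cross-correlation step named above, performed in the frequency domain on synthetic traces; real processing would go through MSNoise and ObsPy rather than this bare operation, and the trace length and lag are invented.

    ```python
    # Frequency-domain cross-correlation of two noise traces to recover a lag.
    import numpy as np

    rng = np.random.default_rng(5)
    n, lag_true = 86400, 25                       # 1-day traces at 1 Hz, 25 s shift
    source = rng.normal(size=n)
    a = source
    b = np.roll(source, lag_true) + 0.5 * rng.normal(size=n)

    A, B = np.fft.rfft(a), np.fft.rfft(b)
    cc = np.fft.irfft(np.conj(A) * B)             # circular cross-correlation
    lag = int(np.argmax(cc))
    lag = lag - n if lag > n // 2 else lag        # unwrap negative lags
    print("measured lag:", lag, "samples")
    ```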

  1. DARHT Multi-intelligence Seismic and Acoustic Data Analysis

    SciTech Connect

    Stevens, Garrison Nicole; Van Buren, Kendra Lu; Hemez, Francois M.

    2016-07-21

    The purpose of this report is to document the analysis of seismic and acoustic data collected at the Dual-Axis Radiographic Hydrodynamic Test (DARHT) facility at Los Alamos National Laboratory for robust, multi-intelligence decision making. The data utilized herein is obtained from two tri-axial seismic sensors and three acoustic sensors, resulting in a total of nine data channels. The goal of this analysis is to develop a generalized, automated framework to determine internal operations at DARHT using informative features extracted from measurements collected external to the facility. Our framework involves four components: (1) feature extraction, (2) data fusion, (3) classification, and finally (4) robustness analysis. Two approaches are taken for extracting features from the data. The first of these, generic feature extraction, involves extraction of statistical features from the nine data channels. The second approach, event detection, identifies specific events relevant to traffic entering and leaving the facility as well as explosive activities at DARHT and nearby explosive testing sites. Event detection is completed using a two-stage method, first utilizing signatures in the frequency domain to identify outliers and second extracting short-duration events of interest among these outliers by evaluating residuals of an autoregressive exogenous time series model. Features extracted from each data set are then fused to perform analysis with a multi-intelligence paradigm, where information from multiple data sets is combined to generate more information than is available through analysis of each independently. The fused feature set is used to train a statistical classifier and predict the state of operations to inform a decision maker. We demonstrate this classification using both generic statistical features and event detection and provide a comparison of the two methods. Finally, the concept of decision robustness is presented through a preliminary analysis where
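
    A minimal sketch of the generic-feature path: simple statistics are computed per channel and concatenated into one fused vector per time window, ready for a classifier. The nine channels follow the abstract; the data and window length are synthetic.

    ```python
    # Per-channel statistical features fused into one classifier-ready vector.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(6)
    window = rng.normal(size=(9, 5000))          # 9 channels, one time window

    def channel_features(x):
        # Mean, spread, shape and peak statistics for one channel.
        return [x.mean(), x.std(), stats.skew(x), stats.kurtosis(x),
                np.abs(x).max()]

    fused = np.concatenate([channel_features(ch) for ch in window])
    print(fused.shape)                           # (45,) -> classifier input
    ```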

  2. Using multi-criteria decision analysis to assess the vulnerability of drinking water utilities.

    PubMed

    Joerin, Florent; Cool, Geneviève; Rodriguez, Manuel J; Gignac, Marc; Bouchard, Christian

    2010-07-01

    Outbreaks of microbiological waterborne disease have increased governmental concern regarding the importance of drinking water safety. Considering the multi-barrier approach to safe drinking water may improve management decisions to reduce contamination risks. However, the application of this approach must consider numerous and diverse kinds of information simultaneously. This makes it difficult for authorities to apply the approach to decision making. For this reason, multi-criteria decision analysis can be helpful in applying the multi-barrier approach to vulnerability assessment. The goal of this study is to propose an approach based on a multi-criteria analysis method in order to rank drinking water utilities (DWUs) based on their vulnerability to microbiological contamination. This approach is illustrated with an application carried out on 28 DWUs supplied by groundwater in the Province of Québec, Canada. The multi-criteria analysis method chosen is the Measuring Attractiveness by a Categorical Based Evaluation Technique (MACBETH) methodology, allowing the assessment of a microbiological vulnerability indicator (MVI) for each DWU. Results are presented on a scale ranking DWUs from least vulnerable to most vulnerable to contamination. MVI results are tested using a sensitivity analysis on barrier weights and they are also compared with historical data on contamination at the utilities. The investigation demonstrates that the MVI provides a good representation of the vulnerability of DWUs to microbiological contamination.
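
    A minimal sketch of the weighted multi-criteria aggregation idea, not the MACBETH software itself: each utility is scored against several barriers, the scores are combined with weights, and utilities are ranked by the resulting indicator. Utility names, barrier scores and weights are illustrative.

    ```python
    # Weighted multi-criteria ranking of drinking water utilities.
    import numpy as np

    utilities = ["DWU-a", "DWU-b", "DWU-c"]
    # Columns: source protection, treatment, distribution integrity, monitoring.
    scores = np.array([[0.8, 0.6, 0.7, 0.9],
                       [0.4, 0.5, 0.6, 0.3],
                       [0.6, 0.9, 0.5, 0.7]])
    weights = np.array([0.35, 0.30, 0.20, 0.15])

    mvi = 1.0 - scores @ weights                 # higher = more vulnerable
    for name, v in sorted(zip(utilities, mvi), key=lambda kv: -kv[1]):
        print(f"{name}: MVI = {v:.2f}")
    ```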

  3. Assessing the Climate Change Vulnerability of Physical Infrastructures through a Spatial Analysis

    NASA Astrophysics Data System (ADS)

    Myeong, S.

    2012-12-01

    Natural hazards can destroy or damage physical infrastructures and thus incur socioeconomic losses and threaten the safety of people. Therefore, identifying the vulnerability of a given society's physical infrastructure to climate change and developing appropriate adaptation measures are necessary. A recent trend of climate change vulnerability assessment has shifted its focus from the index-based assessment to the spatial analysis of the vulnerability to climate change in order to see the distribution of vulnerable areas. Although some research has been conducted on the US and Southwestern Asia, no formal research has been conducted on Korea that assessed the vulnerable areas in terms of spatial distribution. The current study attempts to see what types of vulnerability exist in what areas of the country through an analysis of data gathered from different sectors of Korea. Three domains, i.e., sensitivity, exposure, and adaptive capacity, were investigated, with subordinate component data under each domain, to assess the vulnerability of the country. The results showed that the vulnerability degree differs between coastal areas and inland areas. For most subordinate components, coastal areas were more vulnerable than inland areas. Within the inland areas, less urbanized areas were more sensitive to the climate change than more urbanized areas, while large metropolitan areas were exposed more to the climate change due to the density of physical infrastructures. Some southern areas of the country had greater adaptive capacity economically and institutionally; however, Seoul and its vicinity had greater adaptive capacity related to physical infrastructures. The study concludes that since damages from natural disasters such as floods and typhoons are becoming increasingly serious around the world as well as in Korea, it is necessary to develop appropriate measures for physical infrastructure to adapt to the climate change, customized to the specific needs of different

  4. Probabilistic Seismic Hazard Analysis for Southern California Coastal Facilities

    SciTech Connect

    Savy, J; Foxall, B

    2004-04-16

    The overall objective of this study was to develop probabilistic seismic hazard estimates for the coastal and offshore areas of Ventura, Los Angeles and Orange counties for use as a basis for the University of Southern California (USC) to develop physical models of tsunami for the coastal regions and for the California State Lands Commission (SLC) to develop regulatory standards for seismic loading and liquefaction evaluation of marine oil terminals. The probabilistic seismic hazard analysis (PSHA) was carried out by the Lawrence Livermore National Laboratory (LLNL), in several phases over a time period of two years, following the method developed by LLNL for the estimation of seismic hazards at Department Of Energy (DOE) facilities and for 69 locations of nuclear plants in the Eastern United States for the Nuclear Regulatory Commission (NRC). This method consists of making maximum use of all physical data (qualitative and quantitative) and characterizing the uncertainties by using a set of alternate spatiotemporal models of occurrence of future earthquakes, as described in the SSHAC PSHA Guidance Document (Budnitz et al., 1997), and implemented for the NRC (Savy et al., 2002). In general, estimation of seismic hazard is based not only on our understanding of the regional tectonics and detailed characterization of the faults in the area but also on the analysis methods employed and the types of physical and empirical models that are deemed appropriate for the analysis. To develop this understanding, the body of knowledge in the scientific community is sampled in a series of workshops with a group of experts representative of the entire scientific community, including geologists and seismologists from the United States Geological Survey (USGS), members of the Southern California Earthquake Center (SCEC), members of academic institutions (University of California Santa Cruz, Stanford, UC Santa Barbara, and University of Southern California), and members of

  5. Latest development in seismic texture analysis for subsurface structure, facies, and reservoir characterization: A review

    SciTech Connect

    Gao, Dengliang

    2011-03-01

    In exploration geology and geophysics, seismic texture is still a developing concept that is not yet sufficiently well known, although quite a number of different algorithms have been published in the literature. This paper provides a review of seismic texture concepts and methodologies, focusing on the latest developments in seismic amplitude texture analysis, with particular reference to the gray level co-occurrence matrix (GLCM) and the texture model regression (TMR) methods. The GLCM method evaluates spatial arrangements of amplitude samples within an analysis window using a matrix (a two-dimensional histogram) of amplitude co-occurrence. The matrix is then transformed into a suite of texture attributes, such as homogeneity, contrast, and randomness, which provide the basis for seismic facies classification. The TMR method uses a texture model as a reference to discriminate among seismic features based on a linear, least-squares regression analysis between the model and the data within an analysis window. By implementing customized texture model schemes, the TMR algorithm has the flexibility to characterize subsurface geology for different purposes. A texture model with a constant phase is effective at enhancing the visibility of seismic structural fabrics, a texture model with a variable phase is helpful for visualizing seismic facies, and a texture model with variable amplitude, frequency, and size is instrumental in calibrating seismic to reservoir properties. Preliminary test case studies in the very recent past have indicated that the latest developments in seismic texture analysis have added to the existing amplitude interpretation theories and methodologies. These and future developments in seismic texture theory and methodologies will hopefully lead to a better understanding of the geologic implications of the seismic texture concept and to an improved geologic interpretation of reflection seismic amplitude.
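
    A minimal sketch of the GLCM step using scikit-image (the gray* spelling of the API in versions >= 0.19) on a synthetic quantized amplitude window: build the co-occurrence matrix, then read off homogeneity and contrast attributes.

    ```python
    # Gray level co-occurrence matrix and texture attributes with scikit-image.
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    rng = np.random.default_rng(7)
    window = (rng.random((64, 64)) * 16).astype(np.uint8)   # quantized amplitudes

    glcm = graycomatrix(window, distances=[1], angles=[0, np.pi / 2],
                        levels=16, symmetric=True, normed=True)
    print("homogeneity:", graycoprops(glcm, "homogeneity").ravel())
    print("contrast:   ", graycoprops(glcm, "contrast").ravel())
    ```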

  6. Requalification analysis of a circular composite slab for seismic load

    SciTech Connect

    Srinivasan, M.G.; Kot, C.A.

    1992-11-01

    The circular roof slab of an existing facility was analyzed to requalify the structure for supporting a significant seismic load that it was not originally designed for. The slab has a clear span of 66 ft and consists of a 48 in thick reinforced concrete member and a steel liner plate. Besides a number of smaller penetrations, the slab contains two significant cutouts: a 9 ft square opening and a 3 ft dia hole. The issues that complicated the analysis of this non-typical structure, i.e., composite action and nonlinear stiffness of reinforced concrete (R. C.) sections, are discussed. It was possible to circumvent the difficulties by making conservative and simplifying assumptions. If codes incorporate guidelines on practical methods for dynamic analysis of R. C. structures, some of the unneeded conservatism could be eliminated in future designs.

  7. Probabilistic Seismic Hazard Analysis: Adaptation for CO2 Sequestration Sites

    NASA Astrophysics Data System (ADS)

    Vasudevan, K.; Eaton, D. W.

    2011-12-01

    Large-scale sequestration of CO2 in depleted oil and gas fields in sedimentary basins such as the Western Canada Sedimentary Basin (WCSB) and in particular, central Alberta, should consider, among other safety and risk issues, a seismic hazard analysis that would include potential ground motions induced by earthquakes. The region is juxtaposed to major tectonically active seismogenic zones such as the Cascadia Subduction Zone, the Queen Charlotte Fault Zone, and the northern Cordillera region. Hazards associated with large-scale storage from strong ground motions caused by large-magnitude earthquakes along the west coast of Canada, and/or medium-to-large magnitude earthquakes triggered by such earthquakes in the neighbourhood of the storage site, must be clearly understood. To this end, stochastic modeling of the accelerograms recorded during large magnitude earthquakes in western Canada has been undertaken. A lack of recorded accelerograms and the absence of a catalogue of ground-motion prediction equations similar to the Next Generation Attenuation (NGA) database, however, hamper such analysis for the WCSB. In order to generate our own database of ground-motions for probabilistic seismic hazard analysis, we employ a site-based stochastic simulation approach. We use it to simulate three-component ground-motion accelerograms recorded during the November 3, 2002 Denali earthquake to mimic the Queen Charlotte Fault earthquakes. To represent a Cascadia megathrust earthquake, we consider three-component strong-motion accelerograms recorded during the March 11, 2011 Tohoku earthquake in Japan. Finally, to simulate an event comparable to the thrust-style Kinbasket Lake earthquake of 1908, we use three-component ground-motion accelerograms recorded during the 1985 Nahanni earthquake and the 2004 Chuetsu earthquake. Here, we develop predictive equations for the stochastic model parameters that describe ground motions in terms of earthquake and site characteristics such as
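
    A minimal sketch in the spirit of site-based stochastic simulation (cf. Boore's stochastic method, heavily simplified): window Gaussian noise, impose an omega-squared spectral shape while keeping the random phase, and transform back to an accelerogram. The corner frequency, envelope and units are assumptions.

    ```python
    # Stochastic accelerogram: enveloped noise shaped by a Brune-type spectrum.
    import numpy as np

    dt, n = 0.01, 4096
    t = np.arange(n) * dt
    rng = np.random.default_rng(8)

    # Windowed Gaussian noise: random-phase carrier under a smooth envelope.
    envelope = (t / 5.0) * np.exp(1.0 - t / 5.0)
    noise = rng.normal(size=n) * envelope

    # Impose an omega-squared (Brune) acceleration spectrum on the noise.
    spec = np.fft.rfft(noise)
    f = np.fft.rfftfreq(n, dt)
    fc = 0.4                                     # assumed corner frequency, Hz
    target = (2 * np.pi * f) ** 2 / (1.0 + (f / fc) ** 2)
    amp = np.abs(spec)
    amp[amp == 0.0] = 1.0                        # guard against division by zero
    acc = np.fft.irfft(spec / amp * target)      # random phase, target shape

    print("peak acceleration (arbitrary units):",
          round(float(np.abs(acc).max()), 2))
    ```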

  8. Seismic Analysis of Three Bomb Explosions in Turkey

    NASA Astrophysics Data System (ADS)

    Necmioglu, O.; Semin, K. U.; Kocak, S.; Destici, C.; Teoman, U.; Ozel, N. M.

    2016-12-01

    Seismic analysis of three vehicle-installed bomb explosions that occurred on 13 March 2016 in Ankara, 12 May 2016 in Diyarbakır and 9 July 2016 in Mardin has been conducted using data from the nearest stations (LOD, DYBB and MAZI) of the Boğaziçi University - Kandilli Observatory and Earthquake Research Institute's (KOERI) seismic network and compared with low-magnitude earthquakes at similar distances based on phase readings and frequency content. Amplitude spectra have been compared through Fourier transformation, and earthquake-explosion frequency discrimination has been performed using various filter bands. Time-domain and spectral analyses have been performed using the Geotool software provided by the CTBTO. Local magnitude (ML) values have been calculated for each explosion by removing the instrument response and applying a Wood-Anderson-type instrument response. The approximate amounts of explosives used in these explosions have been determined using the empirical methods of Koper (2002). Preliminary results indicated that 16 tons of TNT-equivalent explosives were used in the 12 May 2016 Diyarbakır explosion, which is very much in accordance with the media reports claiming 15 tons of TNT. Our analysis for the 9 July 2016 Mardin explosion matched the reported 5 tons of explosives. Results concerning the 13 March 2016 Ankara explosion indicated that approximately 1.7 tons of TNT-equivalent explosives were used in the attack, whereas security and intelligence reports claimed 300 kg of explosives as a combination of TNT, RDX and ammonium nitrate. The overestimated results obtained in our analysis for the Ankara explosion may be due to (i) the high relative effectiveness factor of the RDX component of the explosive, (ii) inefficiency of the Koper (2002) method at lower yields (since the method was developed using explosions with yields of 3-12 tons of TNT), or (iii) a combination of both.
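
    Empirical yield estimates of this family are typically log-linear in magnitude, ML = a + b*log10(W); a minimal sketch of the corresponding inversion follows. The coefficients below are hypothetical placeholders, not Koper's (2002) published values; they only illustrate the form W = 10**((ML - a)/b).

    ```python
    # Hypothetical log-linear magnitude-to-yield inversion (coefficients invented).
    a, b = 2.0, 0.7          # placeholder calibration constants, NOT from Koper (2002)

    def yield_tons(ml):
        return 10 ** ((ml - a) / b)

    for ml in (2.1, 2.5, 2.9):
        print(f"ML {ml}: ~{yield_tons(ml):.1f} t TNT equivalent")
    ```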

  9. The Application of Urban System Analysis To The Seismic Risk Assessment of Barcelona

    NASA Astrophysics Data System (ADS)

    Irizarry, J.; Marturia, J.; Mena, U.

    The Urban System Analysis methodology, developed by an international team directed by the BRGM (France) within the GEMITIS research program (1996-1999), consists of a global and integrated risk reduction strategy intended to improve the effectiveness of risk assessment through the use of a geographical information system (GIS). A typical seismic risk analysis evaluates only the direct impacts of the earthquake, but gives little information about how to improve preventive effectiveness because it does not provide any data about the indirect consequences of an earthquake. This new methodology proposes to complete the scenario generation process by focusing physical vulnerability assessment on the elements essential for urban functioning. Besides, it gives rational bases for the definition of appropriate risk management and preventive action plans. This study presents the first stage of the application of the Urban System Analysis method to the city of Barcelona. The elements at risk in the city, such as the residential, commercial, administrative, and industrial areas, are identified and analyzed to define the weak points within its urban system. The most critical elements within the city's urban system are then identified to establish priorities for risk management plans in the city.

  10. Multi-hole seismic modeling in 3-D space and cross-hole seismic tomography analysis for boulder detection

    NASA Astrophysics Data System (ADS)

    Cheng, Fei; Liu, Jiangping; Wang, Jing; Zong, Yuquan; Yu, Mingyu

    2016-11-01

    A boulder stone, a common geological feature in south China, refers to a remnant of a granite body that has been unevenly weathered. Undetected boulders could adversely impact the schedule and safety of subway construction when using the tunnel boring machine (TBM) method. Therefore, boulder detection has always been a key issue that must be solved before construction. Cross-hole seismic tomography is a high-resolution technique capable of boulder detection; however, the method can only solve for velocity in a 2-D slice between two wells, and the size and central position of the boulder are generally difficult to obtain accurately. In this paper, the authors conduct a multi-hole wave field simulation and characteristic analysis of a boulder model based on 3-D elastic wave staggered-grid finite difference theory, as well as a 2-D imaging analysis based on first-arrival travel time. The results indicate that (1) full wave field records could be obtained from multi-hole seismic wave simulations; the simulation results describe the seismic wave propagation pattern around cross-hole high-velocity spherical geological bodies in detail and can serve as a basis for the wave field analysis. (2) When a cross-hole seismic section cuts through the boulder, the proposed method provides satisfactory cross-hole tomography results; however, when the section is positioned close to the boulder, such a high-velocity object in 3-D space impacts the surrounding wave field. The received diffracted wave interferes with the primary wave, and consequently the picked first-arrival travel time is not derived from the profile, which results in a false appearance of high-velocity geological features. Finally, the results of 2-D analysis in 3-D modeling space are comparatively analyzed against the physical model test vis-a-vis the effect of a high-velocity body on the seismic tomographic measurements.

  11. Assessing the Performance of a Classification-Based Vulnerability Analysis Model.

    PubMed

    Wang, Tai-ran; Mousseau, Vincent; Pedroni, Nicola; Zio, Enrico

    2015-09-01

    In this article, a classification model based on the majority rule sorting (MR-Sort) method is employed to evaluate the vulnerability of safety-critical systems with respect to malevolent intentional acts. The model is built on the basis of a (limited-size) set of data representing (a priori known) vulnerability classification examples. The empirical construction of the classification model introduces a source of uncertainty into the vulnerability analysis process: a quantitative assessment of the performance of the classification model (in terms of accuracy and confidence in the assignments) is thus in order. Three different approaches are here considered to this aim: (i) a model-retrieval-based approach, (ii) the bootstrap method, and (iii) the leave-one-out cross-validation technique. The analyses are presented with reference to an exemplificative case study involving the vulnerability assessment of nuclear power plants.
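
    A minimal sketch of the bootstrap assessment (approach ii): resample the small labeled set with replacement, refit, and score on the out-of-bag examples to obtain an accuracy distribution. A logistic regression stands in for the MR-Sort model, and the data are synthetic.

    ```python
    # Bootstrap estimate of classifier accuracy on a limited-size labeled set.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.datasets import make_classification

    X, y = make_classification(n_samples=60, n_features=5, random_state=0)
    rng = np.random.default_rng(9)
    scores = []
    for _ in range(200):
        idx = rng.integers(0, len(y), len(y))            # resample with replacement
        oob = np.setdiff1d(np.arange(len(y)), idx)       # out-of-bag indices
        if oob.size == 0 or len(np.unique(y[idx])) < 2:
            continue                                     # skip degenerate resamples
        clf = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
        scores.append(clf.score(X[oob], y[oob]))
    print(f"bootstrap accuracy: {np.mean(scores):.2f} +/- {np.std(scores):.2f}")
    ```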

  12. Livelihood security, vulnerability and resilience: a historical analysis of Chibuene, southern Mozambique.

    PubMed

    Ekblom, Anneli

    2012-07-01

    A sustainable livelihood framework is used to analyse livelihood security, vulnerability and resilience in the village of Chibuene, Vilanculos, southern Mozambique from a historical and contemporary perspective. Interviews, assessments, archaeology, palaeoecology and written sources are used to address tangible and intangible aspects of livelihood security. The analysis shows that livelihood strategies for building resilience, diversification of resource use, social networks and trade have long historical continuities. Vulnerability is contingent on historical processes such as long-term socio-environmental insecurity and the resultant biodiversity loss. These contingencies affect the social capacity to cope with vulnerability in the present. The study concludes that contingency and the extent and strength of social networks should be added as factors in livelihood assessments. Furthermore, policies for mitigating vulnerability must build on the reality of environmental insecurity and strengthen local structures that diversify and spread risk.

  13. Exploring drought vulnerability in Africa: an indicator based analysis to inform early warning systems

    NASA Astrophysics Data System (ADS)

    Naumann, G.; Barbosa, P.; Garrote, L.; Iglesias, A.; Vogt, J.

    2013-10-01

    Drought vulnerability is a complex concept that includes both biophysical and socio-economic drivers of drought impact that determine the capacity to cope with drought. In order to develop an efficient drought early warning system and to be prepared to mitigate upcoming drought events, it is important to understand the drought vulnerability of the affected regions. We propose a composite Drought Vulnerability Indicator (DVI) that reflects different aspects of drought vulnerability evaluated at the Pan-African level in four components: the renewable natural capital, the economic capacity, the human and civic resources, and the infrastructure and technology. The selection of variables and weights reflects the assumption that a society with institutional capacity and coordination, as well as with mechanisms for public participation, is less vulnerable to drought; furthermore, we consider that agriculture is only one of the many sectors affected by drought. The quality and accuracy of a composite indicator depend on the theoretical framework, on the data collection and quality, and on how the different components are aggregated. This kind of approach can lead to some degree of scepticism; to overcome this problem a sensitivity analysis was done in order to measure the degree of uncertainty associated with the construction of the composite indicator. Although the proposed drought vulnerability indicator relies on a number of theoretical assumptions and some degree of subjectivity, the sensitivity analysis showed that it is a robust indicator and hence capable of representing the complex processes that lead to drought vulnerability. According to the DVI computed at the country level, the African countries classified with higher relative vulnerability are Somalia, Burundi, Niger, Ethiopia, Mali and Chad. The analysis of the renewable natural capital component at the sub-basin level shows that the basins with high to moderate drought vulnerability can be subdivided in three main different

  14. Flexible Software Architecture for Visualization and Seismic Data Analysis

    NASA Astrophysics Data System (ADS)

    Petunin, S.; Pavlov, I.; Mogilenskikh, D.; Podzyuban, D.; Arkhipov, A.; Baturuin, N.; Lisin, A.; Smith, A.; Rivers, W.; Harben, P.

    2007-12-01

    Research in the field of seismology requires software and signal processing utilities for seismogram manipulation and analysis. Seismologists and data analysts often encounter a major problem in the use of any particular software application specific to seismic data analysis: the tuning of commands and windows to the specific waveforms and hot key combinations so as to fit their familiar informational environment. The ability to modify the user's interface independently from the developer requires an adaptive code structure. An adaptive code structure also allows for expansion of software capabilities such as new signal processing modules and implementation of more efficient algorithms. Our approach is to use a flexible "open" architecture for development of geophysical software. This report presents an integrated solution for organizing a logical software architecture based on the Unix version of the Geotool software implemented on the Microsoft .NET 2.0 platform. Selection of this platform greatly expands the variety and number of computers that can implement the software, including laptops that can be utilized in field conditions. It also facilitates implementation of communication functions for seismic data requests from remote databases through the Internet. The main principle of the new architecture for Geotool is that scientists should be able to add new routines for digital waveform analysis via software plug-ins that utilize the basic Geotool display for GUI interaction. The use of plug-ins allows the efficient integration of diverse signal-processing software, including software still in preliminary development, into an organized platform without changing the fundamental structure of that platform itself. An analyst's use of Geotool is tracked via a metadata file so that future studies can reconstruct, and alter, the original signal processing operations. The work has been completed in the framework of a joint Russian-American project.
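
    A minimal sketch of the plug-in pattern described above, not the actual Geotool code: processing modules register themselves in a table, and the host application discovers and runs them without being modified. All names are illustrative.

    ```python
    # A tiny plug-in registry: modules self-register, the host iterates over them.
    from typing import Callable, Dict
    import numpy as np

    PLUGINS: Dict[str, Callable] = {}

    def register(name: str):
        # Decorator that adds a processing routine to the registry.
        def wrap(func: Callable):
            PLUGINS[name] = func
            return func
        return wrap

    @register("demean")
    def demean(trace):
        return trace - trace.mean()

    @register("normalize")
    def normalize(trace):
        return trace / np.abs(trace).max()

    trace = np.array([1.0, 3.0, 5.0])
    for name, func in PLUGINS.items():       # host runs plug-ins unchanged
        trace = func(trace)
    print(trace)
    ```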

  15. Why is Probabilistic Seismic Hazard Analysis (PSHA) still used?

    NASA Astrophysics Data System (ADS)

    Mulargia, Francesco; Stark, Philip B.; Geller, Robert J.

    2017-03-01

    Even though it has never been validated by objective testing, Probabilistic Seismic Hazard Analysis (PSHA) has been widely used for almost 50 years by governments and industry in applications with lives and property hanging in the balance, such as deciding safety criteria for nuclear power plants, making official national hazard maps, developing building code requirements, and determining earthquake insurance rates. PSHA rests on assumptions now known to conflict with earthquake physics; many damaging earthquakes, including the 1988 Spitak, Armenia, event and the 2011 Tohoku, Japan, event, have occurred in regions rated relatively low-risk by PSHA hazard maps. No extant method, including PSHA, produces reliable estimates of seismic hazard. Earthquake hazard mitigation should be recognized to be inherently political, involving a tradeoff between uncertain costs and uncertain risks. Earthquake scientists, engineers, and risk managers can make important contributions to the hard problem of allocating limited resources wisely, but government officials and stakeholders must take responsibility for the risks of accidents due to natural events that exceed the adopted safety criteria. "Without an analysis of the physical causes of recorded floods, and of the whole geophysical, biophysical and anthropogenic context which circumscribes the potential for flood formation, results of flood frequency analysis as [now practiced], rather than providing information useful for coping with the flood hazard, themselves represent an additional hazard that can contribute to damages caused by floods. This danger is very real since decisions made on the basis of wrong numbers presented as good estimates of flood probabilities will generally be worse than decisions made with an awareness of an impossibility to make a good estimate and with the aid of merely qualitative information on the general flooding potential."

  16. Spatio-temporal earthquake risk assessment for the Lisbon Metropolitan Area - A contribution to improving standard methods of population exposure and vulnerability analysis

    NASA Astrophysics Data System (ADS)

    Freire, Sérgio; Aubrecht, Christoph

    2010-05-01

    The recent M 7.0 earthquake that caused severe damage and destruction in parts of Haiti struck close to 5 PM (local time), at a moment when many people were not in their residences, instead being in their workplaces, schools, or churches. Community vulnerability assessment to seismic hazard relying solely on the location and density of resident-based census population, as is commonly the case, would grossly misrepresent the real situation. In particular in the context of global (climate) change, risk analysis is a research field increasingly gaining in importance, where risk is usually defined as a function of hazard probability and vulnerability. Assessment and mapping of human vulnerability has, however, generally been lagging behind hazard analysis efforts. Central to the concept of vulnerability is the issue of human exposure. Analysis of exposure is often spatially tied to administrative units or reference objects such as buildings, spanning scales from the regional level to local studies for small areas. Due to human activities and mobility, the spatial distribution of population is time-dependent, especially in metropolitan areas. Accurately estimating population exposure is a key component of catastrophe loss modeling, one element of effective risk analysis and emergency management. Therefore, accounting for the spatio-temporal dynamics of human vulnerability correlates with recent recommendations to improve vulnerability analyses. Earthquakes are the prototype for a major disaster, being low-probability, rapid-onset, high-consequence events. Lisbon, Portugal, is subject to a high risk of earthquake, which can strike on any day and at any time, as confirmed by modern history (e.g. December 2009). The recently-approved Special Emergency and Civil Protection Plan (PEERS) is based on a Seismic Intensity map, and only contemplates resident population from the census as a proxy for human exposure. In the present work we map and analyze the spatio-temporal distribution of

  17. Analysis of Vulnerability Around The Colima Volcano, MEXICO

    NASA Astrophysics Data System (ADS)

    Carlos, S. P.

    2001-12-01

    The Colima volcano, located in the western Trans-Mexican Volcanic Belt, in the central portion of the Colima Rift Zone between the Mexican states of Jalisco and Colima, has presented new activity since January 1998, which has been characterized by two stages: the first was an effusive phase that began on 20 November 1998 and finished by the middle of January 1999. On February 10, 1999, a great explosion at the summit marked the beginning of an explosive phase; these facts imply that the eruptive process changed from an effusive model to an explosive one. Suárez-Plascencia et al., 2000, present hazard maps for ballistic projectiles, ashfalls and lahars for this scenario. This work presents the evaluation of the vulnerability in the areas identified as hazardous in the maps for ballistics, ashfalls and lahars, based on the economic elements located in the middle and lower sections of the volcanic edifice, such as agriculture, forestry, agroindustries and communication lines (highways, power, telephone, railroad, etc.). The method is based on Geographic Information Systems, using digital cartography at scale 1:50,000, digital orthophotos from the Instituto Nacional de Estadística, Geografía e Informática, and SPOT and Landsat satellite images from 1997 and 2000 in bands 1, 2 and 3. The land use maps obtained for 1997 and 2000 were compared with the land use map reported by Suárez in 1992; from these maps an increase of 5 percent in the sugar cane and corn cultivation areas was observed relative to 1990 (1225.7 km2), along with a decrease of the forest surface, moving the agricultural limits uphill and also showing some agave cultivation on the northwest and north hillslopes of the Nevado de Colima. This increase of the agricultural surface results in greater economic activity in the area, which also increases the vulnerability to the different volcanic products emitted during this phase of activity. The degradation of the soil by the

  18. Two-way traveltime analysis for seismic reservoir characterization

    NASA Astrophysics Data System (ADS)

    Sil, Samik

    Two-way traveltime (TWT) is one of the most important seismic attributes for reservoir characterization. Erroneous analysis of TWT can lead to incorrect estimates of velocity models, resulting in improper structural interpretation of the subsurface. TWT analysis starts with the most fundamental step of seismic data processing, namely, Normal Moveout (NMO) correction. NMO correction is generally performed in the offset-time (X-t) domain by fitting a hyperbolic curve to the observed traveltime corresponding to each reflection event. The performance of NMO correction depends on the quality of the data in the prestack domain and the underlying geology. Even when ideal data sets are available (high signal-to-noise ratio) and the underlying geology is simple (flat layers), the NMO correction can still be erroneous due to (1) its long-offset non-hyperbolic behavior and (2) the presence of seismic anisotropy. Even though in the X-t domain several equations have been developed to account for anisotropy-induced non-hyperbolic moveout, they are prone to error when multiple anisotropic and isotropic layers are present. The non-hyperbolic equations for moveout corrections are also approximate, as they are some form of truncated Taylor series and can only estimate effective root mean square (rms) parameters for each reflection event. In the plane-wave (tau-p) domain, the estimation of layer parameters can be done using an exact equation for delay time, free from the approximation errors present in the X-t domain. In this domain a layer-stripping approach can also be used to account for the presence of multiple anisotropic and isotropic layers. Thus it is attractive to develop an NMO correction equation in the tau-p domain for an anisotropic medium, which in its limiting case can be useful for the isotropic medium as well. The simplest anisotropic media are Transversely Isotropic (TI) media, which are also common in exploration seismology. One of the TI media, with a vertical
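
    A worked example of the hyperbolic NMO relation the abstract starts from, t(x) = sqrt(t0^2 + x^2/v^2), with the correction given by the shift back to t0. Velocity and geometry are illustrative, and anisotropic long-offset terms are deliberately omitted.

    ```python
    # Hyperbolic NMO traveltimes and the per-offset correction back to t0.
    import numpy as np

    v, t0 = 2000.0, 1.0                        # m/s, s
    offsets = np.array([0.0, 500.0, 1000.0, 1500.0, 2000.0])
    t_obs = np.sqrt(t0**2 + (offsets / v) ** 2)
    moveout = t_obs - t0                       # shift to subtract per trace
    for x, t, dt in zip(offsets, t_obs, moveout):
        print(f"x = {x:5.0f} m: t = {t:.3f} s, NMO shift = {dt:.3f} s")
    ```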

  19. Earthquake prediction in Japan and natural time analysis of seismicity

    NASA Astrophysics Data System (ADS)

    Uyeda, S.; Varotsos, P.

    2011-12-01

    When Seismic Electric Signals (SES) data are available, as in Greece, the natural time analysis of the seismicity after the initiation of the SES allows the determination of the time window of the impending mainshock through the evolution of the value of κ1 itself. It was found to work also for the 1989 M7.1 Loma Prieta earthquake. If SES data are not available, we rely solely on the evolution of the fluctuations of κ1 obtained by computing κ1 values using a natural time window of a certain length sliding through the earthquake catalog. The fluctuations of the order parameter, in terms of variability, i.e., standard deviation divided by average, were found to increase dramatically when approaching the 11 March M9 super-giant earthquake. In fact, such an increase was also found for M7.1 Kobe in 1995, M8.0 Tokachi-oki in 2003, and the Landers and Hector Mine earthquakes in Southern California. It is worth mentioning that such an increase is obtained straightforwardly from ordinary earthquake catalogs without any adjustable parameters.
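
    A minimal sketch of the order parameter used in natural time analysis: for N events with energies E_k (here derived from magnitudes via E ~ 10**(1.5 M)), chi_k = k/N, p_k = E_k/sum(E), and kappa_1 is the p-weighted variance of chi. Magnitudes are synthetic; the natural time literature associates kappa_1 values near 0.070 with the approach to criticality.

    ```python
    # kappa_1 order parameter of natural time for one window of events.
    import numpy as np

    def kappa1(mags):
        E = 10.0 ** (1.5 * np.asarray(mags))     # energy proxy from magnitude
        p = E / E.sum()
        chi = np.arange(1, len(E) + 1) / len(E)  # natural time chi_k = k/N
        return np.sum(p * chi**2) - np.sum(p * chi) ** 2

    rng = np.random.default_rng(10)
    window = rng.gumbel(4.0, 0.4, size=100)      # stand-in magnitude sequence
    print(f"kappa_1 = {kappa1(window):.4f}")
    ```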

  20. Seismic stratigraphy and regional unconformity analysis of Chukchi Sea Basins

    NASA Astrophysics Data System (ADS)

    Agasheva, Mariia; Karpov, Yury; Stoupakova, Antonina; Suslova, Anna

    2017-04-01

    The Russian Chukchi Sea shelf is a petroleum-potential province and still one of the most under-investigated areas. The North and South Chukchi troughs, separated by the Wrangel-Herald Arch, have different origins. The main challenge is determining the stratigraphic sequences that fill the North and South Chukchi basins. The joint tectonic evolution of the territory, including the Canada Basin opening and the Brooks Range-Wrangel Herald orogenic events, leads us to expect analogous stratigraphic sequences in the Russian part. Analysis of 2D seismic data from the Russian and American Chukchi Sea reveals the major seismic reflectors that can be traced throughout the basins. According to these data, the North Chukchi basin includes four seismic stratigraphic sequences - Franklian (pre-Mississippian), Ellesmerian (Upper Devonian-Jurassic), Beaufortian (Jurassic-Lower Cretaceous) and Brookian (Lower Cretaceous-Cenozoic), as in the North Slope of Alaska [1]. The South Chukchi basin has a different tectonic nature, comprising only the Franklian basement and the Brookian sequence. The sedimentary cover of the North Chukchi basin starts with the Ellesmerian sequence, marked by a bright reflector that separates it from the chaotically folded Franklian sequence. The Lower Ellesmerian sequence fills grabens that formed during Upper Devonian rifting. The Devonian extension event was initiated as a result of post-Caledonian orogenic collapse, terminating with the opening of the Arctic oceans. The Beaufortian sequence is distinguished in the Colville basin and Hanna Trough by seismically defined clinoforms. Paleozoic and Mesozoic strata are eroded by the regional Lower Cretaceous Unconformity (LCU), linked to the Canada Basin opening. The LCU is expressed in seismic data as an angular unconformity traced in most Arctic basins. The Lower Cretaceous erosion and uplift events are of Hauterivian to Aptian age in the Brooks Range, and the Loppa High uplift refers to the early Barremian. The Lower Cretaceous clinoform complex downlaps onto the LCU horizon and fills the North Chukchi basin (as in the Colville basin, Alaska

  1. Surface-Source Downhole Seismic Analysis in R

    USGS Publications Warehouse

    Thompson, Eric M.

    2007-01-01

    This report discusses a method for interpreting a layered slowness or velocity model from surface-source downhole seismic data, originally presented by Boore (2003). I have implemented this method in the statistical computing language R (R Development Core Team, 2007), so that it is freely and easily available to researchers and practitioners who may find it useful. I originally applied an early version of these routines to seismic cone penetration test (SCPT) data to analyze the horizontal variability of shear-wave velocity within the sediments in the San Francisco Bay area (Thompson et al., 2006). A more recent version of these codes was used to analyze the influence of interface selection and model assumptions on velocity/slowness estimates and the resulting differences in site amplification (Boore and Thompson, 2007). The R environment has many benefits for scientific and statistical computation; I have chosen R to disseminate these routines because it is versatile enough to program specialized routines, is highly interactive, which aids in the analysis of data, and is freely and conveniently available on a wide variety of computer platforms. These scripts are useful for the interpretation of layered velocity models from surface-source downhole seismic data such as deep boreholes and SCPT data. The inputs are the travel-time data and the offset of the source at the surface. The travel-time arrivals for the P- and S-waves must already be picked from the original data. An option in the inversion is to include estimates of the standard deviation of the travel-time picks for a weighted inversion of the velocity profile. The standard deviation of each travel-time pick is defined relative to the standard deviation of the best pick in a profile and is based on the accuracy with which the travel-time measurement could be determined from the seismogram. The analysis of the travel-time data consists of two parts: the identification of layer-interfaces, and the
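
    The report's routines are written in R; purely to convey the core idea, here is a hedged Python sketch of fitting a constant slowness per layer to travel-time picks, assuming the picks have already been corrected from slant to vertical travel time and that interface depths have been identified (names, data and the unweighted fit are illustrative simplifications of the weighted inversion described above).

      import numpy as np

      def interval_slowness(z, t, interfaces):
          """Slope of vertical travel time vs depth within each layer = slowness (s/m)."""
          edges = np.concatenate(([z.min()], np.asarray(interfaces), [z.max()]))
          slowness = []
          for top, bot in zip(edges[:-1], edges[1:]):
              mask = (z >= top) & (z <= bot)
              s, _ = np.polyfit(z[mask], t[mask], 1)   # linear fit within the layer
              slowness.append(s)
          return np.array(slowness)

      z = np.arange(1.0, 31.0)                         # receiver depths (m), synthetic
      t = np.where(z < 12, z / 180.0, 12 / 180.0 + (z - 12) / 350.0)
      print(1.0 / interval_slowness(z, t, interfaces=[12.0]))   # ~[180., 350.] m/s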

  2. A unified methodology for seismic waveform analysis and inversion

    NASA Astrophysics Data System (ADS)

    Chen, Po

    A central problem of seismology is the inversion of regional waveform data for models of earthquake sources and earth structure. In regions such as Southern California, preliminary 3D earth models are already available, and efficient numerical methods have been developed for solving the point-source forward problem. We describe a unified inversion procedure that utilizes these capabilities to improve 3D earth models and derive centroid moment tensor (CMT) or finite moment tensor (FMT) representations of earthquake ruptures. Our data are time- and frequency-localized measurements of the phase and amplitude anomalies relative to synthetic seismograms computed from reference seismic source and structure models. Our analysis of these phase and amplitude measurements shows that the preliminary 3D models provide a substantially better fit to the observed data than either the laterally homogeneous or the path-averaged 1D structure models commonly used in previous seismic studies of Southern California. We also found a small but statistically significant polarization anisotropy in the upper crust that might be associated with basin layering effects. Using the same type of phase and amplitude measurements, we resolved finite source properties for about 40 earthquakes in the Los Angeles basin area. Our results for a cluster of events in the Yorba Linda area show left-lateral faulting conjugate to the nearby right-lateral Whittier fault and are consistent with the "escaping-block" hypothesis for regional tectonics around the Los Angeles basin. Our analysis of 16 events in a seismicity trend that extends southwest from Fontana to Puente Hills shows right-lateral mechanisms conjugate to the trend of the hypocenter distribution, suggesting a developing weak zone that might be related to such "escaping" deformation. To set up the structural inverse problem, we computed 3D sensitivity kernels for our phase and amplitude measurements using the 3D SCEC CVM as the reference model and

  3. Analysis of Treasure Island earthquake data using seismic interferometry

    NASA Astrophysics Data System (ADS)

    Mehta, K.; Snieder, R.; Graizer, V.

    2005-12-01

    Seismic interferometry is a powerful tool for extracting the ground-motion response. We show the use of seismic interferometry for the analysis of an earthquake recorded by the Treasure Island Geotechnical Array near San Francisco, California on 06/26/94. It was a magnitude 4.0 earthquake located at a depth of 6.6 km and a distance of 12.6 km from the borehole sensors. There were six 3-component sensors located at different depths. This problem is similar to the analysis by Snieder and Safak for the Robert A. Millikan Library in Pasadena, California, where they deconvolved the wavefield recorded at each of the library floors with that at the top floor to reveal the upgoing and downgoing waves and, from these, estimated a shear velocity and a quality factor. They also showed that for such applications of seismic interferometry, deconvolution of waveforms is superior to correlation. For the Treasure Island data, deconvolving the vertical component of the wavefield at each sensor with that at the surface gives a similar superposition of an upgoing and a downgoing wave. The velocity of these waves agrees well with the compressional wave velocity. We compute the radial and the transverse components. When we window the shear wave arrivals in the transverse components at each depth and deconvolve with the one at the surface, the resultant up- and down-going waves travel with the shear wave velocity. Similar windowing and deconvolution for the radial component also agrees with the shear wave velocity. However, when the radial component is windowed around the compressional waves and deconvolved, the up- and down-going waves travel with the shear wave velocity. In the absence of any P-to-S conversion, the deconvolved waves should travel with the compressional wave velocity. This suggests that there is a conversion at a depth below the deepest sensor. Receiver functions, defined as the spectral ratio of the radial component with the vertical component, can be used to characterize
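
    A minimal sketch of the deconvolution step underlying this kind of interferometric analysis, using water-level regularized spectral division (the function name, regularization choice and eps value are assumptions; the paper's exact processing is not specified here).

      import numpy as np

      def deconvolve(u, u_ref, eps=0.01):
          """Deconvolve record u by reference record u_ref in the frequency domain."""
          U, Uref = np.fft.rfft(u), np.fft.rfft(u_ref)
          power = np.abs(Uref)**2
          # water level: floor the denominator at eps * max power to stabilize division
          D = U * np.conj(Uref) / np.maximum(power, eps * power.max())
          return np.fft.irfft(D, n=len(u))

      fs = 100.0
      t = np.arange(0, 60.0, 1.0 / fs)
      surface = np.random.randn(t.size)                # placeholder surface record
      borehole = np.roll(surface, 25) + 0.1 * np.random.randn(t.size)
      d = deconvolve(borehole, surface)                # spike near 0.25 s = inter-sensor delay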

  4. Energetic analysis of the white light emission associated to seismically active flares in solar cycle 24

    NASA Astrophysics Data System (ADS)

    Buitrago-Casas, Juan Camilo; Martinez Oliveros, Juan Carlos; Glesener, Lindsay; Krucker, Sam

    2014-06-01

    Solar flares are explosive phenomena, thought to be driven by magnetic free energy accumulated in the solar corona. Some flares release seismic transients, "sunquakes", into the Sun's interior. Different mechanisms are being considered to explain how sunquakes are generated. We are conducting an analysis of the white-light emission associated with the seismically active solar flares that have been reported by different authors within the current solar cycle. Seismic diagnostics are based upon standard time-distance techniques, including seismic holography, applied to Dopplergrams obtained by SDO/HMI and GONG. The relation between white-light emission and seismic activity may provide important information on impulsive chromospheric heating during flares, a prospective contributor to seismic transient emission, at least in some instances. We develop a method to estimate the energy associated with the white-light emission and compare the results with the energy needed to generate a sunquake according to holographic helioseismology techniques.

  5. Interdependent networks: vulnerability analysis and strategies to limit cascading failure

    NASA Astrophysics Data System (ADS)

    Fu, Gaihua; Dawson, Richard; Khoury, Mehdi; Bullock, Seth

    2014-07-01

    Network theory is increasingly employed to study the structure and behaviour of social, physical and technological systems — including civil infrastructure. Many of these systems are interconnected and the interdependencies between them allow disruptive events to propagate across networks, enabling damage to spread far beyond the immediate footprint of disturbance. In this research we experiment with a model to characterise the configuration of interdependencies in terms of direction, redundancy, and extent, and we analyse the performance of interdependent systems with a wide range of possible coupling modes. We demonstrate that networks with directed dependencies are less robust than those with undirected dependencies, and that the degree of redundancy in inter-network dependencies can have a differential effect on robustness depending on the directionality of the dependencies. As interdependencies between many real-world systems exhibit these characteristics, it is likely that many such systems operate near their critical thresholds. The vulnerability of an interdependent network is shown to be reducible in a cost effective way, either by optimising inter-network connections, or by hardening high degree nodes. The results improve understanding of the influence of interdependencies on system performance and provide insight into how to mitigate associated risks.

  6. Characterization of Soil Deposits for Seismic Response Analysis

    NASA Astrophysics Data System (ADS)

    Lo Presti, Diego; Pallara, Oronzo; Mensi, Elena

    The paper critically reviews the in situ and laboratory testing methods used to characterize soil deposits for seismic response analyses. Cyclic loading triaxial tests (CLTX), cyclic loading torsional shear tests (CLTST) and resonant column tests (RCT) are considered. As for in situ testing, geophysical seismic tests and dynamic penetration tests are discussed. The influence of ground conditions on seismic response analyses is shown for a number of real cases. The database made available by the Regional Government of Tuscany (RT) has been used.

  7. Data Quality Analysis for the Bighorn Arch Seismic Array Experiment

    NASA Astrophysics Data System (ADS)

    Mancinelli, N. J.; Yang, Z.; Yeck, W. L.; Sheehan, A. F.

    2010-12-01

    We analyze background noise to assess differences in station noise levels among different types of seismic sensors and the effects of site location, and to identify local noise sources, using data from the Bighorn Arch Seismic Experiment (BASE). Project BASE is an EarthScope Flexible Array (FA) project comprising the deployment of 38 broadband seismometers (Guralp CMG3T), 173 short-period seismometers (L22 and CMG40T-1s), and 1850 high-frequency geophones with Reftek RT125 "Texans" in northern Wyoming, providing a continuous dataset from various seismic sensor types and site locations in different geologic settings (basins and mountains). We carry out our analysis using a recently developed approach that displays the distribution of seismic power spectral density (PSD) as a probability density function (PDF) [McNamara and Buland, 2004]. This approach bypasses the tedious pre-screening for transient signals (earthquakes, mass recentering, calibration pulses, etc.) required by traditional PSD analysis. Using the program PQLX, we were able to correlate specific noise sources (mine blasts, teleseisms, passing cars, etc.) with features seen on PDF plots. We analyzed eight months of continuous BASE project broadband and short-period data for this study. The power spectral density plots suggest that, of the 3 different instrument types used in the BASE project, the broadband CMG3T stations have the lowest background noise in the period range of 0.1-1 s while the short-period L22 stations have the highest background noise. As expected, stations located in the Bighorn Mountain Range are closer to the Low Noise Model [Peterson, 1993] than those located in the adjacent Bighorn Basin and Powder River Basin, particularly in the 0.1-1 s period range. This is mainly attributed to proximity to bedrock, though increased distance from cultural noise also contributes. At longer periods (1-100 s), the noise level of broadband instruments is lower
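
    A hedged sketch of the PSD probability-density idea referenced above (McNamara and Buland, 2004): PSDs are computed for many overlapping segments and histogrammed per frequency bin, so transients simply occupy low-probability regions instead of having to be screened out. Instrument response removal is omitted, and the segment length and dB binning are assumptions for illustration.

      import numpy as np
      from scipy.signal import welch

      def psd_pdf(data, fs, seg_s=3600.0):
          """Distribution of PSD values (in dB) across 50%-overlapping segments."""
          n = int(seg_s * fs)
          psds = []
          for i in range(0, len(data) - n + 1, n // 2):
              f, p = welch(data[i:i + n], fs=fs, nperseg=min(4096, n))
              psds.append(10 * np.log10(p[1:]))        # dB; skip the f = 0 bin
          psds = np.array(psds)
          bins = np.arange(-200.0, -50.0, 1.0)         # assumed dB range of interest
          pdf = np.array([np.histogram(psds[:, j], bins=bins, density=True)[0]
                          for j in range(psds.shape[1])])
          return f[1:], bins, pdf                      # pdf[j]: distribution at f[1:][j]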

  8. The Use of GIS for the Application of the Phenomenological Approach to the Seismic Risk Analysis: the Case of the Italian Fortified Architecture

    NASA Astrophysics Data System (ADS)

    Lenticchia, E.; Coïsson, E.

    2017-05-01

    The present paper proposes the use of GIS for the application of the so-called phenomenological approach to the analysis of the seismic behaviour of historical buildings. This approach is based on the awareness that different masonry building typologies are characterized by different, recurring vulnerabilities. Thus, the observation and classification of real damage is seen as the first step in recognizing and classifying these vulnerabilities, in order to plan focused preventive interventions. For these purposes, GIS has proven to be a powerful instrument for collecting and managing this type of information on a large number of cases. This paper specifically focuses on the application of the phenomenological approach to the analysis of the seismic behaviour of fortified buildings, including castles, fortresses, citadels, and all the typical historical constructions characterized by the presence of massive towers and defensive walls. The main earthquakes that struck Italy in the last 40 years (up to the recent Central Italy seismic swarm) were taken into consideration and described by means of shake maps. A previously published work has been continued with the addition of new data and some improvements, including a specific symbology for describing building typologies and conservation status on the maps, indications of damage levels, and a comparison between shake maps in terms of PGA and in terms of pseudo-acceleration. The increase in knowledge obtained and the broader frame given by the analysis of the data are directed here to the primary aim of cultural heritage preservation.

  9. Storey building early monitoring based on rapid seismic response analysis

    NASA Astrophysics Data System (ADS)

    Julius, Musa, Admiral; Sunardi, Bambang; Rudyanto, Ariska

    2016-05-01

    Within the last decade, advances in the acquisition, processing and transmission of data from seismic monitoring have contributed to growth in the number of structures instrumented with such systems. An equally important factor for this growth is the demand by stakeholders for rapid answers to important questions about the functionality or state of "health" of structures during and immediately after a seismic event. Consequently, this study aims to monitor a storey building based on seismic response, i.e., earthquake and tremor analysis over short time lapses using accelerograph data. The study used a storey building (X) in Jakarta that experienced the effects of the Kebumen earthquake of January 25th 2014, the Pandeglang earthquake of July 9th 2014, and the Lebak earthquake of November 8th 2014. The tremors used in this study are those recorded after these three earthquakes. Data processing was used to determine peak ground acceleration (PGA), peak ground velocity (PGV), peak ground displacement (PGD), spectral acceleration (SA), spectral velocity (SV), spectral displacement (SD), the A/V ratio, acceleration amplification and effective duration (te), and then the natural frequency (f0) and the peak of the H/V ratio using the H/V ratio method. The earthquake data processing results show that the values of peak ground motion, response spectra, A/V ratio and acceleration amplification increase with height, while the effective duration gives a different view of the building dynamics: during the Kebumen earthquake the highest energy was on the highest floor, but during the Pandeglang and Lebak earthquakes it was on the lowest floor. Tremor data processed one month after each earthquake show a constant natural frequency for the building. The increase of peak ground motion, response spectra, A/V ratio and acceleration amplification, and the decrease of effective duration with increasing building floor, show that the building construction supports the
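
    A minimal sketch of the H/V spectral ratio step mentioned above, estimating f0 from three-component tremor records (the smoothing, windowing and variable names are assumptions; the study's exact processing parameters are not given in the abstract).

      import numpy as np

      def hv_ratio(ns, ew, ud, fs, smooth=11):
          """Nakamura-style H/V: smoothed horizontal over vertical amplitude spectra."""
          spec = lambda x: np.abs(np.fft.rfft(x * np.hanning(len(x))))
          freq = np.fft.rfftfreq(len(ud), 1.0 / fs)
          h = np.sqrt(spec(ns) * spec(ew))             # geometric mean of horizontals
          v = spec(ud)
          k = np.ones(smooth) / smooth                 # simple moving-average smoothing
          hv = np.convolve(h, k, 'same') / np.convolve(v, k, 'same')
          f0 = freq[np.argmax(hv[1:]) + 1]             # peak frequency, skipping DC
          return freq, hv, f0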

  10. One-dimensional Seismic Analysis of a Solid-Waste Landfill

    SciTech Connect

    Castelli, Francesco; Lentini, Valentina; Maugeri, Michele

    2008-07-08

    Analysis of the seismic performance of solid waste landfills generally follows the same procedures as the design of embankment dams, even if the methods and safety requirements should be different. The characterization of waste properties for seismic design is difficult due to the heterogeneity of the material, requiring the procurement of large samples. The dynamic characteristics of solid waste materials play an important role in the seismic response of a landfill, and it is also important to assess the dynamic shear strengths of liner materials due to the effect of inertial forces in the refuse mass. In the paper the numerical results of a dynamic analysis are reported and analysed to determine the reliability of the common practice of using 1D analysis to evaluate the seismic response of a municipal solid-waste landfill. Numerical results indicate that the seismic response of a landfill can vary significantly due to reasonable variations of waste properties, fill heights, site conditions, and design rock motions.

  11. Department of Energy seismic siting and design decisions: Consistent use of probabilistic seismic hazard analysis

    SciTech Connect

    Kimball, J.K.; Chander, H.

    1997-02-01

    The Department of Energy (DOE) requires that all nuclear and non-nuclear facilities shall be designed, constructed and operated so that the public, the workers, and the environment are protected from the adverse impacts of Natural Phenomena Hazards, including earthquakes. The design and evaluation of DOE facilities to accommodate earthquakes shall be based on an assessment of the likelihood of future earthquake occurrences, commensurate with a graded approach that depends on the potential risk posed by the facility. DOE has developed Standards for site characterization and hazards assessments to ensure that a consistent use of probabilistic seismic hazard analysis is implemented at each DOE site. The criteria included in the DOE Standards are described and compared to the criteria being promoted by the staff of the Nuclear Regulatory Commission (NRC) for commercial nuclear reactors. In addition to a general description of the DOE requirements and criteria, the most recent probabilistic seismic hazard results for a number of DOE sites are presented. Based on the work completed to develop the probabilistic seismic hazard results, a summary of important application issues is given, with recommendations for future improvements in the development and use of probabilistic seismic hazard criteria for the design of DOE facilities.

  12. Detecting Seismic Activity with a Covariance Matrix Analysis of Data Recorded on Seismic Arrays

    NASA Astrophysics Data System (ADS)

    Seydoux, L.; Shapiro, N.; de Rosny, J.; Brenguier, F.

    2014-12-01

    Modern seismic networks record ground motion continuously all around the world with very broadband and high-sensitivity sensors. The aim of our study is to apply statistical array-based approaches to the processing of these records. We use methods drawn mainly from random matrix theory to give a statistical description of seismic wavefields recorded at the Earth's surface. We estimate the array covariance matrix and explore the distribution of its eigenvalues, which contains information about the coherency of the sources that generated the studied wavefields. With this approach, we can distinguish between signals generated by isolated deterministic sources and the "random" ambient noise. We design an algorithm that uses the distribution of the array covariance matrix eigenvalues to detect signals corresponding to coherent seismic events. We investigate the detection capacity of our method at different scales and in different frequency ranges by applying it to the records of two networks: (1) the seismic monitoring network operating on the Piton de la Fournaise volcano at La Réunion island, composed of 21 receivers with an aperture of ~15 km, and (2) the transportable component of the USArray, composed of ~400 receivers with ~70 km inter-station spacing.
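
    A minimal numerical sketch of the covariance-matrix detector described above: the array covariance matrix is formed from station spectra in a frequency band, and the spread of its normalized eigenvalues (a "spectral width") drops when one coherent source dominates. The window length, band and names are illustrative assumptions.

      import numpy as np

      def spectral_width(records, fs, fmin, fmax, win_s=50.0):
          """Eigenvalue spread of the array covariance matrix; small = coherent event."""
          n = int(win_s * fs)
          X = np.array([r[:n] for r in records])             # stations x samples
          F = np.fft.rfft(X, axis=1)
          freqs = np.fft.rfftfreq(n, 1.0 / fs)
          band = (freqs >= fmin) & (freqs <= fmax)
          C = F[:, band] @ F[:, band].conj().T / band.sum()  # array covariance matrix
          lam = np.sort(np.linalg.eigvalsh(C))[::-1]
          lam = lam / lam.sum()                              # normalized eigenvalue distribution
          return np.sum(np.arange(len(lam)) * lam)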

  13. Seismic Canvas: Evolution as a Data Exploration and Analysis Tool

    NASA Astrophysics Data System (ADS)

    Kroeger, G. C.

    2015-12-01

    SeismicCanvas, originally developed as a prototype interactive waveform display and printing application for educational use, has evolved to include significant data exploration and analysis functionality. The most recent version supports data import from a variety of standard file formats, including SAC and mini-SEED, as well as search and download capabilities via IRIS/FDSN Web Services. Data processing tools now include removal of means and trends, interactive windowing, filtering, smoothing, tapering, and resampling. Waveforms can be displayed in a free-form canvas or as a record section based on angular or great-circle distance, azimuth or back azimuth. Integrated tau-p code allows the calculation and display of theoretical phase arrivals from a variety of radial Earth models. Waveforms can be aligned by absolute time, event time, or picked or theoretical arrival times, and can be stacked after alignment. Interactive measurements include means, amplitudes, time delays, ray parameters and apparent velocities. Interactive picking of an arbitrary list of seismic phases is supported. Bode plots of amplitude and phase spectra and spectrograms can be created from multiple seismograms or selected windows of seismograms. Direct printing is implemented on all supported platforms along with output of high-resolution PDF files. With these added capabilities, the application is now being used as a data exploration tool for research. Coded in C++ and using the cross-platform Qt framework, the most recent version is available as a 64-bit application for Windows 7-10, Mac OS X 10.6-10.11, and most distributions of Linux, and as a 32-bit version for Windows XP and 7. With the latest improvements and refactoring of the trace display classes, the 64-bit versions have been tested with over 250 million samples and remain responsive in interactive operations. The source code is available under an LGPLv3 license and both source and executables are available through the IRIS SeisCode repository.

  14. Letter report seismic shutdown system failure mode and effect analysis

    SciTech Connect

    KECK, R.D.

    1999-09-01

    The Supply Ventilation System Seismic Shutdown ensures that the 234-52 building supply fans, the dry air process fans and the vertical development calciner are shut down following a seismic event. This letter report evaluates the failure modes and determines their effects.

  15. Social vulnerability assessment using spatial multi-criteria analysis (SEVI model) and the Social Vulnerability Index (SoVI model) - a case study for Bucharest, Romania

    NASA Astrophysics Data System (ADS)

    Armaş, I.; Gavriş, A.

    2013-06-01

    In recent decades, the development of vulnerability frameworks has broadened research in the natural hazards field. Despite progress in vulnerability studies, more remains to be investigated regarding quantitative approaches and clarification of the conceptual explanation of the social component. At the same time, some disaster-prone areas receive limited attention. Among these, Romania's capital city, Bucharest, is the most earthquake-prone capital in Europe and the tenth most earthquake-prone in the world. This location is used to assess two multi-criteria methods for aggregating complex indicators: the social vulnerability index (SoVI model) and the spatial multi-criteria social vulnerability index (SEVI model). Using data from the 2002 census, we reduce the indicators through a factor-analytical approach to create the indices and examine whether they bear any resemblance to the known vulnerability of Bucharest through an exploratory spatial data analysis (ESDA). This is a critical issue that may provide a better understanding of social vulnerability in the city and appropriate information for authorities and stakeholders to consider in their decision making. The study emphasizes that social vulnerability is an urban process that has increased in post-communist Bucharest, raising the concern that the population at risk lacks the capacity to cope with disasters. The assessment of the indices indicates a significant and similar clustering pattern of the census administrative units, with an overlap between the clustered areas affected by high social vulnerability. Our proposed SEVI model shows sensitivity to adjustment, which is useful for expert-opinion accuracy.

  16. A new approach for computing a flood vulnerability index using cluster analysis

    NASA Astrophysics Data System (ADS)

    Fernandez, Paulo; Mourato, Sandra; Moreira, Madalena; Pereira, Luísa

    2016-08-01

    A Flood Vulnerability Index (FloodVI) was developed using Principal Component Analysis (PCA) and a new aggregation method based on Cluster Analysis (CA). PCA simplifies a large number of variables into a few uncorrelated factors representing the social, economic, physical and environmental dimensions of vulnerability. CA groups areas that have the same characteristics in terms of vulnerability into vulnerability classes. The grouping of the areas determines their classification, contrary to other aggregation methods in which the areas' classification determines their grouping. While other aggregation methods distribute areas into classes artificially, by imposing a certain probability that an area belongs to a given class under the assumption that the aggregation measure used is normally distributed, CA does not constrain the distribution of areas across classes. FloodVI was designed at the neighbourhood level and was applied to the Portuguese municipality of Vila Nova de Gaia, where several flood events have taken place in the recent past. The sensitivity of FloodVI was assessed using three different aggregation methods: the sum of component scores, the first component score, and the weighted sum of component scores. The results highlight the sensitivity of FloodVI to different aggregation methods. Both the sum of component scores and the weighted sum of component scores show similar results. The first component score aggregation method classifies almost all areas as having medium vulnerability, whereas the results obtained using CA show a distinct differentiation of vulnerability, in which hot spots can be clearly identified. The information provided by records of previous flood events corroborates the results obtained with CA, because the inundated areas with the greatest damage are those identified as high and very high vulnerability areas by CA. This supports the conclusion that CA provides a reliable FloodVI.
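
    To make the aggregation idea concrete, here is a hedged Python sketch with scikit-learn, using k-means as a stand-in for the paper's cluster analysis (the variable counts, number of classes and ranking of clusters by mean factor score are assumptions for illustration).

      import numpy as np
      from sklearn.preprocessing import StandardScaler
      from sklearn.decomposition import PCA
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(1)
      X = rng.normal(size=(200, 15))       # neighbourhoods x raw vulnerability variables

      # PCA: a few uncorrelated factors for the vulnerability dimensions
      scores = PCA(n_components=4).fit_transform(StandardScaler().fit_transform(X))

      # Aggregation by clustering: the grouping determines the classification,
      # rather than imposing class boundaries on an assumed distribution
      labels = KMeans(n_clusters=5, n_init=10, random_state=0).fit_predict(scores)
      order = np.argsort([scores[labels == c, 0].mean() for c in range(5)])
      vclass = np.empty_like(labels)
      for rank, c in enumerate(order):
          vclass[labels == c] = rank       # 0 = lowest ... 4 = very high vulnerability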

  17. Vulnerability of Thai rice production to simultaneous climate and socioeconomic changes: a double exposure analysis

    NASA Astrophysics Data System (ADS)

    Sangpenchan, R.

    2011-12-01

    This research explores the vulnerability of Thai rice production to simultaneous exposure to climate and socioeconomic change -- so-called "double exposure." Both processes influence Thailand's rice production system, but the vulnerabilities associated with their interactions are unknown. To understand this double exposure, I adopt a mixed-method, qualitative-quantitative analytical approach consisting of three phases of analysis involving a Vulnerability Scoping Diagram, a Principal Component Analysis, and the EPIC crop model, using proxy datasets collected from secondary data sources at provincial scales. The first and second phases identify key variables representing each of the three dimensions of vulnerability -- exposure, sensitivity, and adaptive capacity -- indicating that the greatest vulnerability in the rice production system occurs in households and areas with high exposure to climate change, high sensitivity to climate and socioeconomic stress, and low adaptive capacity. In the third phase, the EPIC crop model simulates rice yields associated with future climate change projected by the CSIRO and MIROC climate models. Climate change-only scenarios project a decrease in yields of 10% from current productivity during 2016-2025 and of 30% during 2045-2054. Scenarios applying both climate change and improved technology and management practices show that a 50% increase in rice production is possible, but this requires strong collaboration between sectors to advance agricultural research and technology, and strong adaptive capacity in the rice production system characterized by well-developed social capital, social networks, financial capacity, infrastructure and household mobility at the local scale. The vulnerability assessment and the climate and crop adaptation simulations used here provide useful information to decision makers developing vulnerability reduction plans in the face of concurrent climate and socioeconomic change.

  18. Pembina Cardium CO2-EOR monitoring project: Integrated surface seismic and VSP time-lapse seismic analysis

    NASA Astrophysics Data System (ADS)

    Alshuhail, A. A.

    2009-12-01

    In the Pembina field in west-central Alberta, Canada, approximately 40,000 tons of supercritical CO2 was injected into the 1650 m deep, 20 m thick upper-Cretaceous Cardium Fm. between March 2005 and 2007. A time-lapse seismic program was designed and incorporated into the overall measurement, monitoring and verification program. The objectives were to track the CO2 plume within the reservoir and to evaluate the integrity of storage. Fluid replacement modeling predicts decreases in P-wave velocity and bulk density in the reservoir of about 4% and 1%, respectively. Synthetic seismograms show subtle reflectivity changes at the Cardium Fm. and a traveltime delay at the later high-amplitude Viking event of less than 1 ms. The time-lapse datasets, however, show no significant anomalies in the P-wave seismic data that can be attributed to the supercritical CO2 injected into the Cardium Fm. (Figure 1). The converted-wave (P-S) data, on the other hand, show small traveltime anomalies. The most coherent results were those obtained from the fixed-array VSP dataset (Figure 2), due to its higher frequency bandwidth and high signal-to-noise ratio. The amplitude and traveltime changes observed in the VSP dataset are small but are consistent in magnitude with those predicted from rock physics modeling. The analysis suggests that the inability to clearly detect the CO2 plume in surface seismic data is likely due to the CO2 being contained in thin permeable sandstone members of the Cardium Formation. The seismic signature of the Cardium Fm. in this area may also be degraded by multiples and strong attenuation involving the shallow Ardley coals. However, the lack of 4D seismic changes above the reservoir indicates that the injected CO2 is not migrating through the caprock into shallower formations.

  19. Analysis of seismic noise recorded by temporary seismic array near the Pyhäsalmi underground mine in Finland

    NASA Astrophysics Data System (ADS)

    Afonin, Nikita; Kozlovskaya, Elena; Narkilahti, Janne; Nevalainen, Jouni

    2016-04-01

    The Pyhäsalmi mine is an underground copper and zinc mine located in central Finland. It is one of the oldest and deepest underground mines in Europe, in which ore is excavated from a depth of about 1450 m. Due to the large amount of heavy machinery, the mine itself is a source of strong seismic and acoustic noise. This continuous noise creates a problem for high-resolution active-source seismic experiments. That is why, in our study, we investigated the possibility of using this seismic noise to study the structure of the uppermost crust. For this we installed 24 3-component DSU-SA MEMS seismic sensors with autonomous RAUD eX data acquisition units produced by Sercel Ltd. along a 10 km long line crossing the mine area. The array recorded continuous seismic data from 29.10.2013 to 1.11.2013 with a sampling rate of 500 sps. The continuous data for this period of 5 days were processed in several steps, including single-station data analysis, pre-filtering and time-domain stacking. The processed data set was used to estimate empirical Green's functions (EGF) between pairs of stations in the frequency band of 1-100 Hz. We developed our own procedure for stacking EGF in the time domain and, as a result, we were able to extract not only Rayleigh waves, but also refracted P-waves. Finally, we calculated surface wave dispersion curves and solved inversion problems for surface waves and refracted waves. In our paper we concentrate mainly on the details of our data processing routine and its influence on the quality of the EGF extraction results. The study is a part of the SEISLAB project funded by the European Regional Development Fund (ERDF), the Council of Oulu Region (Finland) and Pyhäsalmi Mine Oy.
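
    A minimal sketch of the cross-correlation-and-stack procedure at the heart of such EGF estimation (amplitude normalization is simplified to a z-score here, whereas real workflows often use one-bit or running-mean normalization and spectral whitening; the names and the 60 s window are assumptions).

      import numpy as np

      def egf_estimate(a, b, fs, win_s=60.0):
          """Stack cross-correlations of synchronous noise windows from stations a, b."""
          n = int(win_s * fs)
          stack = np.zeros(2 * n - 1)
          for i in range(0, min(len(a), len(b)) - n + 1, n):
              wa = a[i:i + n] - a[i:i + n].mean()
              wb = b[i:i + n] - b[i:i + n].mean()
              wa /= wa.std() + 1e-12                   # simple amplitude normalization
              wb /= wb.std() + 1e-12
              stack += np.correlate(wa, wb, mode='full')
          lags = np.arange(-n + 1, n) / fs             # the EGF emerges around these lags
          return lags, stack / np.abs(stack).max()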

  1. Reservoir lithofacies analysis using 3D seismic data in dissimilarity space

    NASA Astrophysics Data System (ADS)

    Bagheri, M.; Riahi, M. A.; Hashemi, H.

    2013-06-01

    Seismic data interpretation is one of the most important steps in exploration seismology. Seismic facies analysis (SFA) with an emphasis on lithofacies can be used to extract more information about structures and geology, which enhances seismic interpretation. Facies analysis is based on unsupervised and supervised classification using seismic attributes. In this paper, supervised classification by a support vector machine using well logs and seismic attributes is applied. Dissimilarity is employed as a new measurement space, after which classification is carried out. Often, SFA is carried out in a feature space in which each dimension corresponds to a seismic attribute. Different facies show considerable class overlap in the feature space; hence, high classification error values are reported. Therefore, decreasing class overlap before classification is a necessary step. To achieve this goal, a dissimilarity space is first created. As a result of the definition of the new space, the class overlap between objects (seismic samples) is reduced and hence the classification can be done reliably. This strategy increases classification accuracy and yields a more trustworthy lithofacies analysis. To apply this method, 3D seismic data from an oil field in Iran were selected, and the results obtained by a support vector classifier (SVC) in dissimilarity space are presented, discussed and compared with the SVC applied in conventional feature space.
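
    A hedged scikit-learn sketch of the dissimilarity-space construction described above: each sample is re-represented by its distances to a small prototype set, and the SVC is trained in that space (the prototype count, kernel and synthetic attribute data are assumptions).

      import numpy as np
      from scipy.spatial.distance import cdist
      from sklearn.svm import SVC

      rng = np.random.default_rng(2)
      X = rng.normal(size=(500, 8))                # seismic-attribute vectors (placeholder)
      y = rng.integers(0, 3, size=500)             # lithofacies labels from wells (placeholder)

      protos = X[rng.choice(len(X), size=30, replace=False)]   # prototype objects
      D = cdist(X, protos)                         # dissimilarity space: distances to prototypes

      clf = SVC(kernel='rbf', C=10.0, gamma='scale').fit(D[:400], y[:400])
      accuracy = clf.score(D[400:], y[400:])       # evaluate on held-out samples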

  2. Single-block rockfall dynamics inferred from seismic signal analysis

    NASA Astrophysics Data System (ADS)

    Hibert, Clément; Malet, Jean-Philippe; Bourrier, Franck; Provost, Floriane; Berger, Frédéric; Bornemann, Pierrick; Tardif, Pascal; Mermin, Eric

    2017-05-01

    Seismic monitoring of mass movements can significantly help to mitigate the associated hazards; however, the link between event dynamics and the seismic signals generated is not completely understood. To better understand these relationships, we conducted controlled releases of single blocks within a soft-rock (black marls) gully of the Rioux-Bourdoux torrent (French Alps). A total of 28 blocks, with masses ranging from 76 to 472 kg, were used for the experiment. Instrumentation combining video cameras and seismometers was deployed along the travelled path. The video cameras allow reconstruction of the block trajectories and estimation of their velocities at the times of the different impacts with the slope. These data are compared to the recorded seismic signals. As the distance between the falling block and the seismic sensors at the time of each impact is known, we were able to determine the associated seismic signal amplitude corrected for propagation and attenuation effects. We compared the velocity, the potential energy lost, the kinetic energy and the momentum of the block at each impact to the true amplitude and the radiated seismic energy. Our results suggest that the amplitude of the seismic signal is correlated with the momentum of the block at the impact. We also found relationships between the potential energy lost, the kinetic energy and the seismic energy radiated by the impacts. Thanks to these relationships, we were able to retrieve the mass and the velocity before impact of each block directly from the seismic signal. Despite high uncertainties, the values found are close to the true masses and velocities of the blocks. These relationships provide a better understanding of the physical processes that control the source of high-frequency seismic signals generated by rockfalls.

  3. Dynamic behavior of ground for seismic analysis of lifeline systems

    SciTech Connect

    Sato, T.; Der Kiureghian, A.

    1982-01-01

    A mathematical formula is derived for the general wave transfer function in multilayered media with inhomogeneous and nonlinear soil properties. It is assumed that the ground consists of horizontally stratified layers overlying a homogeneous half-space which is excited by vertically incident plane shear waves. To formulate the nonlinear harmonic wave solution, the surface layer is regarded as a multilayered system consisting of an infinite number of sublayers of infinitesimal thickness. The mode superposition procedure based on response spectra provides an expedient tool for dynamic analysis of surficial ground. The characteristic equation for obtaining natural frequencies and free vibration modes is derived by using the proposed wave transfer function. To use modal analysis for nonlinear systems, an iterative scheme for calculating the modal stiffness and damping is proposed, which is an adaptation of the equivalent linearization technique. The estimation of the intensity of ground shaking is based on a response spectrum for stationary random vibration analysis. The results, in conjunction with fatigue theory, are used to study the liquefaction problem in soil layers with general topography. Application of the proposed methods in seismic reliability assessment of lifeline systems is discussed.

  4. Seismic signature analysis for discrimination of people from animals

    NASA Astrophysics Data System (ADS)

    Damarla, Thyagaraju; Mehmood, Asif; Sabatier, James M.

    2013-05-01

    Cadence analysis has been the main focus for discriminating between the seismic signatures of people and animals. However, cadence analysis fails when multiple targets are generating the signatures. We analyze the mechanism of human walking and the signature generated by a human walker, and compare it with the signature generated by a quadruped. We develop Fourier-based analysis to differentiate the human signatures from the animal signatures. We extract a set of basis vectors to represent the human and animal signatures using non-negative matrix factorization, and use them to separate and classify both targets. Grazing animals such as deer and cows often produce sporadic signals as they move from patch to patch of grass, and these must be characterized to differentiate their signatures from that of a horse steadily walking along a path. These differences in the signatures are used in developing a robust algorithm to distinguish the signatures of animals from humans. The algorithm is tested on real data collected in a remote area.
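
    A minimal sketch of the basis-extraction step named above: non-negative matrix factorization of a magnitude spectrogram yields spectral basis vectors that can be learned separately for human and animal recordings (the STFT parameters, component count and placeholder data are assumptions).

      import numpy as np
      from scipy.signal import stft
      from sklearn.decomposition import NMF

      def nmf_bases(signal, fs, n_bases=8):
          """Learn non-negative spectral bases W and activations H from one recording."""
          _, _, Z = stft(signal, fs=fs, nperseg=256)
          V = np.abs(Z)                            # non-negative magnitude spectrogram
          model = NMF(n_components=n_bases, init='nndsvda', max_iter=500)
          W = model.fit_transform(V)               # frequency basis vectors
          H = model.components_                    # their activations over time
          return W, H

      fs = 1000.0
      footsteps = np.random.randn(int(30 * fs))    # placeholder seismic recording
      W, H = nmf_bases(footsteps, fs)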

  5. Numerical analysis on seismic response of Shinkansen bridge-train interaction system under moderate earthquakes

    NASA Astrophysics Data System (ADS)

    He, Xingwen; Kawatani, Mitsuo; Hayashikawa, Toshiro; Matsumoto, Takashi

    2011-03-01

    This study is intended to evaluate the influence of dynamic bridge-train interaction (BTI) on the seismic response of the Shinkansen system in Japan under moderate earthquakes. An analytical approach to simulate the seismic response of the BTI system is developed. In this approach, the behavior of the bridge structure is assumed to be within the elastic range under moderate ground motions. A bullet train car model idealized as a sprung-mass system is established. The viaduct is modeled with 3D finite elements. The BTI analysis algorithm is verified by comparing the analytical and experimental results. The seismic analysis is validated through comparison with a general program. Then, the seismic responses of the BTI system are simulated and evaluated. Some useful conclusions are drawn, indicating the importance of a proper consideration of the dynamic BTI in seismic design.

  6. Exploring drought vulnerability in Africa: an indicator based analysis to be used in early warning systems

    NASA Astrophysics Data System (ADS)

    Naumann, G.; Barbosa, P.; Garrote, L.; Iglesias, A.; Vogt, J.

    2014-05-01

    We propose a composite drought vulnerability indicator (DVI) that reflects different aspects of drought vulnerability evaluated at the Pan-African level for four components: renewable natural capital, economic capacity, human and civic resources, and infrastructure and technology. The selection of variables and weights reflects the assumption that a society with institutional capacity and coordination, as well as mechanisms for public participation, is less vulnerable to drought; furthermore, we consider that agriculture is only one of the many sectors affected by drought. The quality and accuracy of a composite indicator depend on the theoretical framework, on the data collection and quality, and on how the different components are aggregated. This kind of approach can be met with some degree of scepticism; to overcome this problem, a sensitivity analysis was performed to measure the degree of uncertainty associated with the construction of the composite indicator. Although the proposed drought vulnerability indicator relies on a number of theoretical assumptions and some degree of subjectivity, the sensitivity analysis showed that it is a robust indicator, capable of representing the complex processes that lead to drought vulnerability. According to the DVI computed at the country level, the African countries classified as most vulnerable are Somalia, Burundi, Niger, Ethiopia, Mali and Chad. The analysis of the renewable natural capital component at the sub-basin level shows that the basins with high to moderate drought vulnerability can be subdivided into the following geographical regions: the Mediterranean coast of Africa; the Sahel region and the Horn of Africa; the Serengeti and the Eastern Miombo woodlands in eastern Africa; the western part of the Zambezi Basin, the southeastern border of the Congo Basin, and the belt of Fynbos in the Western Cape province of South Africa. The results of the DVI at the country level were

  7. Array analysis methods for detection, classification and location of seismic sources: a first evaluation for aftershock analysis using dense temporary post-seismic array network

    NASA Astrophysics Data System (ADS)

    Poiata, N.; Satriano, C.; Vilotte, J.; Bernard, P.

    2012-12-01

    Detection, separation, classification and location of distributed non-stationary seismic sources in broadband noisy environments is an important problem in seismology, in particular for monitoring the high-level post-seismic activity following large subduction earthquakes, like the off-shore Maule (Mw 8.8, 2010) earthquake in Central Chile. Multiple seismic arrays and local antennas distributed over a region allow exploitation of the frequency-selective coherence of the signals that arrive at widely separated array stations, leading to improved detection, convolutive blind source separation, and location of distributed non-stationary sources. We present here first results of an investigation of time-frequency adaptive array analysis techniques for detection and location of broadband distributed seismic events recorded by the dense temporary seismic network (International Maule Aftershock Deployment, IMAD) installed to monitor the high-level seismic activity following the 27 February 2010 Maule earthquake (Mw 8.8). This seismic network is characterized by a large aperture, with variable inter-station distances, combined with a high level of distributed near- and far-field seismic source activity and noise. For this study, we first extract from the post-seismic network a number of seismic arrays distributed over the region covered by the network. A first aspect is devoted to the detection, classification and separation of distributed passive seismic sources. We investigate a number of narrow- and wide-band signal analysis methods, both in the time and time-frequency domains, for energy arrival detection and tracking, including time-adaptive higher-order statistics, such as kurtosis, and multiband band-pass filtering, together with adaptive time-frequency transformation and extraction techniques. We demonstrate that these techniques provide better resolution and robustness than classical STA/LTA techniques, in particular in the case of distributed sources with potential signal
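
    As a concrete instance of the time-adaptive higher-order statistics mentioned above, here is a hedged sketch of a sliding-window kurtosis detector: impulsive arrivals make the sample distribution heavy-tailed, so the kurtosis rises sharply at onsets where STA/LTA can be ambiguous (the window and step lengths are assumptions).

      import numpy as np
      from scipy.stats import kurtosis

      def sliding_kurtosis(x, fs, win_s=2.0, step_s=0.25):
          """Kurtosis of consecutive windows; peaks flag impulsive energy arrivals."""
          n, step = int(win_s * fs), int(step_s * fs)
          starts = np.arange(0, len(x) - n, step)
          k = np.array([kurtosis(x[i:i + n]) for i in starts])
          return (starts + n) / fs, k              # time of trailing window edge, kurtosis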

  8. Developing Vulnerability Analysis Method for Climate Change Adaptation on Agropolitan Region in Malang District

    NASA Astrophysics Data System (ADS)

    Sugiarto, Y.; Perdinan; Atmaja, T.; Wibowo, A.

    2017-03-01

    Agriculture plays a strategic role in strengthening sustainable development. Based on the agropolitan concept, the village becomes the center of economic activities by combining agriculture, agro-industry, agribusiness and tourism in a way that creates a high value-added economy. The impact of climate change on agriculture and water resources may increase the pressure on agropolitan development. An assessment method is required to measure the vulnerability of area-based communities in the agropolitan region to climate change impacts. An analysis of agropolitan vulnerability was conducted in Malang district based on four aspects, considering the availability and distribution of water as the core problem. The indicators used were vulnerability components, consisting of sensitivity and adaptive capacity, and an exposure component. The study yielded 21 indicators derived from 115 village-based data sets. The results of the vulnerability assessment showed that most of the villages were categorized at a moderate level. Around 20% of the 388 villages were categorized at a high to very high level of vulnerability due to their low level of agricultural economy. In the agropolitan region within the sub-district of Poncokusumo, the vulnerability of the villages varies between very low and very high. Most villages were vulnerable due to lower adaptive capacity, even though the levels of sensitivity and exposure of all villages were relatively similar. The existence of water resources was the biggest contributor to the high exposure of the villages in Malang district, while access to credit facilities and the source of family income were among the indicators leading to a high sensitivity component.

  9. Regional analysis of earthquake occurrence and seismic energy release

    NASA Technical Reports Server (NTRS)

    Cohen, S. C.

    1980-01-01

    The historical temporal variations in earthquake occurrence and seismic energy release on a regional basis throughout the world were studied. The regionalization scheme employed divided the world into large areas based either on seismic and tectonic considerations (the Flinn-Engdahl scheme) or on geographic (longitude and latitude) criteria. The data set is the worldwide earthquake catalog of the National Geophysical and Solar-Terrestrial Data Center. An apparent relationship exists between the maximum energy released in a limited time within a seismic region and the average or background energy per year averaged over a long time period. In terms of average or peak energy release, the most seismically active regions of the world during the 50 to 81 year period ending in 1977 were the Japanese, Andean South American, and Alaska-Aleutian Arc regions. The year-to-year fluctuations in regional seismic energy release are greater, by orders of magnitude, than the corresponding variations in worldwide seismic energy release. The b values of seismic regions range from 0.7 to 1.4 for earthquake magnitudes in the range 6.0 to 7.5.

  10. Calibrating Nonlinear Soil Material Properties for Seismic Analysis Using Soil Material Properties Intended for Linear Analysis

    SciTech Connect

    Spears, Robert Edward; Coleman, Justin Leigh

    2015-08-01

    Seismic analysis of nuclear structures is routinely performed using guidance provided in “Seismic Analysis of Safety-Related Nuclear Structures and Commentary (ASCE 4, 1998).” This document, which is currently under revision, provides detailed guidance on linear seismic soil-structure-interaction (SSI) analysis of nuclear structures. To accommodate the linear analysis, soil material properties are typically developed as shear modulus and damping ratio versus cyclic shear strain amplitude. A new Appendix in ASCE 4-2014 (draft) is being added to provide guidance for nonlinear time domain SSI analysis. To accommodate the nonlinear analysis, a more appropriate form of the soil material properties includes shear stress and energy absorbed per cycle versus shear strain. Ideally, nonlinear soil model material properties would be established with soil testing appropriate for the nonlinear constitutive model being used. However, much of the soil testing done for SSI analysis is performed for use with linear analysis techniques. Consequently, a method is described in this paper that uses soil test data intended for linear analysis to develop nonlinear soil material properties. To produce nonlinear material properties that are equivalent to the linear material properties, the linear and nonlinear model hysteresis loops are considered. For equivalent material properties, the shear stress at peak shear strain and energy absorbed per cycle should match when comparing the linear and nonlinear model hysteresis loops. Consequently, nonlinear material properties are selected based on these criteria.
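
    To illustrate the equivalence criterion stated above (matching shear stress at peak strain and energy absorbed per cycle), here is a minimal sketch that extracts the secant shear modulus and damping ratio from one closed hysteresis loop; the function name and synthetic loop data are assumptions.

      import numpy as np

      def loop_properties(strain, stress):
          """Secant modulus and damping ratio of a closed stress-strain loop."""
          g_max = strain.max()
          G_sec = stress[np.argmax(strain)] / g_max    # shear stress at peak strain
          # energy absorbed per cycle = enclosed loop area (shoelace formula)
          W_d = 0.5 * abs(np.sum(strain * np.roll(stress, -1)
                                 - stress * np.roll(strain, -1)))
          W_s = 0.5 * G_sec * g_max**2                 # equivalent elastic strain energy
          return G_sec, W_d / (4.0 * np.pi * W_s)      # damping ratio = Wd / (4*pi*Ws)

      theta = np.linspace(0, 2 * np.pi, 361)[:-1]      # synthetic elliptical loop
      G_sec, xi = loop_properties(1e-3 * np.cos(theta),
                                  50e3 * np.cos(theta) + 5e3 * np.sin(theta))
      # for this loop: G_sec = 5e7 Pa and xi = 0.05 (5% damping)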

  11. Sampling and Analysis Plan Waste Treatment Plant Seismic Boreholes Project.

    SciTech Connect

    Brouns, Thomas M.

    2007-07-15

    This sampling and analysis plan (SAP) describes planned data collection activities for four entry boreholes through the sediment overlying the Saddle Mountains Basalt, up to three new deep rotary boreholes through the Saddle Mountains Basalt and sedimentary interbeds, and one corehole through the Saddle Mountains Basalt and sedimentary interbeds at the Waste Treatment Plant (WTP) site. The SAP will be used in concert with the quality assurance plan for the project to guide the procedure development and data collection activities needed to support borehole drilling, geophysical measurements, and sampling. This SAP identifies the American Society for Testing and Materials standards, Hanford Site procedures, and other guidance to be followed for data collection activities. Revision 3 incorporates all interim change notices (ICNs) issued against Revision 2 prior to the completion of sampling and analysis activities for the WTP Seismic Boreholes Project. This revision also incorporates changes to the exact number of samples submitted for dynamic testing, as directed by the U.S. Army Corps of Engineers. Revision 3 represents the final version of the SAP.

  12. Analysis of induced seismicity in geothermal reservoirs – An overview

    USGS Publications Warehouse

    Zang, Arno; Oye, Volker; Jousset, Philippe; Deichmann, Nicholas; Gritto, Roland; McGarr, Arthur F.; Majer, Ernest; Bruhn, David

    2014-01-01

    In this overview we report the results of analysing induced seismicity in geothermal reservoirs in various tectonic settings within the framework of the European Geothermal Engineering Integrating Mitigation of Induced Seismicity in Reservoirs (GEISER) project. In the reconnaissance phase of a field, subsurface fault mapping, the in situ stress and the seismic network are of primary interest in helping to assess the geothermal resource. The hypocentres of the observed seismic events (the seismic cloud) depend on the design of the installed network, the velocity model used and the location technique applied. During the stimulation phase, attention turns to reservoir hydraulics (e.g., fluid pressure, injection volume) and its relation to larger-magnitude seismic events, their source characteristics and their occurrence in space and time. A change in the isotropic components of the full waveform moment tensor is observed for events close to the injection well (tensile character) as compared to events further away from it (shear character). Tensile events coincide with high Gutenberg-Richter b-values and low Brune stress drop values. The stress regime in the reservoir controls the direction of fracture growth at depth, as indicated by the extent of the detected seismic cloud. Stress magnitudes are important in multiple stimulations of wells, where little or no seismicity is observed until the previous maximum stress level is exceeded (the Kaiser effect). Prior to drilling, obtaining a 3D P-wave (Vp) and S-wave (Vs) velocity model down to reservoir depth is recommended. In the stimulation phase, we recommend monitoring and locating seismicity with high (decametre) precision in real time and performing local 4D tomography for the velocity ratio (Vp/Vs). During exploitation, one should use observed and modelled induced seismicity to forward-estimate seismic hazard so that field operators are in a position to adjust well hydraulics (rate and volume of the

  13. Seismic data interpretation using the Hough transform and principal component analysis

    NASA Astrophysics Data System (ADS)

    Orozco-del-Castillo, M. G.; Ortiz-Alemán, C.; Martin, R.; Ávila-Carrera, R.; Rodríguez-Castellanos, A.

    2011-03-01

    In this work two novel image processing techniques are applied to detect and delineate complex salt bodies from seismic exploration profiles: the Hough transform and principal component analysis (PCA). It is well recognized by the geophysical community that the lack of resolution and poor structural identification in seismic data recorded at sub-salt plays represent severe technical and economic problems. Under such circumstances, seismic interpretation based only on the human eye is inaccurate. Additionally, petroleum field development decisions and production planning depend on good-quality seismic images that generally are not feasible in salt tectonics areas. To address this, morphological erosion, region growing and, especially, a generalization of the Hough transform (closely related to the Radon transform) are applied to build parabolic shapes that are useful in the idealization and recognition of salt domes from 2D seismic profiles. In a similar way, PCA is also used to identify shapes associated with complex salt bodies in seismic profiles extracted from 3D seismic data. To show the validity of the new set of seismic results, comparisons between both image processing techniques are exhibited. The main contribution of this work is to provide seismic interpreters with new semi-automatic computational tools. The novel image processing approaches presented here may be helpful in the identification of diapirs and other complex geological features from seismic images. Conceivably, in the near future, a new branch of seismic attributes could be recognized by geoscientists and engineers based on the encouraging results reported here.
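
    As a rough illustration of how PCA can be applied to seismic image data (a sketch under stated assumptions, not the authors' implementation; the patch size and component count are arbitrary choices):

    ```python
    # Illustrative sketch: PCA of seismic image patches, assuming a 2-D
    # amplitude section `section` as a NumPy array.
    import numpy as np

    def patch_pca(section, patch=16, n_components=8):
        """Decompose a seismic section into patches and return their PCA scores."""
        rows = section.shape[0] // patch
        cols = section.shape[1] // patch
        patches = np.stack([
            section[i*patch:(i+1)*patch, j*patch:(j+1)*patch].ravel()
            for i in range(rows) for j in range(cols)
        ])
        centered = patches - patches.mean(axis=0)
        # SVD of the centered patch matrix gives the principal components.
        _, _, vt = np.linalg.svd(centered, full_matrices=False)
        scores = centered @ vt[:n_components].T   # projection onto leading PCs
        return scores.reshape(rows, cols, n_components)

    # Patches whose scores depart strongly from the background texture are
    # candidate salt-body regions (crude proxy for the delineation idea).
    demo = np.random.default_rng(0).standard_normal((128, 256))
    print(patch_pca(demo).shape)   # (8, 16, 8)
    ```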

  14. Mechanical vulnerability of lower second premolar utilising visco-elastic dynamic stress analysis.

    PubMed

    Khani, M M; Tafazzoli-Shadpour, M; Aghajani, F; Naderi, P

    2009-10-01

    Stress analysis determines the vulnerability of dental tissues to external loads. Stress values depend on loading conditions, mechanical properties and constraints of the structural components. Critical stress levels lead to tissue damage. The aim of this study is to analyse the dynamic stress distribution of the lower second premolar due to physiological cyclic loading, and the dependence of pulsatile stress characteristics on the visco-elastic properties of the dental components, by finite element modelling. Results show that visco-elasticity markedly influences stress determinants in major anatomical sites including the dentin, cementum-enamel and dentin-enamel junctions. Reduction of the visco-elastic parameter leads to mechanical vulnerability through elevation of the stress pulse amplitude and maximum stress value, and reduction of the stress phase shift as a determinant of stress wave propagation. The results may be applied in situations in which visco-elasticity is reduced, such as root canal therapy and post-and-core restoration, in which teeth are more vulnerable to fracture.
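
    To make the quantities concrete, a minimal single-element sketch: for a Kelvin-Voigt solid under sinusoidal strain, the stress amplitude and the stress-strain phase shift follow directly from the elastic modulus and the viscous parameter. This is far simpler than the paper's finite element model, and all values are hypothetical:

    ```python
    # Kelvin-Voigt element under eps(t) = eps0*sin(w*t), where
    # sigma = E*eps + eta*deps/dt. Demonstration numbers only.
    import numpy as np

    E = 18.6e9            # elastic modulus, Pa (hypothetical dentin-like value)
    eps0 = 1e-4           # strain amplitude
    w = 2 * np.pi * 1.2   # chewing-like loading frequency, rad/s

    for eta in (1e9, 1e8, 1e7):   # decreasing viscous parameter, Pa*s
        amp = eps0 * np.hypot(E, eta * w)           # stress pulse amplitude, Pa
        phase = np.degrees(np.arctan2(eta * w, E))  # stress-strain phase shift
        print(f"eta={eta:.0e}: amplitude={amp/1e6:.3f} MPa, phase={phase:.3f} deg")
    ```

    Note how the phase shift shrinks as the viscous parameter is reduced, the trend the abstract associates with increased vulnerability.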

  15. Seismic facies analysis based on self-organizing map and empirical mode decomposition

    NASA Astrophysics Data System (ADS)

    Du, Hao-kun; Cao, Jun-xing; Xue, Ya-juan; Wang, Xing-jian

    2015-01-01

    Seismic facies analysis plays an important role in seismic interpretation and reservoir model building by offering an effective way to identify changes in geofacies between wells. The selection of input seismic attributes and their time window has a strong effect on the validity of classification and requires iterative experimentation and prior knowledge. In general, cluster analysis is sensitive to noise when the waveform itself serves as the input data, especially with a narrow window. To overcome this limitation, the Empirical Mode Decomposition (EMD) method is introduced into waveform classification based on the self-organizing map (SOM). We first de-noise the seismic data using EMD and then cluster the data using a 1D grid SOM. The main advantages of this method are resolution enhancement and noise reduction. 3D seismic data from the western Sichuan basin, China, are used for validation. The application results show that seismic facies analysis can be improved and can better support interpretation. Its strong tolerance for noise makes the proposed method a better seismic facies analysis tool than the classical 1D grid SOM method, especially for waveform clustering with a narrow window.
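
    A minimal sketch of the described workflow, assuming the third-party PyEMD (pip package EMD-signal) and MiniSom packages; the trace array, window length and class count are placeholders:

    ```python
    import numpy as np
    from PyEMD import EMD
    from minisom import MiniSom

    rng = np.random.default_rng(1)
    traces = rng.standard_normal((200, 64))   # 200 windowed waveforms (placeholder)

    # 1) EMD de-noising: drop the first IMF, which usually carries the
    #    highest-frequency noise, and keep the sum of the remaining IMFs.
    emd = EMD()
    denoised = np.array([emd(tr)[1:].sum(axis=0) for tr in traces])

    # 2) 1-D grid SOM waveform classification into, say, 6 facies classes.
    som = MiniSom(1, 6, denoised.shape[1], sigma=0.8, learning_rate=0.5,
                  random_seed=1)
    som.train_random(denoised, 5000)
    facies = np.array([som.winner(tr)[1] for tr in denoised])
    print(np.bincount(facies))   # number of traces assigned to each class
    ```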

  16. Analysis of the induced seismicity of the Lacq gas field (Southwestern France) and model of deformation

    NASA Astrophysics Data System (ADS)

    Bardainne, T.; Dubos-Sallée, N.; Sénéchal, G.; Gaillot, P.; Perroud, H.

    2008-03-01

    The goal of this paper is to propose a model of the deformation pattern of the Lacq gas field (southwest France), considering the temporal and spatial evolution of the observed induced seismicity. This model of deformation has been determined from updated earthquake locations and from theoretical and analogue models usually accepted for hydrocarbon field deformation. The Lacq seismicity is clearly not linked to the natural seismicity of the Pyrenean range recorded 30 km farther to the south, since the first event was felt in 1969, after the beginning of hydrocarbon recovery. From 1974 to 1997, more than 2000 local events (ML < 4.2) were recorded by two permanent local seismic networks. Unlike previously published results focusing on limited time-lapse studies, our analysis relies on the data from 1974 to 1997. Greater accuracy of the absolute locations has been obtained using a well-adapted 3-D location algorithm, after improvement of the 3-D P-wave velocity model and determination of specific station corrections for different clusters of events. This updated catalogue of seismicity has been interpreted taking into account the structural context of the gas field. The Lacq gas field is an anticlinal reservoir where 3-D seismic and borehole data reveal a high density of fracturing, mainly oriented WNW-ESE. The seismicity map and vertical cross-sections show that the majority of the seismic events (70 per cent) occurred above the gas reservoir. A correlation is also observed between the orientation of the pre-existing faults and the location of the seismic activity. Strong and organized seismicity occurred where the fault orientation is consistent with the poroelastic stress perturbation due to gas recovery. On the contrary, the seismicity is quiescent where the isobaths of the reservoir roof are close to perpendicular to the faults. These quiescent areas as well as the central seismic part are characterized by a surface subsidence

  17. The shallow elastic structure of the lunar crust: New insights from seismic wavefield gradient analysis

    NASA Astrophysics Data System (ADS)

    Sollberger, David; Schmelzbach, Cedric; Robertsson, Johan O. A.; Greenhalgh, Stewart A.; Nakamura, Yosio; Khan, Amir

    2016-10-01

    Enigmatic lunar seismograms recorded during the Apollo 17 mission in 1972 have so far precluded the identification of shear-wave arrivals and hence the construction of a comprehensive elastic model of the shallow lunar subsurface. Here, for the first time, we extract shear-wave information from the Apollo active seismic data using a novel waveform analysis technique based on spatial seismic wavefield gradients. The star-like recording geometry of the active seismic experiment lends itself surprisingly well to computing spatial wavefield gradients and rotational ground motion as a function of time. These observables, which are new to seismic exploration in general, allowed us to identify shear waves in the complex lunar seismograms and to derive a new model of seismic compressional and shear-wave velocities in the shallow lunar crust, critical to understanding its lithology and constitution, and its impact on other geophysical investigations of the Moon's deep interior.
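
    The core idea, estimating spatial wavefield gradients from a small array, can be sketched as a least-squares plane fit at each time sample (synthetic geometry and wavefield; not the authors' processing code):

    ```python
    import numpy as np

    # Station offsets from the array centre (m), e.g. a star-like layout.
    xy = np.array([[0.0, 0.0], [45.7, 0.0], [-22.8, 39.6], [-22.8, -39.6]])
    nt = 500

    # Synthetic plane wave u(x, t) crossing the array (placeholder wavefield).
    slow = np.array([2.0e-3, 1.0e-3])      # horizontal slowness, s/m
    t = np.arange(nt) * 0.01
    u = np.sin(2*np.pi*4.0*(t[None, :] - xy @ slow[:, None]))   # (4, nt)

    # Least-squares fit u_i(t) ~ u0(t) + x_i*dudx(t) + y_i*dudy(t) per sample.
    G = np.column_stack([np.ones(len(xy)), xy])     # (4, 3) design matrix
    coef, *_ = np.linalg.lstsq(G, u, rcond=None)    # rows: u0, du/dx, du/dy
    dudx, dudy = coef[1], coef[2]
    print(dudx[:3], dudy[:3])   # time series of the two horizontal gradients
    ```

    With multi-component records, the same fit applied to the horizontal components yields rotational motion, e.g. the vertical-axis rotation from the difference of cross-gradients.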

  18. A quantitative analysis of global intermediate and deep seismicity

    NASA Astrophysics Data System (ADS)

    Ruscic, Marija; Becker, Dirk; Le Pourhiet, Laetitita; Agard, Philippe; Meier, Thomas

    2017-04-01

    The seismic activity in subduction zones around the world shows a large spatial variability, with some regions exhibiting strong seismic activity down to depths of almost 700 km while in other places seismicity terminates at depths of about 200 or 300 km. The decay of the number of seismic events, or of the seismic moment, with depth is also more pronounced in some regions than in others. The same is true for the ratio of large to small events (the b-value of the Gutenberg-Richter relation), which varies with depth. These observations are often linked to parameters of the downgoing plate such as age or subduction velocity. In this study we investigate a subset of subduction zones utilizing the revised ISC catalogue of intermediate and deep seismicity to determine statistical parameters well suited to describing properties of intermediate-depth and deep events. The seismicity is separated into three depth intervals, 50-175 km, 175-400 km and >400 km, based on the depth at which the plate contact decouples, the observed nearly exponential decay of the event rate with depth, and the supposed phase transition at 410 km depth, where an increase of the event number with depth is also observed. For estimation of the b-value and the exponential decay with depth, restricting the investigated time interval to the period after 1997 produced significantly better results, indicating a globally homogeneous magnitude scale with a magnitude of completeness of about Mw 5. On a global scale the b-value decreases with depth from values of about 1 at 50-175 km to values of slightly below 0.8 for events below 400 km. There is also a slight increase of the b-value with the age of the subducting plate. These changes in the b-value with depth and with age may indicate a varying fragmentation of the slab. With respect to the ratio of the seismic moment between deeper and shallower parts of the subduction zones, a dependence on age is apparent with older slabs
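
    For reference, the standard Aki/Utsu maximum-likelihood b-value estimator used in this kind of analysis can be sketched as follows (synthetic catalogue, not the ISC data):

    ```python
    import numpy as np

    def b_value(mags, mc, dm=0.1):
        """Aki/Utsu maximum-likelihood b-value for magnitudes >= mc.

        dm is the catalogue's magnitude binning width (0.1 is typical);
        use dm=0.0 for continuous magnitudes like the sample below.
        """
        m = mags[mags >= mc]
        return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))

    rng = np.random.default_rng(3)
    # Synthetic Gutenberg-Richter sample with true b = 0.8 above Mc = 5.0:
    # magnitudes above Mc are exponential with scale log10(e)/b.
    mags = 5.0 + rng.exponential(scale=np.log10(np.e) / 0.8, size=4000)
    print(f"b = {b_value(mags, mc=5.0, dm=0.0):.2f}")   # close to 0.8
    ```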

  19. Discrimination of porosity and fluid saturation using seismic velocity analysis

    DOEpatents

    Berryman, James G.

    2001-01-01

    The method of the invention is employed for determining the state of saturation in a subterranean formation using only seismic velocity measurements (e.g., shear and compressional wave velocity data). Seismic velocity data collected from a region of the formation of like solid material properties can provide relatively accurate partial saturation data derived from a well-defined triangle plotted in a (ρ/μ, λ/μ)-plane. When the seismic velocity data are collected over a large region of a formation having both like and unlike materials, the method first distinguishes the like materials by initially plotting the seismic velocity data in a (ρ/λ, μ/λ)-plane to determine regions of the formation having like solid material properties and porosity.
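
    The mapping from measured velocities to the patent's cross-plot coordinates can be sketched as follows; density cancels in the plotted ratios, so they follow from velocities alone (illustrative values, not field data):

    ```python
    import numpy as np

    vp = np.array([2500.0, 2300.0, 2100.0])   # P-wave velocities, m/s
    vs = np.array([1200.0, 1150.0, 1000.0])   # S-wave velocities, m/s

    # Isotropic elastic medium: mu = rho*vs^2, lam = rho*(vp^2 - 2*vs^2),
    # so density cancels in every ratio below.
    rho_over_lam = 1.0 / (vp**2 - 2 * vs**2)
    mu_over_lam = vs**2 / (vp**2 - 2 * vs**2)
    rho_over_mu = 1.0 / vs**2
    lam_over_mu = 1.0 / mu_over_lam

    # Like solid materials cluster in the (rho/lam, mu/lam) plane; saturation
    # state is then read off the (rho/mu, lam/mu) plane.
    print(np.column_stack([rho_over_lam, mu_over_lam]))
    print(np.column_stack([rho_over_mu, lam_over_mu]))
    ```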

  20. Shallow seismic surface waves analysis across a tectonic fault

    NASA Astrophysics Data System (ADS)

    Gazdova, R.; Vilhelm, J.; Kolinsky, P.

    2011-12-01

    When performing a seismic survey of a shallow medium, we record wave motion that can be excited by a sledgehammer blow on the ground surface. The recorded wave motion is a complex combination of different types of waves, propagating directly from the source to the receiver, reflecting from velocity boundaries, passing through multiple layers or forming dispersive surface waves. All of these wave types can be used to identify the structure of the medium. In the present contribution we deal with the interpretation of surface waves. In contrast to body waves, surface-wave velocity is frequency-dependent. This property is called dispersion, and the dependence of the velocity on frequency is known as the dispersion curve. The measured dispersion of the surface waves can be used to assess the velocity distribution in the layered medium through which the waves propagate. We analyze surface waves recorded within the geophysical survey of the paleoseismological trench site over the Hluboka tectonic fault, Czech Republic, Central Europe. Surface waves in the frequency range 15-70 Hz were recorded by three-component geophones with an active (sledgehammer) source. Group velocities are analyzed with the program SVAL, which is based on the multiple filtering technique, a standard Fourier-transform-based frequency-time analysis. The spectrum of each record is multiplied by weighting functions centered at many discrete frequencies. Five local envelope maxima of all quasiharmonic components obtained by the inverse Fourier transform are found and their propagation times determined. These maxima are assigned to different modes of direct surface waves as well as to possible reflected, converted and multipathed modes. Filtered fundamental modes at pairs of geophones are correlated and phase velocities of surface waves are computed from the delays of the propagation times of all quasiharmonic components. From the dispersion curves the shear wave
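
    A simplified sketch of the multiple filtering idea (in the spirit of the frequency-time analysis described above, not the SVAL code): narrow-band Gaussian filters plus Hilbert envelopes give group arrival times and hence group velocities.

    ```python
    import numpy as np
    from scipy.signal import hilbert

    def group_velocities(trace, dt, distance, freqs, alpha=50.0):
        """Group velocity at each centre frequency from the envelope maximum."""
        n = len(trace)
        spec = np.fft.rfft(trace)
        f = np.fft.rfftfreq(n, dt)
        t = np.arange(n) * dt
        vel = []
        for fc in freqs:
            gauss = np.exp(-alpha * ((f - fc) / fc) ** 2)   # narrow-band window
            narrow = np.fft.irfft(spec * gauss, n)
            env = np.abs(hilbert(narrow))                   # group-energy envelope
            t_arr = t[np.argmax(env)]                       # envelope maximum
            vel.append(distance / t_arr if t_arr > 0 else np.nan)
        return np.array(vel)

    # Usage with a synthetic pulse (placeholder values): 1 kHz sampling, 30 m offset.
    dt, dist = 0.001, 30.0
    t = np.arange(2000) * dt
    trace = np.sin(2*np.pi*30*t) * np.exp(-((t - 0.15) / 0.05)**2)
    print(group_velocities(trace, dt, dist, freqs=[20, 30, 40, 50]))
    ```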

  1. MOBB: Data Analysis from an Ocean Floor Broadband Seismic Observatory

    NASA Astrophysics Data System (ADS)

    Uhrhammer, R. A.; Dolenc, D.; Romanowicz, B.; Stakes, D.; McGill, P.; Neuhauser, D.; Ramirez, T.

    2003-12-01

    MOBB (Monterey bay Ocean floor Broad Band project) is a collaborative project between the Monterey Bay Aquarium Research Institute (MBARI) and the Berkeley Seismological Laboratory (BSL). Its goal is to install and operate a permanent seafloor broadband station as a first step towards extending the on-shore broadband seismic network in northern California to the seaward side of the North America/Pacific plate boundary, providing improved azimuthal coverage for regional earthquake and structure studies. The MOBB station was installed on the seafloor in Monterey Bay, 40 km offshore and at a depth of 1000 m below the sea surface, in April 2002, and is completely buried beneath the seafloor. The installation made use of MBARI's Point Lobos ship and ROV Ventana, and the station currently records data autonomously. Dives are scheduled regularly (about every three months) to recover and replace the recording and battery packages. Some data were lost in the first half of 2003 due to hardware and software problems in the recording system. The ocean-bottom MOBB station currently comprises a three-component seismometer package (Guralp CMG-1T), a current meter, a digital pressure gauge (DPG), and recording and battery packages. The seismometer package is mounted on a cylindrical titanium pressure vessel 54 cm in height and 41 cm in diameter, custom built by the MBARI team and outfitted for underwater connection. Since the background noise in the near-shore ocean-floor environment is high in the passband of interest for the study of regional and teleseismic signals, an important focus of this project is to develop methods to increase signal-to-noise ratios a posteriori, by deconvolving the contributions from various sources of noise. We present results involving analysis of the correlation of background noise with tide, ocean current and pressure records, combining data from MOBB and regional land-based stations of the Berkeley Digital Seismic Network (BDSN). We also present preliminary
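
    One ingredient of such noise analysis, the coherence between the seismic and pressure channels, can be sketched with SciPy (synthetic stand-in series, not the BSL processing):

    ```python
    import numpy as np
    from scipy.signal import coherence

    fs = 1.0                                  # 1 Hz sampling (illustrative)
    rng = np.random.default_rng(4)
    n = 86400                                 # one day of samples
    pressure = rng.standard_normal(n)

    # Fake seismic noise: partly pressure-driven plus independent noise.
    seismic = 0.6 * pressure + rng.standard_normal(n)

    f, coh = coherence(seismic, pressure, fs=fs, nperseg=4096)
    # Bands with high coherence are candidates for deconvolving the pressure
    # contribution from the seismic channel before further analysis.
    print(f[np.argmax(coh)], coh.max())
    ```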

  2. Seismicity monitoring by cluster analysis of moment tensors

    NASA Astrophysics Data System (ADS)

    Cesca, Simone; Şen, Ali Tolga; Dahm, Torsten

    2014-03-01

    We suggest a new clustering approach to classify focal mechanisms from large moment tensor catalogues, with the purpose of automatically identifying families of earthquakes with similar source geometry, recognizing the orientation of the most active faults, and detecting temporal variations of the rupture processes. The approach differs from waveform-similarity methods in that clusters are detected even if their events are separated by large distances. This approach is particularly helpful for analysing large moment tensor catalogues, as in microseismicity applications, where manual analysis and classification is not feasible. A flexible algorithm is proposed here: it can handle different metrics, norms, and focal mechanism representations. In particular, the method can handle full moment tensor or constrained source model catalogues, for which different metrics are suggested. The method can account for variable uncertainties of different moment tensor components. We verify the method with synthetic catalogues. An application to real data from mining-induced seismicity illustrates possible applications of the method and demonstrates the cluster detection and event classification performance with different moment tensor catalogues. The results show that the main earthquake source types occur on spatially separated faults, and that temporal changes in the number and characterization of focal mechanism clusters are detected. We suggest that moment tensor clustering can help assess time-dependent hazard in mines.
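
    A minimal sketch of the clustering idea, using one possible metric (Frobenius distance between normalized tensors) and standard hierarchical clustering; this is an illustration, not the authors' algorithm:

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage
    from scipy.spatial.distance import squareform

    def mt_distance(m1, m2):
        """Frobenius distance between unit-norm moment tensors (one possible metric)."""
        m1 = m1 / np.linalg.norm(m1)
        m2 = m2 / np.linalg.norm(m2)
        return np.linalg.norm(m1 - m2)

    # Toy catalogue: two similar strike-slip tensors and one dip-slip-like tensor.
    mts = [np.array([[0, 1, 0], [1, 0, 0], [0, 0, 0]], float),
           np.array([[0.1, 1, 0], [1, -0.1, 0], [0, 0, 0]], float),
           np.array([[1, 0, 0], [0, -1, 0], [0, 0, 0]], float)]

    n = len(mts)
    dist = np.array([[mt_distance(mts[i], mts[j]) for j in range(n)]
                     for i in range(n)])
    labels = fcluster(linkage(squareform(dist), method='average'),
                      t=0.5, criterion='distance')
    print(labels)   # the first two events should share a cluster
    ```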

  3. Seismic Analysis Issues in Design Certification Applications for New Reactors

    SciTech Connect

    Miranda, M.; Morante, R.; Xu, J.

    2011-07-17

    The licensing framework established by the U.S. Nuclear Regulatory Commission under Title 10 of the Code of Federal Regulations (10 CFR) Part 52, “Licenses, Certifications, and Approvals for Nuclear Power Plants,” provides requirements for standard design certifications (DCs) and combined license (COL) applications. The intent of this process is the early resolution of safety issues at the DC application stage. Subsequent COL applications may incorporate a DC by reference. Thus, the COL review will not reconsider safety issues resolved during the DC process. However, a COL application that incorporates a DC by reference must demonstrate that relevant site-specific design parameters are within the bounds postulated by the DC, and any departures from the DC need to be justified. This paper provides an overview of several seismic analysis issues encountered during a review of recent DC applications under the 10 CFR Part 52 process, in which the authors have participated as part of the safety review effort.

  4. Pre-stack-texture-based reservoir characteristics and seismic facies analysis

    NASA Astrophysics Data System (ADS)

    Song, Cheng-Yun; Liu, Zhi-Ning; Cai, Han-Peng; Qian, Feng; Hu, Guang-Min

    2016-03-01

    Seismic texture attributes are closely related to seismic facies and reservoir characteristics and are thus widely used in seismic data interpretation. However, information is lost in the stacking process when traditional texture attributes are extracted from post-stack data, which is detrimental to complex reservoir description. In this study, pre-stack texture attributes are introduced; these attributes not only precisely depict the lateral continuity of waveforms between different reflection points but also reflect amplitude versus offset, anisotropy, and heterogeneity in the medium. Owing to their strong ability to represent stratigraphy, a pre-stack-data-based seismic facies analysis method is proposed using the self-organizing map algorithm. This method is tested on wide-azimuth seismic data from China, and the advantages of pre-stack texture attributes in describing lateral changes in strata are verified, in addition to the method's ability to reveal anisotropy and heterogeneity characteristics. The pre-stack texture classification results effectively distinguish different seismic reflection patterns, thereby providing reliable evidence for use in seismic facies analysis.
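
    For contrast, conventional (post-stack style) texture attributes are typically computed from grey-level co-occurrence matrices; a sketch assuming scikit-image >= 0.19 (for the graycomatrix/graycoprops names) on a quantized amplitude window:

    ```python
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    rng = np.random.default_rng(5)
    # Quantized seismic amplitude window (placeholder): 8-bit grey levels.
    window = (rng.random((32, 32)) * 255).astype(np.uint8)

    glcm = graycomatrix(window, distances=[1], angles=[0, np.pi/2],
                        levels=256, symmetric=True, normed=True)
    for prop in ("contrast", "homogeneity", "energy", "correlation"):
        print(prop, graycoprops(glcm, prop).ravel())
    ```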

  5. Seismic analysis of the large 70-meter antenna. Part 2: General dynamic response and a seismic safety check

    NASA Technical Reports Server (NTRS)

    Kiedron, K.; Chian, C. T.

    1985-01-01

    An extensive dynamic analysis of the new JPL 70-meter antenna structure is presented. Analytical procedures are based on normal mode decomposition, including damping and special forcing functions. The dynamic response can be obtained for any arbitrarily selected point on the structure. A new computer program for computing the time-dependent resultant structural displacement, summing the effects of all participating modes, was also developed. Program compatibility with the natural frequency analysis output was verified. The program was applied to the JPL 70-meter antenna structure and the dynamic response for several specially selected points was computed. Seismic analysis of structures, a special application of the general dynamic analysis, is also based on normal mode decomposition. Strength specification of the antenna with respect to earthquake excitation is done using common response spectra. The results indicate a basically safe design under an assumed 5% or greater damping coefficient. However, for the antenna located at Goldstone, with its more active seismic environment, this study strongly recommends an experimental program to determine the true damping coefficient for a more reliable safety check.
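
    The modal-superposition idea can be sketched as follows: each mode behaves as a damped single-degree-of-freedom oscillator, integrated here with the Newmark average-acceleration scheme, and modal contributions are summed at the selected point (hypothetical modes; not the JPL program):

    ```python
    import numpy as np

    def sdof_newmark(w, zeta, ag, dt):
        """Relative displacement of a damped SDOF oscillator under base accel ag."""
        n = len(ag)
        u = np.zeros(n); v = np.zeros(n); a = np.zeros(n)
        a[0] = -ag[0]
        k_eff = w**2 + 2*zeta*w*(2/dt) + 4/dt**2   # gamma=1/2, beta=1/4
        for i in range(n - 1):
            rhs = (-ag[i+1]
                   + (4/dt**2)*u[i] + (4/dt)*v[i] + a[i]
                   + 2*zeta*w*((2/dt)*u[i] + v[i]))
            u[i+1] = rhs / k_eff
            v[i+1] = (2/dt)*(u[i+1] - u[i]) - v[i]
            a[i+1] = (4/dt**2)*(u[i+1] - u[i]) - (4/dt)*v[i] - a[i]
        return u

    # Two hypothetical modes: (circular frequency, damping, participation x shape).
    modes = [(2*np.pi*1.5, 0.05, 1.2), (2*np.pi*4.0, 0.05, -0.4)]
    dt = 0.01
    ag = 0.5 * np.sin(2*np.pi*1.5*np.arange(0, 10, dt))   # toy base acceleration
    u_point = sum(c * sdof_newmark(w, z, ag, dt) for w, z, c in modes)
    print(np.abs(u_point).max())   # peak displacement at the selected point
    ```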

  6. Seismic Fragility Analysis of a Condensate Storage Tank with Age-Related Degradations

    SciTech Connect

    Nie, J.; Braverman, J.; Hofmayer, C; Choun, Y-S; Kim, MK; Choi, I-K

    2011-04-01

    The Korea Atomic Energy Research Institute (KAERI) is conducting a five-year research project to develop a realistic seismic risk evaluation system which includes consideration of the aging of structures and components in nuclear power plants (NPPs). The KAERI research project includes three specific areas that are essential to seismic probabilistic risk assessment (PRA): (1) probabilistic seismic hazard analysis, (2) seismic fragility analysis including the effects of aging, and (3) plant seismic risk analysis. In 2007, Brookhaven National Laboratory (BNL) entered into a collaboration agreement with KAERI to support its development of seismic capability evaluation technology for degraded structures and components. The collaborative research effort is intended to continue over a five-year period. The goal of this collaborative endeavor is to assist KAERI in developing seismic fragility analysis methods that consider the potential effects of age-related degradation of structures, systems, and components (SSCs). The research results of this multi-year collaboration will be utilized as input to seismic PRAs. This report describes the research effort performed by BNL for the Year 4 scope of work, and was developed as an update to the Year 3 report by incorporating a major supplement to the Year 3 fragility analysis. In the Year 4 research scope, an additional study was carried out to consider a further degradation scenario, in which the three basic degradation scenarios, i.e., degraded tank shell, degraded anchor bolts, and cracked anchorage concrete, are combined with imperfect correlation. A representative operational water level is used for this effort. Building on the same conservative deterministic failure margin (CDFM) procedure implemented for the Year 3 tasks, a simulation method using optimum Latin Hypercube samples was applied to characterize the deterioration of the fragility capacity as a function of age-related degradation. The results are summarized in Section 5
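
    The basic fragility bookkeeping can be sketched with a lognormal fragility curve whose median capacity is reduced by a degradation factor; the numbers are hypothetical and this is not the BNL/KAERI CDFM implementation:

    ```python
    import numpy as np
    from scipy.stats import norm

    def fragility(pga, am, beta):
        """P(failure | PGA) for a lognormal fragility with median am, log-std beta."""
        return norm.cdf(np.log(pga / am) / beta)

    pga = np.linspace(0.05, 2.0, 5)     # peak ground acceleration, g
    am_pristine = 0.9                   # median capacity, g (made up)
    beta = 0.4                          # composite log-standard deviation

    for degradation in (0.0, 0.2, 0.4):    # fractional capacity loss with age
        am = am_pristine * (1.0 - degradation)
        print(degradation, np.round(fragility(pga, am, beta), 3))
    ```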

  7. Seismic detection and analysis of icequakes at Columbia Glacier, Alaska

    USGS Publications Warehouse

    O'Neel, Shad; Marshall, Hans P.; McNamara, Daniel E.; Pfeffer, William Tad

    2007-01-01

    Contributions to sea level rise from rapidly retreating marine-terminating glaciers are large and increasing. Strong increases in iceberg calving occur during retreat, allowing mass transfer to the ocean at a much higher rate than is possible through surface melt alone. To study this process, we deployed an 11-sensor passive seismic network at Columbia Glacier, Alaska, during 2004-2005. We show that calving events generate narrow-band seismic signals, allowing frequency-domain detection. Detection parameters were determined using direct observations of calving and validated using three statistical methods and hypocenter locations. The 1-3 Hz detections provide a good measure of the temporal distribution and size of calving events. Possible source mechanisms for the unique waveforms are discussed, and we analyze potential forcings for the observed seismicity.
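
    A band-limited envelope detector in the spirit of the 1-3 Hz frequency-domain detections can be sketched as follows (placeholder parameters, not the study's calibrated values):

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    def detect_icequakes(trace, fs, thresh=4.0):
        """Return sample indices where the 1-3 Hz envelope exceeds a MAD threshold."""
        b, a = butter(4, [1.0, 3.0], btype="bandpass", fs=fs)
        band = filtfilt(b, a, trace)
        env = np.abs(hilbert(band))
        mad = np.median(np.abs(env - np.median(env)))
        return np.flatnonzero(env > np.median(env) + thresh * mad)

    fs = 100.0
    rng = np.random.default_rng(6)
    trace = rng.standard_normal(60 * int(fs))
    t = np.arange(len(trace)) / fs
    trace += 8.0 * np.sin(2*np.pi*2.0*t) * np.exp(-((t - 30) / 1.5)**2)  # "event"
    hits = detect_icequakes(trace, fs)
    if hits.size:
        print(hits.min() / fs, hits.max() / fs)   # detection window, seconds
    ```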

  8. Seismic analysis of series isolation system based on geometry nonlinearity

    NASA Astrophysics Data System (ADS)

    Lin, Z. D.; Shi, H.; Xue, L.

    2017-08-01

    For a system consisting of a rubber bearing connected in series with a column, a mathematical model of the series isolation system based on geometric nonlinearity is investigated using Hamilton's principle. The effects of axial pressure and of different column sizes on the seismic response of the series isolation system are discussed. The dynamic model of the series isolation system based on geometric nonlinearity is established considering cross-section rotation and the influence of shear deformation and axial pressure. The differential quadrature element method is employed to discretize the governing equations and boundary conditions. The seismic response of the series isolation system subjected to far-field ground motions is solved numerically. Results show that the slenderness ratio of the cantilever column significantly affects the seismic response of the isolation system under far-field ground motions, particularly the response of the cantilever column itself.

  9. Anisotropic P-wave velocity analysis and seismic imaging in onshore Kutch sedimentary basin of India

    NASA Astrophysics Data System (ADS)

    Behera, Laxmidhar; Khare, Prakash; Sarkar, Dipankar

    2011-08-01

    Long-offset P-wave seismic reflection data exhibit observable non-hyperbolic moveout, which depends on two parameters: the normal-moveout velocity (Vnmo) and the anisotropy parameter (η). Anisotropy (i.e., the directional dependence of velocity at a fixed spatial location in a medium) plays an important role in seismic imaging. It is difficult to detect the presence of anisotropy in subsurface geological formations from P-wave seismic data alone, and special analysis is required. The presence of anisotropy causes two major distortions of moveout in P-wave seismic reflection data. First, in contrast to isotropic media, the normal-moveout (NMO) velocity differs from the vertical velocity; second, deviations from hyperbolic moveout increase substantially in an anisotropic layer. Hence, conventional velocity analysis based on short-spread moveout (stacking) velocities does not provide enough information to determine the true vertical velocity in a transversely isotropic medium with a vertical symmetry axis (VTI media). It is therefore essential to estimate the single anisotropy parameter (η) from long-offset P-wave seismic data. We demonstrate here, as a case study with long-offset P-wave seismic data acquired in the onshore Kutch sedimentary basin of western India, that suitable velocity analysis using Vnmo and η can improve the stacked image obtained from conventional velocity analysis.
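
    The nonhyperbolic long-spread moveout equation referred to above (the Alkhalifah-Tsvankin form) can be sketched with illustrative values:

    ```python
    import numpy as np

    def t_nonhyperbolic(x, t0, vnmo, eta):
        """Two-way traveltime vs offset for a VTI layer (long-spread moveout)."""
        t2 = (t0**2 + x**2 / vnmo**2
              - 2 * eta * x**4
              / (vnmo**2 * (t0**2 * vnmo**2 + (1 + 2*eta) * x**2)))
        return np.sqrt(t2)

    x = np.linspace(0, 4000.0, 5)     # offsets, m
    t0, vnmo = 1.2, 2400.0            # zero-offset time (s), NMO velocity (m/s)
    for eta in (0.0, 0.1):            # isotropic vs anisotropic case
        print(eta, np.round(t_nonhyperbolic(x, t0, vnmo, eta), 4))
    # The eta term only matters at large offset-to-depth ratios, which is why
    # long-offset data are needed to estimate it.
    ```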

  10. Health adaptation policy for climate vulnerable groups: a 'critical computational linguistics' analysis.

    PubMed

    Seidel, Bastian M; Bell, Erica

    2014-11-28

    Many countries are developing or reviewing national adaptation policy for climate change, but the extent to which these policies meet the health needs of vulnerable groups has not been assessed. This study examines the adequacy of such policies for nine known climate-vulnerable groups: people with mental health conditions, Aboriginal people, culturally and linguistically diverse groups, aged people, people with disabilities, rural communities, children, women, and socioeconomically disadvantaged people. The study analyses an exhaustive sample of national adaptation policy documents from Annex 1 ('developed') countries of the United Nations Framework Convention on Climate Change: 20 documents from 12 countries. A 'critical computational linguistics' method was used, involving novel software-driven quantitative mapping and traditional critical discourse analysis. The study finds that references to vulnerable groups are sparse or non-existent, and poorly connected to language about practical strategies and socio-economic contexts, which are themselves scarce. The conclusions offer strategies for developing policy that is better informed by a 'social determinants of health' definition of climate vulnerability, consistent with best practice in the literature and global policy prescriptions.

  11. New Methodology for Rapid Seismic Risk Assessment

    NASA Astrophysics Data System (ADS)

    Melikyan, A. E.; Balassanian, S. Y.

    2002-05-01

    Seismic risk is growing worldwide and is, increasingly, a problem of developing countries. With growing urbanization, future earthquakes will have more disastrous social and economic consequences. Seismic risk assessment and reduction are important goals for each country located in a seismically active zone. For Armenia these goals are of primary importance, because studies carried out by the Armenian NSSP to assess the losses caused by various types of disasters have shown that earthquakes are the most disastrous hazard for the country. In 1999, the strategy for seismic risk reduction was adopted by the Government of Armenia as a high-priority state program. World experience demonstrates that rapid assessment of seismic losses is necessary for efficient response. There are several state-of-the-art approaches to seismic risk assessment (RADIUS, HAZUS, etc.). All of them require large amounts of varied input data, which are impossible to collect in many developing countries, in particular in Armenia. Taking into account this serious problem for developing countries, as well as the need for rapid seismic risk assessment immediately after a strong earthquake, the author attempts to contribute a new approach for rapid seismic risk assessment, under the supervision of Prof. S. Balassanian. The analysis of numerous factors influencing seismic risk in Armenia shows that the following elements contribute most significantly to the possible losses: seismic hazard; density of population; and vulnerability of structures. The proposed approach for rapid seismic risk assessment based on these three factors has been tested for several seismic events. These tests have shown that such an approach can represent from 80 to 90 percent of real losses.
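
    One plausible way to combine the three listed factors into a rapid relative ranking is a simple normalized product; the combination rule and all numbers below are assumptions for illustration only, not the method of the paper:

    ```python
    import numpy as np

    hazard = np.array([0.9, 0.6, 0.3])         # normalized seismic hazard per zone
    density = np.array([0.8, 0.9, 0.2])        # normalized population density
    vulnerability = np.array([0.7, 0.4, 0.5])  # normalized building vulnerability

    risk_index = hazard * density * vulnerability      # one plausible combination
    print(np.round(risk_index / risk_index.max(), 2))  # relative ranking of zones
    ```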

  12. Seismic analysis of a reinforced concrete containment vessel model

    SciTech Connect

    RANDY,JAMES J.; CHERRY,JEFFERY L.; RASHID,YUSEF R.; CHOKSHI,NILESH

    2000-02-03

    Pre- and post-test analytical predictions of the dynamic behavior of a 1:10-scale model reinforced concrete containment vessel are presented. This model, designed and constructed by the Nuclear Power Engineering Corp., was subjected to seismic simulation tests using the high-performance shaking table at the Tadotsu Engineering Laboratory in Japan. A group of tests representing design-level and beyond-design-level ground motions was first conducted to verify design safety margins. These were followed by a series of tests in which progressively larger base motions were applied until structural failure was induced. The analysis was performed by ANATECH Corp. and Sandia National Laboratories for the US Nuclear Regulatory Commission, employing state-of-the-art finite-element software specifically developed for concrete structures. Three-dimensional time-history analyses were performed, first as pre-test blind predictions to evaluate the general capabilities of the analytical methods, and second as post-test validation of the methods and interpretation of the test results. The input data consisted of acceleration time histories for the horizontal, vertical and rotational (rocking) components, as measured by accelerometers mounted on the structure's basemat. The response data consisted of acceleration and displacement records for various points on the structure, as well as time-history records of strain gages mounted on the reinforcement. This paper reports on work in progress and presents pre-test predictions and post-test comparisons to measured data for tests simulating maximum design basis and extreme design basis earthquakes. The pre-test analyses predict the failure earthquake of the test structure to have an energy level in the range of four to five times the energy level of the safe shutdown earthquake. The post-test calculations completed so far show good agreement with measured data.

  13. Preliminary Analysis of Saudi National Seismic Network Recording of the November 1999 Dead Sea Explosions

    SciTech Connect

    Rodgers, A.

    1999-12-01

    Two large chemical explosions were detonated in the Dead Sea on November 10 and 11, 1999 for the purpose of calibrating seismic travel times to improve regional network locations. These explosions were large enough to be observed with good signal-to-noise ratios by seismic stations in northwestern Saudi Arabia (distances of about 500 km). In this report, we present a preliminary analysis of the recordings from these shots.

  14. Comparison between seismic and domestic risk in moderate seismic hazard prone region: the Grenoble City (France) test site

    NASA Astrophysics Data System (ADS)

    Dunand, F.; Gueguen, P.

    2012-02-01

    France has a moderate level of seismic activity, characterized by diffuse seismicity, with earthquakes of magnitude greater than 5 occasionally occurring in the most active zones. In this seismicity context, Grenoble is a city of major economic and social importance. However, because earthquakes are rare, public authorities and decision makers are only vaguely committed to reducing seismic risk: return periods are long and local policy makers do not have much information available. Over the past 25 years, a large number of studies have been conducted to improve our knowledge of seismic hazard in this region. One of the decision-making concerns of Grenoble's public authorities, as managers of a large number of public buildings, is to know not only the seismic-prone regions, the variability of seismic hazard due to site effects and the city's overall vulnerability, but also the level of seismic risk and exposure for the entire city, compared also to other natural and/or domestic hazards. Our seismic risk analysis uses a probabilistic approach for regional and local hazards and for the vulnerability assessment of buildings. Its application to Grenoble offers the advantage of being based on knowledge acquired through previous projects conducted over the years. This paper aims to compare the level of seismic risk with that of other risks and to introduce the notion of risk acceptability in order to offer guidance in the management of seismic risk. This notion of acceptability, which is now part of seismic risk consideration for existing buildings in Switzerland, is relevant in moderately seismic-prone countries like France.

  15. Fast seismic velocity analysis using parsimonious Kirchhoff depth migration

    NASA Astrophysics Data System (ADS)

    Fei, Weihong

    Migration-based velocity analysis is the most efficient and accurate velocity inversion technique. It generally involves time-consuming prestack depth migration and picking of the depth residuals in common-image gathers (CIGs) in each iteration. Two modifications are proposed to minimize the prestack depth migration time and the picking work in velocity analysis: one approach inverts the velocity model in layer-stripping style; the other is based on a grid parametrization of the velocity model. Both approaches are based on the idea of parsimonious depth migration, which is the fastest depth migration currently available. Both approaches have four basic steps: (1) picking the primary, most consistent reflection events from one reference seismic section or volume; (2) depending on whether the reference data are 2-D poststack, 2-D common-offset, 3-D poststack, or 3-D common-offset, using the corresponding parsimonious depth migration to migrate all the picked time samples to their spatial locations and give their orientations; (3) ray tracing to define the CRP gathers for each reflection point; and (4) velocity updating. For the layer-stripping approach, a small number (2-3) of iterations converges to a 2-D model of layer shape and interval velocity. The computation time of this layer-stripping approach is of the same order as that of the standard (1-D) rms velocity scan method, and is much faster than current iterative prestack depth migration velocity analysis methods for typical field data. For the grid-based approach, it is not necessary to define continuous reflectors, and the time at any offset (not only zero offset) can be used as the reference time for a reflection. Truncations and multi-valued layers, which require much effort in the layer-stripping approach, are handled naturally and implicitly in the grid-based approach. Two important features of the proposed algorithms are that the traveltime picking is limited to only a stacked or common
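
    For comparison, the standard Dix conversion underlying the 1-D rms velocity scan method mentioned as a benchmark can be sketched as follows (illustrative picks, not data from the paper):

    ```python
    import numpy as np

    t = np.array([0.5, 1.0, 1.6])              # zero-offset two-way times, s
    vrms = np.array([1800.0, 2100.0, 2350.0])  # picked rms velocities, m/s

    # Dix: Vint_k^2 = (t_k*Vrms_k^2 - t_{k-1}*Vrms_{k-1}^2) / (t_k - t_{k-1})
    vint = np.sqrt(np.diff(t * vrms**2) / np.diff(t))
    print(np.round(vint, 1))                   # interval velocities between picks
    ```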

  16. Structured Assessment Approach: a microcomputer-based insider-vulnerability analysis tool

    SciTech Connect

    Patenaude, C.J.; Sicherman, A.; Sacks, I.J.

    1986-01-01

    The Structured Assessment Approach (SAA) was developed to help assess, in a staged manner, the vulnerability of safeguards systems to insiders. For physical security systems, the SAA identifies possible diversion paths which are not safeguarded under various facility operating conditions, and insiders who could defeat the system via direct access, collusion or indirect tampering. For material control and accounting systems, the SAA identifies those who could block the detection of a material loss or diversion via data falsification or equipment tampering. The SAA, originally designed to run on a mainframe computer, has been converted to run on a personal computer. Many features have been added to simplify and facilitate its use for conducting vulnerability analysis. For example, the SAA input, which is a text-like data file, is easily readable and can provide documentation of facility safeguards and the assumptions used for the analysis.

  17. Characterising Seismic Hazard Input for Analysis of Risk to Multi-System Infrastructures: Application to Scenario Event-Based Models and Extension to Probabilistic Risk

    NASA Astrophysics Data System (ADS)

    Weatherill, G. A.; Silva, V.

    2011-12-01

    The potential human and economic cost of earthquakes to complex urban infrastructures has been demonstrated in the most emphatic manner by recent large earthquakes such as those of Haiti (January 2010), Christchurch (September 2010 and February 2011) and Tohoku (March 2011). Consideration of seismic risk for a homogeneous portfolio, such as a single building typology or infrastructure, or independent analyses of separate typologies or infrastructures, is insufficient to fully characterise the potential impacts that arise from inter-connected system failure. Individual elements of each infrastructure may be adversely affected by different facets of the ground motion (e.g., short-period acceleration, long-period displacement, cumulative energy input, etc.). The accuracy and efficiency of the risk analysis is dependent on the ability to characterise these multiple features of the ground motion over a spatially distributed portfolio of elements. The modelling challenges raised by this extension to multi-system analysis of risk have been a key focus of the European project "Systemic Seismic Vulnerability and Risk Analysis for Buildings, Lifeline Networks and Infrastructures Safety Gain (SYNER-G)", and are expected to be developed further within the Global Earthquake Model (GEM). The seismic performance of a spatially distributed infrastructure during an earthquake may be assessed by means of Monte Carlo simulation, in order to incorporate the aleatory variability of the ground motion into the network analysis. Methodologies for co-simulating large numbers of spatially cross-correlated ground motion fields are appraised, and their potential impacts on a spatially distributed portfolio of mixed building typologies are assessed using idealised case-study scenarios from California and Europe. Potential developments to incorporate correlation and uncertainty in site amplification and geotechnical hazard are also explored. Whilst the initial application of the seismic risk analysis is
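
    One common recipe for co-simulating spatially cross-correlated ground-motion residuals is an exponential correlation model factored by Cholesky decomposition; a sketch under assumed values (correlation length, site layout and log-standard deviation are illustrative, not the SYNER-G models):

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    sites = rng.uniform(0, 50, size=(100, 2))          # site coordinates, km
    r = np.linalg.norm(sites[:, None, :] - sites[None, :, :], axis=-1)

    corr_len = 10.0                                    # km, assumed
    C = np.exp(-3.0 * r / corr_len)                    # exponential correlation
    L = np.linalg.cholesky(C + 1e-10 * np.eye(len(sites)))  # jitter for stability

    n_sims = 1000
    sigma = 0.6                                        # intra-event log-std (toy)
    residuals = sigma * (L @ rng.standard_normal((len(sites), n_sims)))
    # Each column is one correlated within-event residual field, to be added to
    # the median log ground motion before running the portfolio loss model.
    print(residuals.shape)
    ```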

  18. Seismic structural analysis of a glovebox by the equivalent static method

    SciTech Connect

    Hsieh, B.J.

    1994-06-01

    Seismic strength evaluation of equipment requires efficient and accurate methods. Such an evaluation generally calls for dynamic analysis requiring detailed accelerations and advanced mathematical modeling. The analysis may be tedious, but in theory works for any structure with any boundary conditions. Much equipment does not justify such an extensive and expensive evaluation; hence, efficient and inexpensive, if possibly more conservative, methods of analysis are used instead. The equivalent static method (ESM) is such a method. Being a static method, the ESM cannot be directly applied to equipment that is not simply anchored to, or only rests on, the ground. In this paper, we show how a glovebox with ambiguous anchorage conditions is analyzed by the ESM when subjected to seismic load. Also outlined are retrofits to increase its seismic resistance. The recommendations include fixing the legs to the floor and using inclined braces. The use of braces is effective in resisting the lateral seismic load: it redistributes the seismically generated moment and force in a more benign way, and it significantly stiffens the glovebox's supporting table structure, raising the vibration frequency of the table away from the high-energy range of the seismic load and drastically reducing the displacement of the glovebox.
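
    The equivalent static idea reduces to replacing the dynamic excitation by lateral forces proportional to mass; a minimal sketch with placeholder values (not the glovebox study's numbers):

    ```python
    import numpy as np

    g = 9.81
    a_eq = 0.5 * g                       # assumed equivalent static acceleration
    masses = np.array([120.0, 80.0])     # e.g. glovebox and table masses, kg
    heights = np.array([1.6, 0.9])       # heights of the mass points, m

    F = a_eq * masses                    # equivalent lateral forces, N
    base_shear = F.sum()
    overturning_moment = (F * heights).sum()
    print(base_shear, overturning_moment)   # demands for the anchorage check
    ```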

  19. An exploratory spatial analysis of social vulnerability and smoke plume dispersion in the U.S.

    Treesearch

    Cassandra Johnson Gaither; Scott Goodrick; Bryn Elise Murphy; Neelam Poudyal

    2015-01-01

    This study explores the spatial association between social vulnerability and smoke plume dispersion at the census block group level for the 13 southern states in the USDA Forest Service’s Region 8. Using environmental justice as a conceptual basis, we use Exploratory Spatial Data Analysis to identify clusters or “hot spots” for the incidence of both higher than average...

  1. An Analysis of the Vulnerability of Global Drinking Water Access to Climate-related Hazards

    NASA Astrophysics Data System (ADS)

    Elliott, M.; Banerjee, O.; Christenson, E.; Holcomb, D.; Hamrick, L.; Bartram, J.

    2014-12-01

    Global drinking water access targets are formulated around "sustainable access." Global climate change (GCC) and associated hazards threaten the sustainability of drinking water supply. Extensive literature exists on the impacts of GCC on precipitation and water resources. However, the literature lacks a credible analysis of the vulnerability of global drinking water access. This research reports on an analysis of the current vulnerability of drinking water access due to three climate-related hazardous events: cyclone, drought and flood. An ArcGIS database was built incorporating the following: population density, hazardous event frequency, drinking water technologies in use, and adaptive capacity. Two global grids were incorporated first: (1) the LandScan™ global population distribution; and (2) the frequency of cyclone, drought and flood from ~1980-2000 from the Columbia University Center for Hazards and Risk Research (CHRR). Population density was used to characterize cells as urban or rural, and country-level urban/rural drinking water technologies in use were added based on WHO/UNICEF Joint Monitoring Programme data. Expert assessments of the resilience of each technology to each hazardous event, based on WHO/DFID Vision 2030, were quantified and added to the database. Finally, country-level adaptive capacity was drawn from the "readiness" parameter of the Global Adaptation Index (GaIn). ArcGIS Model Builder and Python were used to automate the addition of datasets. This presentation will report on the results of this analysis, the first credible attempt to assess the vulnerability of global drinking water access to climate-related hazardous events. This analysis has yielded country-level scores and maps displaying the ranking of exposure score (for flood, drought, cyclone, and all three in aggregate) and the corresponding country-level vulnerability scores and rankings incorporating the impact of drinking water technologies and adaptive capacity (Figure 1).

  2. Governing Geoengineering Research: A Political and Technical Vulnerability Analysis of Potential Near-Term Options

    DTIC Science & Technology

    2011-01-01


  3. Social class variation in risk: a comparative analysis of the dynamics of economic vulnerability.

    PubMed

    Whelan, Christopher T; Maître, Bertrand

    2008-12-01

    A joint concern with multidimensionality and dynamics is a defining feature of the pervasive use of the terminology of social exclusion in the European Union. The notion of social exclusion focuses attention on economic vulnerability in the sense of exposure to risk and uncertainty. Sociological concern with these issues has been associated with the thesis that risk and uncertainty have become more pervasive and extend substantially beyond the working class. This paper combines features of recent approaches to statistical modelling of poverty dynamics and multidimensional deprivation in order to develop our understanding of the dynamics of economic vulnerability. An analysis involving nine countries and covering the first five waves of the European Community Household Panel shows that, across nations and time, it is possible to identify an economically vulnerable class. This class is characterized by heightened risk of falling below a critical resource level, exposure to material deprivation and experience of subjective economic stress. Cross-national differentials in the persistence of vulnerability are wider than in the case of income poverty and less affected by measurement error. Economic vulnerability profiles vary across welfare regimes in a manner broadly consistent with our expectations. Variation in the impact of social class within and across countries provides no support for the argument that its role in structuring such risk has become much less important. Our findings suggest that it is possible to accept the importance of the emergence of new forms of social risk, and to acknowledge the significance of efforts to develop welfare state policies involving a shift of opportunities and decision making onto individuals, without accepting the 'death of social class' thesis.

  4. E-ELT seismic devices analysis and prototype testing

    NASA Astrophysics Data System (ADS)

    Gómez, Celia; Avilés, Alexander; Bilbao, Armando; Siepe, Daniel; Nawrotzki, Peter

    2012-09-01

    During the E-ELT Dome and Foundations FEED Study, IDOM developed a Base Control System for protection of the E-ELT Main Structure against the effects of high-level earthquakes. The proposed design was aimed at providing effective isolation during heavy seismic events, whereas in normal observation conditions it presents high stiffness to avoid interference with the pointing accuracy of the telescope. In a subsequent phase, a representative prototype was envisaged by IDOM, in close collaboration with GERB, to evaluate the performance of this system, correlate the results from prototype testing with the behaviour predicted by a calculation model, and finally validate the design conceived during the FEED Study. The assessment of the results from the prototype tests focused on checking the level of compliance with the demanded requirements: (1) the Base Control System isolates the upper structure from the ground in case of high-magnitude seismic events; (2) in operational conditions, the system, by means of Preloaded Devices (PLDs), provides a stiff interface with the ground; (3) regarding the performance of the PLDs, the finite element model accurately simulates the non-linear behaviour, particularly the zero crossing when the direction of the excitation changes; and (4) there is no degradation of the stiffness properties of the seismic devices after being subjected to a heavy seismic event. The prototype was manufactured by GERB, and pseudo-integrated tests were performed on a shaking table at the premises of the Institute of Earthquake Engineering and Engineering Seismology (IZIIS) in Skopje, Macedonia.

  5. Assessing the seismic risk potential of South America

    USGS Publications Warehouse

    Jaiswal, Kishor; Petersen, Mark D.; Harmsen, Stephen; Smoczyk, Gregory M.

    2016-01-01

    We present here a simplified approach to quantifying regional seismic risk. The seismic risk for a given region can be expressed in terms of the average annual loss (AAL), which represents the long-term value of earthquake losses in any one year caused by the long-term seismic hazard. AAL is commonly measured in the form of earthquake-shaking-induced deaths, direct economic impacts, or indirect losses caused by loss of functionality. In the context of the South American subcontinent, the analysis makes use of readily available public data on seismicity and population exposure, and of hazard and vulnerability models for the region. The seismic hazard model was derived using available seismic catalogs, fault databases, and hazard methodologies analogous to the U.S. Geological Survey's national seismic hazard mapping process. The Prompt Assessment of Global Earthquakes for Response (PAGER) system's direct empirical vulnerability functions, in terms of fatalities and economic impact, were used to perform the exposure and risk analyses. The broad findings presented and the risk maps produced herein are preliminary, yet they offer important insights into the underlying zones of high and low seismic risk in the South American subcontinent. A more detailed analysis of risk may be warranted by engaging local experts, especially in some of the high-risk zones identified through the present investigation.
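
    The AAL bookkeeping behind such maps can be sketched as a rate-weighted sum of per-event losses (invented event set, not the PAGER model):

    ```python
    import numpy as np

    annual_rate = np.array([0.10, 0.02, 0.004])   # events/yr for three scenarios
    loss = np.array([5e6, 8e7, 1.2e9])            # loss per event, US$ (toy)

    aal = np.sum(annual_rate * loss)              # expected loss in any one year
    print(f"AAL = {aal/1e6:.1f} M$/yr")
    ```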

  6. Detailed seismicity analysis in the SE of Romania (Dobrogea region)

    NASA Astrophysics Data System (ADS)

    Rogozea, Maria; Radulian, Mircea; Ghica, Daniela; Popa, Mihaela

    2014-05-01

    The purpose of this paper is to analyze the seismicity of the south-eastern part of Romania, in the Dobrogea region (namely the Predobrogean Depression and the Black Sea area). The Predobrogean Depression is the name attributed to the structures belonging to the Scythian Platform. The seismic activity is moderate, with the most significant earthquakes at the boundary between the North Dobrogea Orogen and the Scythian Platform (Sf. Gheorghe fault). The largest event was recorded on 02.11.1871 (Mw = 5.3). Other events with magnitude above 4 were observed close to Tulcea city (13.11.1981, Mw = 5.1; 03.09.2004, Mw = 5.1) and Galati city (11.09.1980, Mw = 4.2). Recently, an earthquake swarm of 406 events extending over two and a half months (23 September - 5 December 2013) occurred in the Galati area (maximum magnitude 3.9). The deformation field has an extensional regime, as shown by fault plane solutions and geotectonic investigations. The maximum expected magnitude in this area is estimated at Mw = 5.5. The seismic activity in the Black Sea area, close to the Romanian and north-east Bulgarian seashores, concentrates along the Shabla fault system. Large shocks (magnitude above 7) are reported here at intervals of a few centuries. The most recent major shock was recorded on 31 January 1901 (Mw = 7.2) in the Shabla region, Bulgaria. To characterize the seismicity parameters, the Romanian catalogue of the National Institute of Earth Physics was used as the basic input. The catalogue's historical information was revised by reanalyzing macroseismic data and, for recent events, by applying up-to-date tools to relocate and re-parametrize the seismic sources.

  7. Detailed seismicity analysis of the southern Dead Sea area

    NASA Astrophysics Data System (ADS)

    Braeuer, Benjamin; Asch, Guenter; Hofstetter, Rami; Haberland, Christian; Jaser, Darwish; El-Kelani, Radwan; Weber, Michael

    2013-04-01

    While the Dead Sea basin has been studied for a long time, the available knowledge about its micro-seismicity, its distribution and characteristics is limited. Therefore, within the framework of the international DESIRE (DEad Sea Integrated REsearch) project, a dense temporary local seismological network was operated in the southern Dead Sea area. Within 18 months of recording, 650 events were detected. Building on an already published tomography study, the clustering, focal mechanisms and statistics of the micro-seismicity, and its distribution in relation to the velocity models from the tomography, are analyzed. The determined b-value of 0.7 indicates a relatively high risk of large earthquakes compared to the moderate microseismic activity. The distribution of the seismicity suggests an asymmetric basin, with a vertical strike-slip fault forming the eastern boundary of the basin and an inclined western boundary made up of strike-slip and normal faults. Furthermore, significant differences between the areas north and south of the Boqeq fault were observed. South of the Boqeq fault the western boundary is inactive, while the entire seismicity occurs at the eastern boundary and below the basin-fill sediments. The largest events occurred here; their focal mechanisms represent the northwards transform motion of the Arabian plate along the Dead Sea Transform. The vertical extension of the spatial and temporal cluster from February 2007 is interpreted as being related to the locking of the region around the Boqeq fault. North of the Boqeq fault similar seismic activity occurs at both boundaries, most notably within the basin-fill sediments, displaying mainly small events with strike-slip mechanisms and normal faulting in the EW direction. Therefore, we suggest that the Boqeq fault forms the border between the "single" transform fault and the pull-apart basin with two active border faults.

  8. Seismic soil structure interaction analysis for asymmetrical buildings supported on piled raft for the 2015 Nepal earthquake

    NASA Astrophysics Data System (ADS)

    Badry, Pallavi; Satyam, Neelima

    2017-01-01

    Seismic damage surveys and analyses of failure modes of structures during past earthquakes have shown that asymmetrical buildings are the most vulnerable throughout the course of failure (Wegner et al., 2009). Asymmetrical buildings thus fail disproportionately often during shaking events, and their analysis needs to be carried out with all possible accuracy. Apart from the superstructure geometry, the soil behavior during earthquake shaking plays a pivotal role in building collapse (Chopra, 2012). A fixed-base analysis, where the soil is considered to be infinitely rigid, cannot simulate the actual wave propagation during earthquakes and the wave transfer mechanism into the superstructure (Wolf, 1985). This can be well captured in a soil-structure interaction (SSI) analysis, where the ground movement and the structural movement are considered with equal rigor. In the present study an object-oriented program has been developed in C++ to model the SSI system using the finite element methodology. Seismic soil-structure interaction analyses have been carried out for T-, L- and C-shaped buildings supported on piled rafts for the 25 April 2015 Nepal earthquake (M = 7.8). The soil properties have been taken from appropriate soil data for the Kathmandu valley region. The effect of the asymmetry of the building on the responses of the superstructure is compared with the authors' earlier work. It is observed that the shape, or geometry, of the superstructure governs its response when subjected to the same earthquake load.
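
    The headline SSI effect, a flexible base lengthening the fixed-base period, can be illustrated with the classical replacement-oscillator formula. The sketch below is a generic illustration, not the paper's C++ finite element model; all stiffness and geometry values are hypothetical.

        import numpy as np

        # Period lengthening of a single-mode structure on flexible soil:
        #   T_ssi = T * sqrt(1 + k/Kx + k*h**2/Kt)
        # where k is the structure stiffness, h its effective height, and
        # Kx, Kt the soil translational and rocking spring stiffnesses.
        def ssi_period(T, k, h, Kx, Kt):
            return T * np.sqrt(1.0 + k / Kx + k * h**2 / Kt)

        T_fixed = 0.5                   # s, hypothetical fixed-base period
        print(ssi_period(T_fixed, k=2.0e7, h=12.0, Kx=4.0e8, Kt=6.0e10))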

  9. Advanced Seismic Fragility Modeling using Nonlinear Soil-Structure Interaction Analysis

    SciTech Connect

    Bolisetti, Chandu; Coleman, Justin; Talaat, Mohamed; Hashimoto, Philip

    2015-09-01

    The goal of this effort is to compare the seismic fragilities of a nuclear power plant system obtained by a traditional seismic probabilistic risk assessment (SPRA) and by an advanced SPRA that utilizes Nonlinear Soil-Structure Interaction (NLSSI) analysis. Soil-structure interaction (SSI) response analysis for a traditional SPRA involves linear analysis, which ignores geometric nonlinearities (i.e., soil and structure remain glued together, so the soil material sustains tension when the structure uplifts). The NLSSI analysis will consider geometric nonlinearities.
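
    Whichever SSI model supplies the structural responses, a fragility is commonly summarized as a lognormal curve fitted to binary failure outcomes. A minimal maximum-likelihood fit is sketched below; the intensity levels and failure counts are hypothetical stand-ins for analysis results.

        import numpy as np
        from scipy.stats import norm
        from scipy.optimize import minimize

        # Fit P(failure | IM) = Phi(ln(im/theta)/beta) to binomial data.
        im = np.array([0.1, 0.2, 0.3, 0.4, 0.6, 0.8, 1.0])  # PGA levels [g]
        n_trials = np.full(im.size, 10)                 # analyses per level
        n_fail = np.array([0, 1, 3, 5, 7, 9, 10])       # observed failures

        def neg_log_like(params):
            theta, beta = params
            if theta <= 0.0 or beta <= 0.0:
                return np.inf
            p = np.clip(norm.cdf(np.log(im / theta) / beta), 1e-9, 1 - 1e-9)
            return -np.sum(n_fail * np.log(p) +
                           (n_trials - n_fail) * np.log(1 - p))

        res = minimize(neg_log_like, x0=[0.4, 0.4], method="Nelder-Mead")
        print("median = %.3f g, dispersion = %.3f" % tuple(res.x))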

  10. Singular spectral analysis based filtering of seismic signal using new Weighted Eigen Spectrogram

    NASA Astrophysics Data System (ADS)

    Rekapalli, Rajesh; Tiwari, R. K.

    2016-09-01

    Filtering of non-stationary noisy seismic signals using fixed basis functions (sines and cosines) generates artifacts in the final output and thereby leads to wrong interpretations. In order to circumvent this problem, we propose here a new Weighted Eigen Spectrogram (WES) based robust time-domain Singular Spectrum Analysis (SSA) frequency filtering algorithm. The new WES is used to simplify the eigentriplet grouping procedure in SSA. We tested the robustness of the algorithm on synthetic seismic data combined with field-simulated noise. Then we applied the method to filter high-resolution seismic reflection field data. The band-pass filtering of noisy seismic records suggests that the underlying algorithm is efficient for improving the signal-to-noise ratio (S/N) and is also user-friendly.
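
    The SSA machinery underneath (embedding, eigen-decomposition, grouping, diagonal averaging) can be sketched in a few lines. The example below implements plain SSA with a simple keep-the-strongest-components grouping; it does not reproduce the paper's WES grouping criterion, and the window length and component count are illustrative.

        import numpy as np

        def ssa_filter(x, window, n_keep):
            n = len(x)
            k = n - window + 1
            # trajectory (Hankel) matrix
            X = np.column_stack([x[i:i + window] for i in range(k)])
            U, s, Vt = np.linalg.svd(X, full_matrices=False)
            # keep the n_keep strongest eigentriples (signal subspace)
            Xr = (U[:, :n_keep] * s[:n_keep]) @ Vt[:n_keep]
            # diagonal averaging back to a 1-D series
            out = np.zeros(n)
            cnt = np.zeros(n)
            for j in range(k):
                out[j:j + window] += Xr[:, j]
                cnt[j:j + window] += 1
            return out / cnt

        t = np.linspace(0.0, 1.0, 500)
        noisy = np.sin(2 * np.pi * 25 * t) + 0.5 * np.random.randn(t.size)
        clean = ssa_filter(noisy, window=50, n_keep=2)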

  11. HANFORD DOUBLE SHELL TANK THERMAL AND SEISMIC PROJECT SUMMARY OF COMBINED THERMAL AND OPERATING LOADS WITH SEISMIC ANALYSIS

    SciTech Connect

    MACKEY TC; DEIBLER JE; RINKER MW; JOHNSON KI; ABATT FG; KARRI NK; PILLI SP; STOOPS KL

    2009-01-15

    This report summarizes the results of the Double-Shell Tank Thermal and Operating Loads Analysis (TaLA) combined with the Seismic Analysis. This combined analysis provides a thorough, defensible, and documented analysis that will become a part of the overall analysis of record for the Hanford double-shell tanks (DSTs). The bases of the analytical work presented herein are two ANSYS® finite element models that were developed to represent a bounding-case tank. The TaLA model includes the effects of temperature on material properties, creep, concrete cracking, and various waste and annulus pressure-loading conditions. The seismic model considers the interaction of the tanks with the surrounding soil, including a range of soil properties, and the effects of the waste contents during a seismic event. The structural evaluations completed with the representative tank models do not reveal any structural deficiencies with the integrity of the DSTs. The analyses represent 60 years of use, which extends well beyond the current date. In addition, the temperature loads imposed on the model are significantly more severe than any service to date or proposed for the future. Bounding material properties were also selected to provide the most severe combinations. While the focus of the analyses was a bounding-case tank, it was necessary during various evaluations to conduct tank-specific analyses. The primary tank buckling evaluation was carried out on a tank-specific basis because of the sensitivity to waste height, specific gravity, tank wall thickness, and primary tank vapor space vacuum limit. For this analysis, the occurrence of maximum tank vacuum was classified as a service level C, emergency load condition. The only area of potential concern in the analysis was the buckling evaluation of the AP tank, which showed the current demand limit of 12-inch water gauge vacuum to exceed the allowable of 10.4 inches. This determination was based on analysis at the

  12. Urban air quality in mega cities: a case study of Delhi City using vulnerability analysis.

    PubMed

    Jain, Suresh; Khare, Mukesh

    2008-01-01

    Air pollution is one of the major environmental problems in India, affecting the health of thousands of urban residents residing in mega cities. The need of the day is to evolve an 'effective' and 'efficient' air quality management plan (AQMP) encompassing the essential 'key players' and 'stakeholders.' This paper describes the formulation of an AQMP for mega cities like Delhi in India, taking into account the aforementioned key 'inputs.' The AQMP formulation methodology is based on the past studies of Longhurst et al. (Atmospheric Environment, 30, 3975-3985, 1996); Longhurst & Elsom ((1997). Air Pollution-II, Vol. 2 (pp. 525-532)); and Beattie et al. (Atmospheric Environment, 35, 1479-1490, 2001). Further, a vulnerability analysis (VA) has been carried out to evaluate the stresses due to air pollution in the study area. The VA yielded vulnerability indices (VI) of 'medium to high' at urban roadways/intersections and 'low' in residential areas.

  13. A relative vulnerability estimation of flood disaster using data envelopment analysis in the Dongting Lake region of Hunan

    NASA Astrophysics Data System (ADS)

    Li, C.-H.; Li, N.; Wu, L.-C.; Hu, A.-J.

    2013-07-01

    The vulnerability to flood disaster is addressed by a number of studies. It is of great importance to analyze the vulnerability of different regions and various periods to enable the government to make policies for distributing relief funds and to help the regions improve their capabilities against disasters, yet a recognized paradigm for such studies seems missing. Vulnerability is defined and evaluated from either physical or economic-ecological perspectives, depending on the field of the researcher concerned. Vulnerability, however, is the core of both approaches, as it entails systematic descriptions of flood severities or disaster management units. The research mentioned often has a development perspective; in this article we decompose the overall flood system into several factors: disaster driver, disaster environment, disaster bearer, and disaster intensity, and take the interaction mechanism among all factors as an indispensable function. The conditions of the flood disaster components are characterized by the disaster driver risk level, the disaster environment stability level and the disaster bearer sensitivity, respectively. The flood system vulnerability is expressed as vulnerability = f(risk, stability, sensitivity). Based on this theory, the data envelopment analysis (DEA) method is used to detail the spatiotemporal variation of the relative vulnerability of a flood disaster system and its components in the Dongting Lake region. The study finds that although a flood disaster system's relative vulnerability is closely associated with its components' conditions, the flood system and its components have different vulnerability levels. The overall vulnerability is not the aggregation of its components' vulnerability. On a spatial scale, zones central and adjacent to Dongting Lake and/or river zones are characterized by very high vulnerability. Zones with low and very low vulnerability are mainly distributed in the periphery of the Dongting Lake region. On a temporal
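
    As an illustration of the DEA step, the input-oriented CCR model in multiplier form reduces to one linear program per decision-making unit. The sketch below (with hypothetical inputs and outputs standing in for the flood-system indicators) solves it with scipy; an efficiency of 1.0 marks the frontier.

        import numpy as np
        from scipy.optimize import linprog

        # CCR (input-oriented, multiplier form): for DMU o, maximize u.y_o
        # subject to v.x_o = 1 and u.y_j - v.x_j <= 0 for every DMU j.
        def ccr_efficiency(X, Y):
            n, m = X.shape
            s = Y.shape[1]
            eff = np.zeros(n)
            for o in range(n):
                c = np.concatenate([-Y[o], np.zeros(m)])  # variables [u, v]
                A_eq = np.concatenate([np.zeros(s), X[o]])[None, :]
                A_ub = np.hstack([Y, -X])
                res = linprog(c, A_ub=A_ub, b_ub=np.zeros(n),
                              A_eq=A_eq, b_eq=[1.0], bounds=(0, None))
                eff[o] = -res.fun
            return eff

        X = np.array([[2.0, 1.0], [3.0, 2.0], [4.0, 1.5]])   # inputs
        Y = np.array([[1.0], [2.0], [1.8]])                  # outputs
        print(ccr_efficiency(X, Y))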

  14. Statistical modeling of ground motion relations for seismic hazard analysis

    NASA Astrophysics Data System (ADS)

    Raschke, Mathias

    2013-10-01

    We introduce a new approach for ground motion relations (GMR) in probabilistic seismic hazard analysis (PSHA), influenced by the extreme value theory of mathematical statistics. Therein, we understand a GMR as a random function. We derive mathematically the principle of area equivalence, wherein two alternative GMRs have an equivalent influence on the hazard if they have equivalent area functions. This includes local biases. An interpretation of the difference between these GMRs (an actual and a modeled one) as a random component leads to a general overestimation of residual variance and hazard. Besides this, we discuss important aspects of classical approaches and discover discrepancies with the state of the art of stochastics and statistics (model selection and significance, tests of distribution assumptions, extreme value statistics). We especially criticize the assumption of log-normally distributed residuals of maxima like the peak ground acceleration (PGA). The natural distribution of its individual random component (equivalent to exp(ɛ0) of Joyner and Boore, Bull Seism Soc Am 83(2):469-487, 1993) is the generalized extreme value. We show through numerical investigations that the actual distribution can be hidden and that a wrong distribution assumption can influence the PSHA as negatively as neglecting area equivalence does. Finally, we suggest an estimation concept for GMRs in PSHA with a regression-free variance estimation of the individual random component. We demonstrate the advantages of event-specific GMRs by analyzing data sets from the PEER strong motion database and estimate event-specific GMRs. Therein, the majority of the best models are based on an anisotropic point source approach. The residual variance of logarithmized PGA is significantly smaller than in previous models. We validate the estimations for the event with the largest sample by empirical area functions, which indicate the appropriate modeling of the GMR by an anisotropic
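
    The paper's central distributional claim is easy to probe numerically: fit a generalized extreme value (GEV) law to residuals instead of assuming log-normality. The sketch below uses synthetic stand-in residuals; scipy's genextreme handles the fit.

        import numpy as np
        from scipy.stats import genextreme

        # Synthetic stand-in for logarithmized-PGA residuals.
        rng = np.random.default_rng(1)
        residuals = rng.gumbel(loc=0.0, scale=0.3, size=500)

        shape, loc, scale = genextreme.fit(residuals)
        print("GEV shape=%.3f loc=%.3f scale=%.3f" % (shape, loc, scale))
        # A fitted shape parameter near zero recovers the Gumbel special
        # case; clearly nonzero shapes argue against the log-normal habit.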

  15. Seismic Hazard characterization study using an earthquake source with Probabilistic Seismic Hazard Analysis (PSHA) method in the Northern of Sumatra

    NASA Astrophysics Data System (ADS)

    Yahya, A.; Palupi, M. I. R.; Suharsono

    2016-11-01

    The Sumatra region is one of the earthquake-prone areas in Indonesia because it lies in an active tectonic zone. In 2004, an earthquake with a moment magnitude of 9.2 occurred off the coast, about 160 km west of Nanggroe Aceh Darussalam, and triggered a tsunami. These events caused many casualties and heavy material losses, especially in the provinces of Nanggroe Aceh Darussalam and North Sumatra. To minimize the impact of future earthquake disasters, a fundamental assessment of the earthquake hazard in the region is needed. The stages of the research include a literature study, the collection and processing of seismic data, seismic source characterization, and analysis of the earthquake hazard by the probabilistic method (PSHA) using an earthquake catalog from 1907 through 2014. The earthquake hazard is represented by the values of Peak Ground Acceleration (PGA) and Spectral Acceleration (SA) at periods of 0.2 and 1 second on bedrock, presented as maps with a return period of 2475 years and as earthquake hazard curves for the cities of Medan and Banda Aceh.
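
    The core PSHA recipe, aggregating a ground-motion model's exceedance probability over magnitude with the source activity rate, fits in a short script. The GMPE coefficients, activity rate and single source-site distance below are hypothetical placeholders, not the models used in the study.

        import numpy as np
        from scipy.stats import norm

        def ln_pga(m, r):                     # toy GMPE, ln(PGA in g)
            return -3.5 + 0.9 * m - 1.1 * np.log(r + 10.0)

        sigma = 0.6                           # GMPE log-std
        nu = 0.2                              # events/yr with M >= m_min
        b, m_min, m_max = 1.0, 5.0, 9.2
        mags = np.linspace(m_min, m_max, 50)
        beta = b * np.log(10.0)
        pdf_m = beta * np.exp(-beta * (mags - m_min))
        pdf_m /= np.trapz(pdf_m, mags)        # truncated Gutenberg-Richter pdf
        r = 100.0                             # source-site distance [km]

        def annual_exceedance_rate(a):
            p = 1.0 - norm.cdf((np.log(a) - ln_pga(mags, r)) / sigma)
            return nu * np.trapz(p * pdf_m, mags)

        for a in (0.05, 0.1, 0.2, 0.4):       # hazard-curve points [g]
            print(a, annual_exceedance_rate(a))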

  16. Analysis of broadband seismic noise at the German Regional Seismic Network and search for improved alternative station sites

    NASA Astrophysics Data System (ADS)

    Bormann, P.; Wylegalla, K.; Klinge, K.

    The German Regional Seismic Network (GRSN) now comprises 16 digital broadband stations equipped with Wielandt-Streckeisen STS-2 seismometers and 24-bit dataloggers, and a seismological data center at Erlangen. It covers the whole territory of Germany with station spacings between 80 and 240 km. The stations are sited in very different environments, ranging from near shore at the Baltic Sea coast up to distances of about 700 km from the coast, both within cities and up to about 10 km away from any major settlement, industry or traffic roads. The underground varies from outcropping hard rocks in Hercynian mountain areas and sedimentary rocks in areas of Mesozoic platform cover to up to 1.5 km of unconsolidated Quaternary and Tertiary subsoil. Accordingly, seismic background noise varies in a wide range between the upper and lower bounds of the new global noise model. The noise conditions at the GRSN have been investigated systematically by means of displacement power spectral analysis starting at frequencies of 10^-2 Hz. Noise reductions of more than a factor of 5 for RUE and more than 10 for BSEG have been confirmed for frequencies between about 0.6 Hz and 3 Hz. Strong lateral velocity and impedance contrasts between the outcropping Triassic/Permian sedimentary rocks and the surrounding unconsolidated Quaternary/Tertiary sediments are shown to be the main cause of the strong noise reduction and signal-to-noise ratio improvement at RUE, and can account for about 50% of the noise reduction at BSEG.
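
    Station-noise comparisons of this kind boil down to power spectral density estimates. A minimal Welch-PSD sketch is given below; the two synthetic traces merely stand in for recordings from a noisy and a quiet site.

        import numpy as np
        from scipy.signal import welch

        fs = 100.0                                  # sampling rate [Hz]
        rng = np.random.default_rng(2)
        n = int(3600 * fs)                          # one hour of data
        noisy_site = 1e-6 * rng.standard_normal(n)  # stand-in traces [m]
        quiet_site = 2e-7 * rng.standard_normal(n)

        f, p_noisy = welch(noisy_site, fs=fs, nperseg=2**14)
        _, p_quiet = welch(quiet_site, fs=fs, nperseg=2**14)
        band = (f > 0.6) & (f < 3.0)                # band discussed above
        print("noise reduction factor:",
              np.sqrt(p_noisy[band].mean() / p_quiet[band].mean()))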

  17. Seismic response analysis of NAGRA-Net stations using advanced geophysical techniques

    NASA Astrophysics Data System (ADS)

    Poggi, Valerio; Edwards, Benjamin; Dal Moro, Giancarlo; Keller, Lorenz; Fäh, Donat

    2015-04-01

    In cooperation with the National Cooperative for the Disposal of Radioactive Waste (Nagra), the Swiss Seismological Service (SED) has recently completed the installation of ten new seismological observation stations, three of them including a co-located borehole sensor. The ultimate goal of the project is to densify the existing Swiss Digital Seismic Network (SDSNet) in northern Switzerland, in order to improve the detection of very-low-magnitude events and the accuracy of future location solutions. This is strategic for unbiased monitoring of microseismicity at the locations of proposed nuclear waste repositories. To further improve the quality and usability of the recordings, a seismic characterization of the area surrounding each installation was performed. The investigation consisted of a preliminary geological and geotechnical study, followed by a seismic site response analysis by means of state-of-the-art geophysical techniques. For the borehole stations in particular, the characterization was performed by combining different types of active seismic methods (P-S refraction tomography, surface wave analysis, Vertical Seismic Profiling - VSP) with ambient-vibration-based approaches (wavelet decomposition, H/V spectral ratio, polarization analysis, three-component f-k analysis). The results of all analyses converged on a mean velocity profile for each site, which was later used for the computation of engineering parameters (travel-time average velocity and quarter-wavelength parameters) and the analytical SH-wave transfer function. Empirical site-amplification functions are automatically determined for any station connected to the Swiss seismic networks. They are based on statistical models of systematic site-specific effects in recordings of small earthquakes relative to the Swiss stochastic ground-motion model. The computed site response is validated through comparison with these empirical
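
    Among the engineering parameters mentioned, the travel-time average velocity down to a depth z (Vs30 for z = 30 m) is simply the depth divided by the vertical shear-wave travel time through the layered profile. A short sketch with a hypothetical three-layer profile:

        import numpy as np

        def travel_time_average_vs(h, vs, z=30.0):
            """h: layer thicknesses [m]; vs: shear velocities [m/s]."""
            tt, depth = 0.0, 0.0
            for hi, vi in zip(h, vs):
                take = min(hi, z - depth)   # portion of layer above depth z
                if take <= 0.0:
                    break
                tt += take / vi
                depth += take
            return depth / tt

        # Hypothetical profile: 4 m soft soil, 10 m dense soil, bedrock.
        print(travel_time_average_vs(h=[4.0, 10.0, 40.0],
                                     vs=[220.0, 450.0, 1200.0]))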

  18. Discrimination between induced and natural seismicity by means of nonlinear analysis

    NASA Astrophysics Data System (ADS)

    Turuntaev, S. B.; Melchaeva, O. Yu.; Vorohobina, S. V.

    2012-04-01

    Uch-Terek Rivers in Kyrgyzstan; (3) the seismicity in the region of the Geysers geothermal complex in California, US; (4) the seismicity in the region of the Bishkek geophysical test site, Kyrgyzstan, recorded before and after strong electromagnetic discharges. The nonlinear analysis of the seismicity data sets showed that technogenic action on the geophysical medium increases the regularity of the seismic regime. This resembles the formation of stable states characterized by a finite fractal dimension of the attractor and a reasonably small dimension of the embedding space. The presence of the stable states opens the possibility of forecasting the development of induced seismic activity. We also present the results of a nonlinear analysis of the rate-and-state model, which allows us to describe the mechanics of the studied phenomenon. In this context, the model of motion in fault zones that obey the two-parameter friction law suggests that if the external action causes the critical stresses to decrease, e.g. due to the growth of pore pressure or due to heating of the fault zone, we should expect the deterministic component of the seismic process to increase.
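
    The attractor dimension invoked here is typically estimated with the Grassberger-Procaccia correlation integral after time-delay embedding. The sketch below applies it to a synthetic periodic series (dimension near 1); the embedding parameters are illustrative.

        import numpy as np

        def correlation_dimension(x, dim=3, tau=1):
            n = len(x) - (dim - 1) * tau
            emb = np.column_stack([x[i * tau:i * tau + n]
                                   for i in range(dim)])
            d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
            d = d[np.triu_indices(n, k=1)]          # unique pair distances
            radii = np.logspace(-1.5, 0.0, 10) * d.max()
            c = np.array([(d < r).mean() for r in radii])
            keep = c > 0
            # slope of log C(r) vs log r estimates the attractor dimension
            return np.polyfit(np.log(radii[keep]), np.log(c[keep]), 1)[0]

        x = np.sin(np.linspace(0.0, 60.0 * np.pi, 1500))   # limit cycle
        print(correlation_dimension(x))                    # close to 1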

  19. Caucasus Seismic Information Network: Data and Analysis Final Report

    SciTech Connect

    Randolph Martin; Mary Krasovec; Spring Romer; Timothy O'Connor; Emanuel G. Bombolakis; Youshun Sun; Nafi Toksoz

    2007-02-22

    The geology and tectonics of the Caucasus region (Armenia, Azerbaijan, and Georgia) are highly variable. Consequently, generating a structural model and characterizing seismic wave propagation in the region require data from local seismic networks. As of eight years ago, there was only one broadband digital station operating in the region – an IRIS station at Garni, Armenia – and few analog stations. The Caucasus Seismic Information Network (CauSIN) project is part of a multi-national effort to build a knowledge base of seismicity and tectonics in the region. During this project, three major tasks were completed: 1) collection of seismic data, both in event catalogues and phase arrival time picks; 2) development of a 3-D P-wave velocity model of the region obtained through crustal tomography; 3) advances in geological and tectonic models of the region. The first two tasks are interrelated. A large suite of historical and recent seismic data was collected for the Caucasus. These data were mainly analog prior to 2000; more recently, in Georgia and Azerbaijan, the data are digital. Based on the most reliable data from regional networks, a crustal model was developed using 3-D tomographic inversion. The results of the inversion are presented, and the supporting seismic data are reported. The third task was carried out on several fronts. Geologically, work toward an integrated geological map of the Caucasus at a scale of 1:500,000 was initiated. The map for Georgia has been completed and serves as a guide for the final incorporation of the data from Armenia and Azerbaijan. The description of geological units across borders has been worked out and formation boundaries across borders have been agreed upon. Currently, Armenia and Azerbaijan are working with scientists in Georgia to complete this task. The successful integration of the geologic data also required addressing and mapping active faults throughout the greater Caucasus. Each of the major

  20. Tempo-spatial analysis of Fennoscandian intraplate seismicity

    NASA Astrophysics Data System (ADS)

    Roberts, Roland; Lund, Björn

    2017-04-01

    Coupled spatial-temporal patterns of the occurrence of earthquakes in Fennoscandia are analysed using non-parametric methods. The occurrence of larger events is unambiguously and very strongly temporally clustered, with major implications for the assessment of seismic hazard in areas such as Fennoscandia. In addition, there is a clear pattern of geographical migration of activity. Data from the Swedish National Seismic Network and a collated international catalogue are analysed. The results show consistent patterns on different spatial and temporal scales. We are currently investigating these patterns in order to assess the statistical significance of the tempo-spatial patterns, and to what extent they may be consistent with stress transfer mechanisms such as Coulomb stress and pore fluid migration. Indications are that some further mechanism is necessary to explain the data, perhaps related to post-glacial uplift, which is up to 1 cm/year.
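
    A quick, assumption-light check for the temporal clustering claimed here is the coefficient of variation of inter-event times: about 1 for a Poisson process, well above 1 for clustered occurrence. A sketch with synthetic event times:

        import numpy as np

        def cv(event_times):
            dt = np.diff(np.sort(event_times))
            return dt.std() / dt.mean()

        rng = np.random.default_rng(3)
        poisson_times = np.cumsum(rng.exponential(1.0, 300))
        clustered = np.concatenate(
            [c + rng.exponential(0.05, 30) for c in rng.uniform(0, 300, 10)])
        print(cv(poisson_times))   # ~1: no temporal clustering
        print(cv(clustered))       # >> 1: strongly clustered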

  1. Modal seismic analysis of a nuclear power plant control panel and comparison with SAP 4

    NASA Technical Reports Server (NTRS)

    Pamidi, M. R.; Pamidi, P. R.

    1976-01-01

    The application of NASTRAN to seismic analysis is demonstrated by considering the example of a nuclear power plant control panel. A modal analysis of a three-dimensional model of the panel, consisting of beam and quadrilateral membrane elements, is performed. Using the results of this analysis and a typical earthquake response spectrum, the seismic response of the structure is obtained. The ALTERs required in the program to compute the maximum modal responses as well as the resultant response are given. The results are compared with those obtained using the SAP IV computer program.
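
    The response-spectrum step summarized above, combining peak modal responses across modes, is conventionally done with the square root of the sum of squares (SRSS) when frequencies are well separated. A generic sketch with illustrative modal quantities (not the panel's actual modes):

        import numpy as np

        freqs = np.array([4.2, 11.0, 18.5])        # modal frequencies [Hz]
        gamma = np.array([1.3, 0.4, 0.15])         # participation factors
        phi_top = np.array([1.0, -0.8, 0.5])       # mode shape at panel top
        sa = np.array([0.25, 0.40, 0.30]) * 9.81   # spectral accel. [m/s^2]

        # peak modal displacement contribution: Gamma * phi * Sa / omega^2
        omega = 2.0 * np.pi * freqs
        u_modal = gamma * phi_top * sa / omega**2
        u_srss = np.sqrt(np.sum(u_modal**2))       # SRSS combination
        print(u_modal, u_srss)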

  2. Magma intrusion near Volcan Tancitaro: Evidence from seismic analysis

    SciTech Connect

    Pinzon, Juan I.; Nunez-Cornu, Francisco J.; Rowe, Charlotte Anne

    2016-11-17

    Between May and June 2006, an earthquake swarm occurred near Volcán Tancítaro in Mexico, which was recorded by a temporary seismic deployment known as the MARS network. We located ~1000 events from this seismic swarm. Previous earthquake swarms in the area were reported in the years 1997, 1999 and 2000. We relocate and analyze the evolution and properties of the 2006 earthquake swarm, employing a waveform cross-correlation-based phase repicking technique. Hypocenters from 911 events were located and divided into eighteen families having a correlation coefficient at or above 0.75; 90% of the earthquakes provide at least sixteen phase picks. We used the single-event location code Hypo71 and the P-wave velocity model of the Jalisco Seismic and Accelerometer Network to improve hypocenters based on the correlation-adjusted phase arrival times. We relocated 121 earthquakes, which clearly show two clusters, at 9–10 km and 3–4 km depth, respectively. The average location error estimates are <1 km epicentrally and <2 km in depth for the largest event in each cluster. Depths of seismicity migrate upward from 16 to 3.5 km and exhibit a NE-SW trend. The swarm first migrated toward Paricutin Volcano but by mid-June began propagating back toward Volcán Tancítaro. In addition to its persistence, noteworthy aspects of this swarm include a quasi-exponential increase in the rate of activity within the first 15 days; a b-value of 1.47; a jug-shaped hypocenter distribution; a shoaling rate of ~5 km/month within the deeper cluster; and a composite focal mechanism solution indicating largely reverse faulting. Together, these features of the swarm suggest a magmatic source elevating the crustal strain beneath Volcán Tancítaro.

  4. Receiver Function Analysis of the Eastern Tennessee Seismic Zone

    NASA Astrophysics Data System (ADS)

    Graw, J. H.; Powell, C. A.; Langston, C. A.

    2011-12-01

    We present receiver/transfer functions determined for a seismic network associated with an active, intraplate seismic zone. Basement studies within eastern Tennessee are sparse despite the fact that these rocks host the eastern Tennessee seismic zone (ETSZ) and are associated with an extensive aeromagnetic lineament called the New York-Alabama (NY-AL) lineament. The NY-AL lineament is prominent in eastern Tennessee, with a SW-NE trend, and is characterized by a lateral change in magnetic and gravity anomalies in a NW to SE direction; high magnetic and low gravity anomalies lie west of the lineament, while low magnetic and high gravity anomalies are located east of it. The NY-AL lineament is thought to be an ancient strike-slip fault that is reactivating in the present-day stress field. A better understanding of the basement structure within the ETSZ will aid in the assessment of its seismic hazard potential. A network maintained by the Center for Earthquake Research and Information (CERI) at the University of Memphis is located within the study area and consists of 23 short-period and three broadband seismometers. An additional station (TZTN) is maintained by IRIS and is included in our dataset. Receiver functions are computed using teleseismic earthquakes within a 30°-90° epicentral distance, at hypocentral depths greater than 30 km, and with magnitudes greater than Mw 6.0. A vertical component stack is used to obtain the best source function. A spectral water-level deconvolution is then used to calculate the receiver functions. Results indicate a thickening of the crust west of the NY-AL lineament and show vertical variation within the crust and upper mantle, with abrupt polarity changes and strong positive and negative amplitude values. Crustal structure west of the NY-AL lineament appears to be much more complex than that east of it.
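
    The water-level deconvolution mentioned divides the radial by the vertical spectrum while clipping the denominator at a fraction of its maximum power, usually followed by a Gaussian low-pass. A minimal sketch on synthetic impulse traces (the water level and Gaussian width are illustrative):

        import numpy as np

        def waterlevel_rf(radial, vertical, dt, wl=0.01, gauss=2.5):
            n = 2 * len(vertical)
            R = np.fft.rfft(radial, n)
            Z = np.fft.rfft(vertical, n)
            f = np.fft.rfftfreq(n, dt)
            denom = (Z * np.conj(Z)).real
            denom = np.maximum(denom, wl * denom.max())    # water level
            G = np.exp(-(2 * np.pi * f) ** 2 / (4 * gauss**2))
            rf = np.fft.irfft(R * np.conj(Z) / denom * G, n)
            return rf[:len(radial)]

        dt = 0.05
        z = np.zeros(1024); z[100] = 1.0               # direct P on vertical
        r = np.zeros(1024); r[100], r[180] = 0.3, 0.2  # P and Ps on radial
        rf = waterlevel_rf(r, z, dt)                   # peaks at P, Ps lags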

  5. Synergy of seismic, acoustic, and video signals in blast analysis

    SciTech Connect

    Anderson, D.P.; Stump, B.W.; Weigand, J.

    1997-09-01

    The range of mining applications, from hard rock quarrying to coal exposure to mineral recovery, leads to a great variety of blasting practices. A common characteristic of many of the sources is that they are detonated at or near the earth's surface and thus can be recorded by camera or video. Although the primary interest is in the seismic waveforms that these blasts generate, visual observations of the blasts provide important constraints that can be applied to the physical interpretation of the seismic source function. In particular, high-speed images can provide information on the detonation times of individual charges, the timing and amount of mass movement during the blasting process and, in some instances, evidence of wave propagation away from the source. All of these characteristics can be valuable in interpreting the equivalent seismic source function for a set of mine explosions and quantifying the relative importance of the different processes. This paper documents work done at Los Alamos National Laboratory and Southern Methodist University to take standard Hi-8 video of mine blasts, recover digital images from them, and combine them with ground motion records for interpretation. The steps in the data acquisition, processing, display, and interpretation are outlined. The authors conclude that the combination of video with seismic and acoustic signals can be a powerful diagnostic tool for the study of blasting techniques and seismology. A low-cost system for generating similar diagnostics using a consumer-grade video camera and direct-to-disk video hardware is proposed. The application is to verification of the Comprehensive Test Ban Treaty.

  6. Princeton Plasma Physics Laboratory (PPPL) seismic hazard analysis

    SciTech Connect

    Savy, J.

    1989-10-01

    New design and evaluation guidelines for Department of Energy facilities subjected to natural phenomena hazards are being finalized. Although still in draft form at this time, the document describing those guidelines should be considered an update of previously available guidelines. The recommendations in the guidelines document mentioned above, simply referred to as "the guidelines" hereafter, are based on the best information available at the time of its development. In particular, the seismic hazard model for the Princeton site was based on a study performed in 1981 for Lawrence Livermore National Laboratory (LLNL), which relied heavily on the results of the NRC's Systematic Evaluation Program and was based on a methodology and data sets developed in 1977 and 1978. Considerable advances have been made in the last ten years in the domain of seismic hazard modeling. Thus, it is recommended to update the estimate of the seismic hazard at DOE sites whenever possible. The major differences between previous estimates and the ones proposed in this study for PPPL are in the modeling of the strong ground motion at the site, and in the treatment of the total uncertainty in the estimates to include knowledge uncertainty, random uncertainty, and expert opinion diversity as well. 28 refs.

  7. Sources of Error and the Statistical Formulation of MS:mb Seismic Event Screening Analysis

    NASA Astrophysics Data System (ADS)

    Anderson, D. N.; Patton, H. J.; Taylor, S. R.; Bonner, J. L.; Selby, N. D.

    2014-03-01

    The Comprehensive Nuclear-Test-Ban Treaty (CTBT), a global ban on nuclear explosions, is currently in a ratification phase. Under the CTBT, an International Monitoring System (IMS) of seismic, hydroacoustic, infrasonic and radionuclide sensors is operational, and the data from the IMS are analysed by the International Data Centre (IDC). The IDC provides CTBT signatories with basic seismic event parameters and a screening analysis indicating whether an event exhibits explosion characteristics (for example, shallow depth). An important component of the screening analysis is a statistical test of the null hypothesis H0: explosion characteristics, using empirical measurements of seismic energy (magnitudes). The established magnitude used for event size is the body-wave magnitude (denoted mb), computed from the initial segment of a seismic waveform. IDC screening analysis is applied to events with mb greater than 3.5. The Rayleigh-wave magnitude (denoted MS) is a measure of later-arriving surface wave energy. Magnitudes are measurements of seismic energy that include adjustments (physical correction model) for path and distance effects between event and station. Relative to mb, earthquakes generally have a larger MS magnitude than explosions. This article proposes a hypothesis test (screening analysis) using MS and mb that expressly accounts for physical correction model inadequacy in the standard error of the test statistic. With this hypothesis test formulation, the 2009 Democratic People's Republic of Korea announced nuclear weapon test fails to reject the null hypothesis H0: explosion characteristics.
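
    Schematically, such a screen is a one-sided test on the MS:mb difference. In the toy sketch below, the offset delta, the standard error sigma (which, per the paper, should fold in correction-model inadequacy), and the decision level are all hypothetical:

        import numpy as np
        from scipy.stats import norm

        def screen_event(ms, mb, delta=1.25, sigma=0.3, alpha=0.05):
            """True: earthquake-like, screened out; False: H0 retained."""
            t = (ms - mb + delta) / sigma     # delta, sigma hypothetical
            return t > norm.ppf(1.0 - alpha)

        print(screen_event(ms=4.6, mb=4.5))   # large MS: earthquake-like
        print(screen_event(ms=3.4, mb=4.5))   # MS deficit: not screened out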

  8. Slope Stability Analysis In Seismic Areas Of The Northern Apennines (Italy)

    SciTech Connect

    Lo Presti, D.; Fontana, T.; Marchetti, D.

    2008-07-08

    Several research works have been published on slope stability in northern Tuscany (central Italy), particularly in the seismic areas of Garfagnana and Lunigiana (Lucca and Massa-Carrara districts), aimed at analysing slope stability under static and dynamic conditions and at mapping the landslide hazard. In addition, in situ and laboratory investigations are available for the study area, thanks to the activities undertaken by the Tuscany Seismic Survey. Based on this wealth of information, the co-seismic stability of a few idealized slope profiles has been analysed by means of the limit equilibrium method (LEM, pseudo-static) and Newmark sliding block analysis (pseudo-dynamic). The analysis results give indications about the most appropriate seismic coefficient to be used in pseudo-static analysis after establishing an allowable permanent displacement. These indications are commented on in the light of the Italian and European prescriptions for seismic stability analysis with the pseudo-static approach. The stability conditions obtained from these analyses could be used to define microzonation criteria for the study area.
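
    A Newmark sliding-block calculation of the kind referenced above integrates the ground acceleration in excess of the critical (yield) acceleration to obtain a permanent displacement. A minimal downslope-only sketch with a synthetic input motion:

        import numpy as np

        def newmark_displacement(acc, dt, a_crit):
            """acc: ground acceleration [m/s^2]; a_crit: yield accel."""
            vel, disp = 0.0, 0.0
            for a in acc:
                excess = a - a_crit
                if vel > 0.0 or excess > 0.0:      # block is sliding
                    vel = max(vel + excess * dt, 0.0)  # no uphill sliding
                    disp += vel * dt
            return disp

        dt = 0.01
        t = np.arange(0.0, 20.0, dt)
        acc = 2.5 * np.sin(2 * np.pi * t) * np.exp(-0.15 * t)  # synthetic
        print(newmark_displacement(acc, dt, a_crit=1.0), "m")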

  9. Spatiotemporal sequence of Himalayan debris flow from analysis of high-frequency seismic noise

    NASA Astrophysics Data System (ADS)

    Burtin, A.; Bollinger, L.; Cattin, R.; Vergne, J.; Nábělek, J. L.

    2009-10-01

    During the 2003 summer monsoon, the Hi-CLIMB seismological stations deployed across the Himalayan Range detected bursts of high-frequency seismic noise that lasted several hours to days. On the basis of the cross-correlation of seismic envelopes recorded at 11 stations, we show that the largest transient event, on 15 August, was located near a village partially destroyed on that day by a devastating debris flow. This consistency in both space and time suggests that high-frequency seismic noise analysis can be used to monitor debris flow generation as well as the evacuation of the sediment. A systematic study of one year of seismic noise, focusing on the detection of similar events, provides information on the spatial and temporal occurrence of mass movements at the front of the Himalayas. With a 50% probability of occurrence of a daily event, a total of 46 debris flows are seismically detected. Most of them were generated in regions of steep slopes, large gullies, and loose soils during the 2003 summer monsoon storms. These events are compared with local meteorological data to determine rainfall thresholds for slope failures, including the cumulative rainfall needed to bring the soil moisture content to failure capacity. The inferred thresholds are consistent with previous estimates deduced from soil studies as well as from sediment supply investigations in the area. These results point out the potential of using seismic noise as a dedicated tool for monitoring the spatiotemporal occurrence of landslides and debris flows on a regional scale.

  10. Numerical simulation of bubble plumes and an analysis of their seismic attributes

    NASA Astrophysics Data System (ADS)

    Li, Canping; Gou, Limin; You, Jiachun

    2017-04-01

    To study the seismic response characteristics of bubble plumes, a model of a plume water body is built in this article using an acoustic velocity model for bubble-bearing media and stochastic medium theory, based on an analysis of both the acoustic characteristics of a bubble-bearing water body and the actual features of a plume. The finite difference method is used for forward modelling, and the single-shot seismic record exhibits the characteristics of a scattered wave field generated by a plume. A meaningful conclusion is obtained by extracting seismic attributes from the pre-stack shot gather record of a plume: the values of the amplitude-related seismic attributes increase greatly as the bubble content goes up, while changes in bubble radius do not cause the seismic attributes to change. This is primarily because the bubble content has a strong impact on the plume's acoustic velocity, whereas the bubble radius has a weak impact on it. This conclusion provides a theoretical reference for identifying hydrate plumes using seismic methods and contributes to further study of hydrate decomposition and migration, as well as of the distribution of methane bubbles in seawater.
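
    The strong velocity sensitivity to gas fraction is the classical Wood's-equation effect: the mixture compressibility is the volume-weighted average of the phases. A short sketch (representative water and gas properties; the gas fractions are hypothetical):

        import numpy as np

        # Wood's equation: 1/(rho_m c_m^2) = sum_i phi_i / (rho_i c_i^2),
        # with rho_m = sum_i phi_i rho_i.
        def wood_velocity(phi_gas, rho_w=1030.0, c_w=1500.0,
                          rho_g=1.2, c_g=340.0):
            rho_mix = (1 - phi_gas) * rho_w + phi_gas * rho_g
            compress = ((1 - phi_gas) / (rho_w * c_w**2)
                        + phi_gas / (rho_g * c_g**2))
            return 1.0 / np.sqrt(rho_mix * compress)

        for phi in (0.0, 0.001, 0.01, 0.05):
            print(phi, wood_velocity(phi))   # velocity drops sharply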

  11. SHAKING TABLE TEST AND EFFECTIVE STRESS ANALYSIS ON SEISMIC PERFORMANCE WITH SEISMIC ISOLATION RUBBER TO THE INTERMEDIATE PART OF PILE FOUNDATION IN LIQUEFACTION

    NASA Astrophysics Data System (ADS)

    Uno, Kunihiko; Otsuka, Hisanori; Mitou, Masaaki

    Pile foundations are heavily damaged at the boundary between ground types, liquefied and non-liquefied ground, during an earthquake, and there is a possibility of pile collapse. In this study, we conduct a shaking table test and an effective stress analysis of the influence of soil liquefaction and the seismic inertial force exerted on a pile foundation. The section force on the intermediate part of the pile, located at this boundary, can in certain instances exceed that at the pile head. Further, we develop a seismic resistance method for pile foundations in liquefiable ground using seismic isolation rubber, and the intermediate-part seismic isolation system is shown to be very effective.

  12. A non extensive statistical physics analysis of the Hellenic subduction zone seismicity

    NASA Astrophysics Data System (ADS)

    Vallianatos, F.; Papadakis, G.; Michas, G.; Sammonds, P.

    2012-04-01

    The Hellenic subduction zone is the most seismically active region in Europe [Becker & Meier, 2010]. The spatial and temporal distribution of seismicity, as well as the magnitude distribution of earthquakes in the Hellenic subduction zone, have been studied using the concept of Non-Extensive Statistical Physics (NESP) [Tsallis, 1988; Tsallis, 2009]. Non-Extensive Statistical Physics, a generalization of Boltzmann-Gibbs statistical physics, is a suitable framework for studying complex systems (Vallianatos, 2011). Using this concept, Abe & Suzuki (2003; 2005) investigated the spatial and temporal properties of seismicity in California and Japan, and recently Darooneh & Dadashinia (2008) did so in Iran. Furthermore, Telesca (2011) calculated the thermodynamic parameter q of the magnitude distribution of earthquakes in the southern California earthquake catalogue. Using the external seismic zones of 36 seismic sources of shallow earthquakes in the Aegean and the surrounding area [Papazachos, 1990], we formed a dataset of shallow earthquakes (focal depth ≤ 60 km) of the subduction zone, based on the instrumental data of the Geodynamic Institute of the National Observatory of Athens (http://www.gein.noa.gr/, period 1990-2011). The catalogue consists of 12800 seismic events corresponding to 15 polygons of the aforementioned external seismic zones. These polygons define the subduction zone, as they are associated with the compressional stress field that characterizes a subducting regime. For each event, the moment magnitude was calculated from ML according to the suggestions of Papazachos et al. (1997). The cumulative distribution functions of the inter-event times and inter-event distances, as well as the magnitude distribution for each seismic zone, have been estimated, presenting a variation of the q-triplet along the Hellenic subduction zone. The models used fit rather well to the observed

  13. GIS analysis of changes in ecological vulnerability using a SPCA model in the Loess plateau of Northern Shaanxi, China.

    PubMed

    Hou, Kang; Li, Xuxiang; Zhang, Jing

    2015-04-17

    Changes in ecological vulnerability were analyzed for Northern Shaanxi, China, using a geographic information system (GIS). An evaluation model was developed using a spatial principal component analysis (SPCA) model containing land use, soil erosion, topography, climate, vegetation and socio-economic variables. Using this model, an ecological vulnerability index was computed for the research region. Using natural breaks classification (NBC), the evaluation results were divided into five classes: potential, slight, light, medium and heavy. The results indicate greater than average optimism about the conditions of the study region, and the ecological vulnerability index (EVI) of the southern eight counties is lower than that of the northern twelve counties. From 1997 to 2011, the ecological vulnerability index gradually decreased, which means that environmental security was gradually enhanced, although some places have still gradually deteriorated over the past 15 years. In the study area, government and economic factors and precipitation are the main drivers of the changes in ecological vulnerability.

  14. Region-specific deterministic and probabilistic seismic hazard analysis of Kanpur city

    NASA Astrophysics Data System (ADS)

    P, Anbazhagan; Bajaj, Ketan; Dutta, Nairwita; R Moustafa, Sayed S.; N Al-Arifi, Nassir S.

    2017-02-01

    A seismic hazard map of Kanpur city has been developed considering the region-specific seismotectonic parameters within a 500-km radius by deterministic and probabilistic approaches. The maximum probable earthquake magnitude (Mmax) for each seismic source has been estimated by the regional rupture characteristics method and compared with the maximum observed magnitude (Mmax^obs), Mmax^obs + 0.5, and the Kijko method. The most suitable ground motion prediction equations (GMPEs) were selected from 27 applicable GMPEs based on the 'efficacy test'. Furthermore, different weight factors were assigned to the different Mmax values and the selected GMPEs to calculate the final hazard value. Peak ground acceleration and spectral acceleration at 0.2 and 1 s were estimated and mapped for the worst-case scenario and for 2 and 10% probabilities of exceedance in 50 years. Peak ground acceleration (PGA) showed a variation from 0.04 to 0.36 g for DSHA, and from 0.02 to 0.32 g and from 0.092 to 0.1525 g for 2 and 10% probability in 50 years, respectively. A normalised site-specific design spectrum has been developed considering three vulnerable sources based on deaggregation at the city center, and the results are compared with the recent 2011 Sikkim and 2015 Nepal earthquakes and the Indian seismic code IS 1893.

  15. Exposure and Vulnerability Geospatial Analysis Using Earth Observation Data in the City of Liege, Belgium

    NASA Astrophysics Data System (ADS)

    Stephenne, N.; Beaumont, B.; Hallot, E.; Lenartz, F.; Lefebre, F.; Lauwaet, D.; Poelmans, L.; Wolff, E.

    2017-05-01

    Risk situations can be mitigated by prevention measures, early warning tools and adequate monitoring of past experience, where Earth Observation and geospatial analysis add value. This paper discusses the potential use of Earth Observation data, and especially Land Cover / Land Use maps, in addressing the three aspects of risk assessment: danger, exposure and vulnerability. Evidence of the harmful effects of air pollution and heat waves is widely accepted, and these effects should increase in the context of global warming. Moreover, urban areas are generally warmer than their rural surroundings, the so-called urban heat island. Combined with in-situ measurements, this paper presents models of city or local climate (air pollution and urban heat island), with a resolution of less than one kilometer, developed by integrating several sources of information including Earth Observation data and in particular Land Cover / Land Use. This assessment of the danger is then related to a map of exposure and vulnerable people. Using a dasymetric method to disaggregate statistical information on Land Cover / Land Use data, the SmartPop project analyzes the map of danger in parallel with maps of population exposure. A special focus on categories at risk, such as the elderly, has been proposed by Aubrecht and Ozceylan (2013). Perspectives of the project include the integration of a new Land Cover / Land Use map in the danger, exposure and vulnerability models and the discussion of several aspects of risk assessment with the stakeholders of Wallonia.

  16. A Spatiotemporal Analysis of Extreme Heat Vulnerability Across the United States using Geospatial Techniques

    NASA Astrophysics Data System (ADS)

    Schoessow, F. S.; Li, Y.; Howe, P. D.

    2016-12-01

    Extreme heat events are the deadliest natural hazard in the United States and are expected to increase in both severity and frequency in the coming years due to the effects of climate change. The risks of climate change and weather-related events such as heat waves to a population can be more comprehensively assessed by coupling the traditional examination of natural hazards, using remote sensing and geospatial analysis techniques, with human vulnerability factors and individual perceptions of hazards. By analyzing remotely sensed and empirical survey data alongside national hazard advisories, this study endeavors to establish a nationally representative baseline quantifying the spatiotemporal variation of individual heat vulnerabilities at multiple scales and between disparate population groups affected by their unique socioenvironmental factors. This is of immediate academic interest because the study of heat wave risk perceptions remains relatively unexplored, despite the intensification of extreme heat events. The use of "human sensors", i.e. georeferenced and timestamped individual response data, provides invaluable contextualized data at a high spatial resolution, which will enable policy-makers to implement targeted strategies for risk prevention, mitigation, and communication more effectively. As climate change risks are further defined, this cognizance will help identify vulnerable populations and enhance national hazard preparedness and recovery frameworks.

  17. Seismic tomography and MASW as tools improving imaging - uncertainty analysis

    NASA Astrophysics Data System (ADS)

    Marciniak, Artur; Majdański, Mariusz

    2017-04-01

    In recent years, near-surface seismic imaging has become a topic of interest for geoengineers and geologists. In connection with other seismic methods like MASW and travel-time tomography, seismic imaging can provide a more complete model of shallow structures together with an analysis of uncertainty. Often forgotten, uncertainty analysis provides useful information for data interpretation, reducing the possibility of mistakes in projects where the model is applied. Moreover, the application of different methods allows complete utilization of the acquired data for in-depth interpretation, with the possibility of solving problems in other surveys. Applying different processing methods to the same raw data allowed the authors to obtain a more accurate final result, with an uncertainty analysis based on a more complete dataset than in the classical survey scheme.

  18. Preliminary Analysis of Remote Triggered Seismicity in Northern Baja California Generated by the 2011, Tohoku-Oki, Japan Earthquake

    NASA Astrophysics Data System (ADS)

    Wong-Ortega, V.; Castro, R. R.; Gonzalez-Huizar, H.; Velasco, A. A.

    2013-05-01

    We analyze possible variations of seismicity in northern Baja California due to the passage of seismic waves from the 2011, M9.0, Tohoku-Oki, Japan earthquake. The northwestern area of Baja California is characterized by a mountain range composed of crystalline rocks. These Peninsular Ranges of Baja California exhibit high microseismic activity and moderate-size earthquakes. In the eastern region of Baja California, shearing between the Pacific and North American plates takes place, and the Imperial and Cerro Prieto faults generate most of the seismicity. The seismicity in these regions is monitored by the seismic network RESNOM, operated by the Centro de Investigación Científica y de Educación Superior de Ensenada (CICESE). This network consists of 13 three-component seismic stations. We use the seismic catalog of RESNOM to search for changes in local seismic rates after the passage of the surface waves generated by the Tohoku-Oki earthquake. When we compare one month of seismicity before and after the M9.0 earthquake, the preliminary analysis shows an absence of triggered seismicity in the northern Peninsular Ranges and an increase of seismicity south of the Mexicali valley, where the Imperial fault jumps southwest and the Cerro Prieto fault continues.

  19. Seismic fragility analysis of typical pre-1990 bridges due to near- and far-field ground motions

    NASA Astrophysics Data System (ADS)

    Mosleh, Araliya; Razzaghi, Mehran S.; Jara, José; Varum, Humberto

    2016-03-01

    Bridge damage during past earthquakes has caused significant physical and economic impacts on transportation systems. Many of the existing bridges in earthquake-prone areas are pre-1990 bridges designed to out-of-date codes. The strong motions occurring every year in different parts of the world demonstrate the vulnerability of these structures. Nonlinear dynamic time-history analyses were conducted to assess the seismic vulnerability of typical pre-1990 bridges. A family of existing concrete bridges representative of the most common bridges in the highway system in Iran is studied. The seismic demand consists of a set of far-field and near-field strong motions used to evaluate the likelihood of exceeding the seismic capacity of these bridges. The peak ground accelerations (PGAs) were scaled and applied incrementally to the 3D models to evaluate the seismic performance of the bridges. The superstructure was assumed to remain elastic, and the nonlinear behavior of the piers was modeled by assigning plastic hinges in the columns. In this study the displacement ductility and the PGA are selected as the seismic performance indicator and the intensity measure, respectively. The results show that pre-1990 bridges subjected to near-fault ground motions reach minor and moderate damage states.

  20. Real time magma transport imaging and earthquake localization using seismic amplitude ratio analysis

    NASA Astrophysics Data System (ADS)

    Taisne, B.; Brenguier, F.; Nercessian, A.; Beauducel, F.; Smith, P. J.

    2011-12-01

    Seismic amplitude ratio analysis (SARA) has been used successfully to track the sub-surface migration of magma prior to an eruption at Piton de la Fournaise volcano, La Réunion. The methodology is based on the temporal analysis of the seismic amplitude ratios between different pairs of stations, along with a model of seismic wave attenuation. This method has already highlighted the complexity of magma migration in the shallower part of the volcanic edifice during a seismic crisis using continuous records. We show that the method can also be applied to the localization of individual earthquakes detected by triggered monitoring systems, prior to human intervention such as phase picking. As examples, the analysis is performed on two kinds of seismic events observed at Soufrière Hills Volcano, Montserrat, during the last 15 years: hybrid events and volcano-tectonic earthquakes. Finally, we present the implementation of a fully automatic SARA method for monitoring Piton de la Fournaise volcano using continuous data in real time.
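
    The SARA principle lends itself to a compact grid search: with an amplitude decay A(r) ~ r^-n exp(-pi f r / (Q v)), station-pair amplitude ratios depend only on source position. The sketch below recovers a synthetic source; the geometry and attenuation parameters are entirely hypothetical.

        import numpy as np

        stations = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])  # km
        f, Q, v, n_geo = 5.0, 100.0, 3.0, 1.0  # attenuation/spreading model

        def amp(src):
            r = np.linalg.norm(stations - src, axis=1) + 1e-6
            return r**-n_geo * np.exp(-np.pi * f * r / (Q * v))

        obs = amp(np.array([4.0, 6.0]))        # "observed" amplitudes
        log_obs_ratio = np.log(obs[:, None] / obs[None, :])

        best, best_misfit = None, np.inf
        for x in np.linspace(0.0, 10.0, 101):
            for y in np.linspace(0.0, 10.0, 101):
                a = amp(np.array([x, y]))
                m = np.sum((np.log(a[:, None] / a[None, :])
                            - log_obs_ratio)**2)
                if m < best_misfit:
                    best, best_misfit = (x, y), m
        print(best)                            # recovers (4.0, 6.0)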

  1. China's water resources vulnerability: A spatio-temporal analysis during 2003-2013

    NASA Astrophysics Data System (ADS)

    Cai, J.; Varis, O.; Yin, H.

    2015-12-01

    The present highly serious situation of China's water environment and aquatic ecosystems has developed in the context of its stunning socioeconomic development over the past several decades. An analysis with high spatio-temporal resolution of the vulnerability assessment of water resources (VAWR) in China is therefore urgently needed. However, to our knowledge, the temporal dimension of VAWR has not yet been addressed. Consequently, we performed, for the first time, a comprehensive spatio-temporal analysis of China's water resources vulnerability (WRV), using a composite index approach with an array of aspects highlighting key challenges that China's water resources system is facing today. During our study period of 2003-2013, the political weight of China's integrated water resources management increased continuously. Hence, it is essential, based on the historical socioeconomic changes influenced by water-environment policy making and implementation, to reveal China's WRV in order to pinpoint key challenges to the healthy functioning of its water resources system. The water resources system in the North and Central Coast regions appeared more vulnerable than that in Western China. China's water use efficiency has grown substantially over the study period, and so has water supply and sanitation coverage. In contrast, water pollution has worsened remarkably in most parts of China, and so have water scarcity and shortage in the most stressed parts of the country. This spatio-temporal analysis implies that the key challenges to China's water resources system are rooted not only in the geographical mismatch between socioeconomic development (e.g. water demand) and water resources endowments (e.g. water resources availability), but also in the intertwinement between socioeconomic development and national strategic policy making.

  2. Application and Validation of a GIS Model for Local Tsunami Vulnerability and Mortality Risk Analysis

    NASA Astrophysics Data System (ADS)

    Harbitz, C. B.; Frauenfelder, R.; Kaiser, G.; Glimsdal, S.; Sverdrup-thygeson, K.; Løvholt, F.; Gruenburg, L.; Mc Adoo, B. G.

    2015-12-01

    The 2011 Tōhoku tsunami caused a high number of fatalities and massive destruction. Data collected after the event allow for retrospective analyses. Since 2009, NGI has developed a generic GIS model for local analyses of tsunami vulnerability and mortality risk. The mortality risk convolves the hazard, the exposure, and the vulnerability. The hazard is represented by the maximum tsunami flow depth (with a corresponding likelihood), the exposure is described by the population density in time and space, and the vulnerability is expressed by the probability of being killed as a function of flow depth and building class. The analysis is further based on high-resolution DEMs. Normally, a certain tsunami scenario with a corresponding return period is applied for vulnerability and mortality risk analysis. Hence, the model was first employed for a tsunami forecast scenario affecting Bridgetown, Barbados, and further developed in a forecast study for the city of Batangas in the Philippines. Subsequently, the model was tested by hindcasting the 2009 South Pacific tsunami in American Samoa. This hindcast was based on post-tsunami information. The GIS model was adapted for optimal use of the available data and successfully estimated the degree of mortality. For further validation and development, the model was recently applied in the RAPSODI project for hindcasting the 2011 Tōhoku tsunami in Sendai and Ishinomaki. With reasonable choices of building vulnerability, the estimated expected number of fatalities agrees well with the reported death toll. The results of the mortality hindcast for the 2011 Tōhoku tsunami substantiate that the GIS model can help to identify areas of high tsunami mortality risk, as well as the main risk drivers. The research leading to these results has received funding from the CONCERT-Japan Joint Call on Efficient Energy Storage and Distribution/Resilience against Disasters (http://www.concertjapan.eu; project RAPSODI - Risk Assessment and design of

  3. A Comparative Analysis of Disaster Risk, Vulnerability and Resilience Composite Indicators

    PubMed Central

    Beccari, Benjamin

    2016-01-01

    related to the social environment, 25% to the disaster environment, 20% to the economic environment, 13% to the built environment, 6% to the natural environment and 3% were other indices. However, variables specifically measuring action to mitigate or prepare for disasters comprised, on average, only 12% of the total number of variables in each index. Only 19% of methodologies employed any sensitivity or uncertainty analysis, and in only a single case was this comprehensive. Discussion: A number of potential limitations of the present state of practice, and how these might impact on decision makers, are discussed. In particular, the limited deployment of sensitivity and uncertainty analysis and the low use of direct measures of disaster risk, vulnerability and resilience could significantly limit the quality and reliability of existing methodologies. Recommendations for improvements to indicator development and use are made, as well as suggested future research directions to enhance the theoretical and empirical knowledge base for composite indicator development. PMID:27066298

  4. Geo-ethical dimension of community's safety: rural and urban population vulnerability analysis methodology

    NASA Astrophysics Data System (ADS)

    Kostyuchenko, Yuriy; Movchan, Dmytro; Kopachevsky, Ivan; Yuschenko, Maxim

    2016-04-01

    The modern world is based on relations more than on causalities, so communicative, socio-economic, and socio-cultural issues are important for understanding the nature of risks and for making correct, ethical decisions. Today, a major part of risk analysts recognize the new nature of modern risks. We face coherent or systemic risks, whose realization leads to domino effects and unexpected growth of losses and fatalities. This type of risk originates from the complicated nature of the heterogeneous environment, the close interconnection of engineering networks, and the changing structure of society. A heterogeneous multi-agent environment generates systemic risks, which require the analysis of multi-source data with sophisticated tools. A formal basis for the analysis of this type of risk has been developed during the last 5-7 years, but issues of social fairness, ethics, and education require further development. One aspect of the analysis of the social issues of risk management is studied in this paper, and a formal algorithm for the quantitative analysis of multi-source data is proposed. As demonstrated, using the proposed methodological basis and algorithm, it is possible to obtain a regularized spatio-temporal distribution of the investigated parameters over the whole observation period with rectified reliability and controlled uncertainty. The disaster data analysis demonstrates that about half of direct disaster damage may be caused by social factors: education, experience, and social behaviour. Using the data presented, it is also possible to estimate quantitative parameters of the loss distributions: the relation between education, age, experience, and losses, as well as vulnerability (in terms of probable damage) with respect to financial status at the current social density. It is demonstrated that, at the broad scale, education determines risk perception and thus the vulnerability of societies, but at the local level there are important heterogeneities, and land-use and urbanization structure essentially influence vulnerability. The way to

  5. An interdisciplinary perspective on social and physical determinants of seismic risk

    NASA Astrophysics Data System (ADS)

    Lin, K.-H. E.; Chang, Y.-C.; Liu, G.-Y.; Chan, C.-H.; Lin, T.-H.; Yeh, C.-H.

    2015-10-01

    While disaster studies researchers usually view risk as a function of hazard, exposure, and vulnerability, few studies have systematically examined the relationships among the various physical and socioeconomic determinants underlying disasters, and fewer have done so through seismic risk analysis. In the context of the 1999 Chi-Chi earthquake in Taiwan, this study constructs three statistical models to test different determinants that affect disaster fatality at the village level, including seismic hazard, exposure of population and fragile buildings, and demographic and socioeconomic vulnerability. The Poisson regression model is used to estimate the impact of these factors on fatalities. Research results indicate that although all of the determinants have an impact on seismic fatality, some indicators of vulnerability, such as the gender ratio, the percentages of young and aged population, and income and its standard deviation, are important determinants that exacerbate seismic risk. These findings have strong social implications for policy interventions to mitigate such disasters.
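
    A hedged sketch of the kind of village-level model described above: a Poisson regression of fatality counts on hazard, exposure and vulnerability covariates. The data are synthetic and the variable names illustrative, not the Chi-Chi dataset.

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 200
    pga = rng.uniform(0.1, 1.0, n)        # seismic hazard proxy
    fragile = rng.uniform(0.0, 0.5, n)    # share of fragile buildings (exposure)
    elderly = rng.uniform(0.05, 0.3, n)   # share of aged population (vulnerability)
    lam = np.exp(-3 + 3.0 * pga + 2.0 * fragile + 1.5 * elderly)
    deaths = rng.poisson(lam)             # simulated village fatality counts

    X = sm.add_constant(np.column_stack([pga, fragile, elderly]))
    model = sm.GLM(deaths, X, family=sm.families.Poisson()).fit()
    print(model.params)  # coefficients recover the assumed effects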

  6. BNL NONLINEAR PRE TEST SEISMIC ANALYSIS FOR THE NUPEC ULTIMATE STRENGTH PIPING TEST PROGRAM.

    SciTech Connect

    DEGRASSI,G.; HOFMAYER,C.; MURPHY,C.; SUZUKI,K.; NAMITA,Y.

    2003-08-17

    The Nuclear Power Engineering Corporation (NUPEC) of Japan has been conducting a multi-year research program to investigate the behavior of nuclear power plant piping systems under large seismic loads. The objectives of the program are: to develop a better understanding of the elasto-plastic response and ultimate strength of nuclear piping; to ascertain the seismic safety margin of current piping design codes; and to assess new piping code allowable stress rules. Under this program, NUPEC has performed a large-scale seismic proving test of a representative nuclear power plant piping system. In support of the proving test, a series of materials tests, static and dynamic piping component tests, and seismic tests of simplified piping systems have also been performed. As part of collaborative efforts between the United States and Japan on seismic issues, the US Nuclear Regulatory Commission (USNRC) and its contractor, the Brookhaven National Laboratory (BNL), are participating in this research program by performing pre-test and post-test analyses, and by evaluating the significance of the program results with regard to safety margins. This paper describes BNL's pre-test analysis to predict the elasto-plastic response for one of NUPEC's simplified piping system seismic tests. The capability to simulate the anticipated ratcheting response of the system was of particular interest. Analyses were performed using classical bilinear and multilinear kinematic hardening models as well as a nonlinear kinematic hardening model. Comparisons of analysis results for each plasticity model against test results for a static cycling elbow component test and for a simplified piping system seismic test are presented in the paper.
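
    For readers unfamiliar with the plasticity models named above, a minimal one-dimensional bilinear kinematic hardening update (elastic predictor plus return mapping) looks like the following; it illustrates the model class only and is not BNL's piping analysis.

    import numpy as np

    # E = elastic modulus, H = kinematic hardening modulus, sy = yield stress
    # (illustrative steel-like values in MPa); alpha is the back stress.
    def kinematic_step(sigma, alpha, deps, E=200e3, H=10e3, sy=250.0):
        trial = sigma + E * deps          # elastic predictor
        f = abs(trial - alpha) - sy       # yield function with back stress
        if f <= 0.0:
            return trial, alpha           # elastic step
        dgamma = f / (E + H)              # plastic multiplier from consistency
        n = np.sign(trial - alpha)
        return trial - E * dgamma * n, alpha + H * dgamma * n

    # Strain-controlled cycling traces a shifted (kinematic) hysteresis loop
    sigma, alpha = 0.0, 0.0
    for deps in [2e-3] * 3 + [-2e-3] * 6 + [2e-3] * 3:
        sigma, alpha = kinematic_step(sigma, alpha, deps)
    print(round(sigma, 1), round(alpha, 1))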

  7. Statistical analysis in the Natural Time Domain of the seismicity in México

    NASA Astrophysics Data System (ADS)

    Ramírez-Rojas, A.; Moreno-Torres, L. R.; Flores-Marquez, E. L.

    2012-04-01

    This work presents a statistical analysis, performed in the natural time domain, of the seismicity that occurred in México within the period 2000-2011. The data set corresponds to the seismic activity recorded by the National Seismological Service (SSN). Our study covers the Mexican Pacific coast, comprising the states of Baja California, Jalisco, Michoacán, Guerrero, and Oaxaca. Preliminary results of the analysis of the power spectrum, order parameter, and entropy fluctuations in the natural time domain show good consistency with natural time theory and the latest reported tectonic findings; this supports the hypothesis that different tectonic mechanisms act as seismicity triggers.
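
    In natural time analysis, the k-th of N events is assigned χ_k = k/N and weighted by its normalised energy p_k; the order parameter κ1 is the variance of χ under p. A small sketch, with invented event energies rather than the SSN catalogue:

    import numpy as np

    def kappa1(energies):
        """Natural time order parameter: kappa_1 = <chi^2> - <chi>^2."""
        p = np.asarray(energies, float)
        p = p / p.sum()                         # normalised event energies
        chi = np.arange(1, p.size + 1) / p.size # natural time of each event
        return np.sum(p * chi**2) - np.sum(p * chi)**2

    # Rough moment-like energies from magnitudes, 10^(1.5 M); values invented
    energies = 10 ** (1.5 * np.array([4.1, 4.8, 5.2, 4.3, 6.0, 4.9]))
    # Values near 0.070 are reported in the natural time literature as the
    # critical signature.
    print(kappa1(energies))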

  8. Probability problems in seismic risk analysis and load combinations for nuclear power plants

    SciTech Connect

    George, L.L.

    1983-01-01

    This workshop describes some probability problems in power plant reliability and maintenance analysis. The problems are seismic risk analysis, loss of load probability, load combinations, and load sharing. The seismic risk problem is to compute power plant reliability given an earthquake and the resulting risk. A component survives if its peak random response to the earthquake does not exceed its strength. Power plant survival is a complicated Boolean function of component failures and survivals. The responses and strengths of components are dependent random processes, and the peak responses are maxima of random processes. The resulting risk is the expected cost of power plant failure.
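
    The survival criterion above lends itself to a direct Monte Carlo estimate: sample correlated random response and strength, and count exceedances. The lognormal distributions and the correlation value below are assumptions for illustration, not the workshop's models.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000
    ln_resp_mu, ln_resp_sd = np.log(0.5), 0.4   # peak response (g), lognormal
    ln_cap_mu, ln_cap_sd = np.log(1.2), 0.3     # strength (g), lognormal
    rho = 0.3                                   # response-strength dependence

    z1 = rng.standard_normal(n)
    z2 = rho * z1 + np.sqrt(1 - rho**2) * rng.standard_normal(n)
    response = np.exp(ln_resp_mu + ln_resp_sd * z1)
    strength = np.exp(ln_cap_mu + ln_cap_sd * z2)

    p_fail = np.mean(response > strength)
    print(p_fail)  # component failure probability given the earthquake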

  9. What meta-analysis can tell us about vulnerability of marine biodiversity to ocean acidification?

    NASA Astrophysics Data System (ADS)

    Dupont, S.; Dorey, N.; Thorndyke, M.

    2010-09-01

    Ocean acidification has been proposed as a major threat for marine biodiversity. Hendriks et al. [Hendriks, I.E., Duarte, C.M., Alvarez, M., 2010. Vulnerability of marine biodiversity to ocean acidification: a meta-analysis. Estuarine, Coastal and Shelf Science, doi:10.1016/j.ecss.2009.11.022.] proposed an alternative view and suggested, based on a meta-analysis, that marine biota may be far more resistant to ocean acidification than hitherto believed. However, such a meta-analytical approach can mask more subtle features, for example differing sensitivities during the life-cycle of an organism. Using a similar metric on an echinoderm database, we show that key bottlenecks present in the life-cycle (e.g. larvae being more vulnerable than adults) and responsible for driving the whole species response may be hidden in a global meta-analysis. Our data illustrate that any ecological meta-analysis should be hypothesis driven, taking into account the complexity of biological systems, including all life-cycle stages and key biological processes. Available data allow us to conclude that near-future ocean acidification can/will have a dramatic negative impact on some marine species, including echinoderms, with likely consequences at the ecosystem level.

  10. Seismic isolation of an electron microscope

    SciTech Connect

    Godden, W.G.; Aslam, M.; Scalise, D.T.

    1980-01-01

    A unique two-stage dynamic-isolation problem is presented by the conflicting design requirements for the foundations of an electron microscope in a seismic region. Under normal operational conditions the microscope must be isolated from ambient ground noise; this creates a system extremely vulnerable to seismic ground motions. Under earthquake loading the internal equipment forces must be limited to prevent damage or collapse. An analysis of the proposed design solution is presented. This study was motivated by the 1.5 MeV High Voltage Electron Microscope (HVEM) to be installed at the Lawrence Berkeley Laboratory (LBL) located near the Hayward Fault in California.
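
    The conflict described above can be seen in the textbook transmissibility of a base-excited single-degree-of-freedom isolator: a soft mount attenuates ambient noise well above its natural frequency but amplifies motion near resonance, where earthquake energy concentrates. A sketch (single-stage only; the actual HVEM design is two-stage and more elaborate):

    import numpy as np

    def transmissibility(r, zeta):
        """Absolute transmissibility of a base-excited SDOF isolator.
        r = forcing/natural frequency ratio, zeta = damping ratio."""
        num = 1 + (2 * zeta * r) ** 2
        den = (1 - r**2) ** 2 + (2 * zeta * r) ** 2
        return np.sqrt(num / den)

    r = np.array([0.5, 1.0, 3.0, 10.0])
    print(transmissibility(r, 0.05))
    # >1 near resonance (seismic vulnerability), <<1 well above it (noise
    # isolation): soft mounts isolate ambient noise but can amplify
    # low-frequency earthquake input.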

  11. Orienting Ocean Bottom Seismic Sensors from Ship Noise Polarization Analysis

    NASA Astrophysics Data System (ADS)

    Barruol, Guilhem; Dreo, Richard; Fontaine, Fabrice R.; Scholz, John R.; Sigloch, Karin; Geay, Bruno; Bouillon, Alexandre

    2017-04-01

    For the RHUM-RUM project (Réunion Hotspot and Upper Mantle - Réunions Unterer Mantel, www.rhum-rum.net), a network of 57 ocean-bottom seismometers (OBS) was installed on the ocean floor around La Réunion Island in the SW Indian Ocean. Part of the network happened to be located beneath a route of heavy ship traffic connecting SE Asia and the South Atlantic region. We analysed the ship noise recorded on the OBS and show that it can be used to determine the horizontal orientations of the seismic instruments as they were recording on the ocean floor. The OBS, provided by the German DEPAS and the French INSU OBS national pools, were equipped with wide-band or broad-band three-component seismic and hydro-acoustic sensors. They were deployed in Nov. 2012 by R/V Marion Dufresne and recovered by R/V Meteor one year later. Depending on the configuration, the OBS recorded for 8 to 13 months. By combining the trajectories of passing ships - provided by AIS (Automatic Identification System) GPS data - with our geophysical data recorded on the ocean floor, we show that both hydro-acoustic and seismic spectral analyses exhibit clear signals associated with vessels between 1 and 50 Hz, in the high-frequency range of our instruments. Large cargo vessels are detected several hours before and after their closest point of approach (CPA) and show clear Doppler effects which put quantitative constraints on their distances and speeds. By analysing the continuous noise polarization on the three seismic components, we show that the polarization of the noise emitted by ships passing in the neighbourhood of an ocean-bottom seismometer can be used for retrieving the orientation of the OBS horizontal components on the ocean floor with respect to the geographic reference frame. We find good agreement between OBS orientations thus calculated from ship noise and the OBS orientations determined independently from teleseismic body and surface wave polarization methods (Scholz et al., GJI
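
    The core of such a polarization analysis can be sketched as follows: the principal eigenvector of the horizontal covariance matrix gives the noise polarization azimuth in the sensor frame, and comparison with the ship azimuth known from AIS yields the sensor misorientation. The data are synthetic; the real workflow adds band-pass filtering, many ship passages and uncertainty estimates.

    import numpy as np

    rng = np.random.default_rng(2)
    true_misorientation = 40.0            # deg, unknown in practice
    ship_azimuth = 120.0                  # deg from AIS, relative to north

    # Noise polarized along the ship azimuth, recorded on rotated components
    s = rng.standard_normal(5000)
    az = np.deg2rad(ship_azimuth - true_misorientation)   # azimuth in sensor frame
    x = s * np.cos(az) + 0.1 * rng.standard_normal(5000)  # sensor "north"
    y = s * np.sin(az) + 0.1 * rng.standard_normal(5000)  # sensor "east"

    # Principal eigenvector of the 2x2 covariance = polarization azimuth
    # (with an inherent 180-degree ambiguity)
    cov = np.cov(np.vstack([x, y]))
    w, v = np.linalg.eigh(cov)
    pol = np.rad2deg(np.arctan2(v[1, -1], v[0, -1])) % 180

    print((ship_azimuth - pol) % 180)     # recovers the ~40 deg misorientation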

  12. Statistical analysis and modeling of seismicity related to the exploitation of geothermal energy

    NASA Astrophysics Data System (ADS)

    Dinske, Carsten; Langenbruch, Cornelius; Shapiro, Serge

    2016-04-01

    catalogs of the considered reservoirs contain approximately 50 per cent of the number of events in the original catalogs. Furthermore, we perform ETAS modeling (Epidemic Type Aftershock model, Ogata, 1985, 1988) for two reasons. First, we want to understand whether the different reservoirs are also comparable in their earthquake interaction patterns and hence in the aftershock triggering following larger-magnitude induced events. Second, if we identify systematic patterns, the ETAS modeling can contribute to the forecast and consequently to the mitigation of seismicity during the production of geothermal energy. We find that stationary ETAS models cannot accurately capture the observed seismicity rate changes. One reason for this finding is the rate of induced events (or the background activity in the ETAS model), which is not constant in time. Therefore we apply non-stationary ETAS modeling, which results in a good agreement between observation and model. However, the needed non-stationarity in the process complicates the application of ETAS modeling for the forecast of seismicity during production. Thus, its implementation in so-called traffic-light systems for the mitigation of possible seismic hazard requires further detailed analysis.
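
    For reference, the ETAS conditional intensity that such modeling fits has the form λ(t) = μ + Σ_{t_i<t} K·e^{α(m_i−m0)}·(t−t_i+c)^{−p}; the study's point is that μ must itself be allowed to vary in time. A sketch with illustrative parameter values:

    import numpy as np

    def etas_intensity(t, times, mags, mu=0.2, K=0.05, alpha=1.2,
                       c=0.01, p=1.1, m0=0.5):
        """ETAS conditional intensity (events/day) at time t, given the
        history of event times (days) and magnitudes."""
        past = times < t
        dt = t - times[past]
        return mu + np.sum(K * np.exp(alpha * (mags[past] - m0)) * (dt + c) ** (-p))

    times = np.array([1.0, 1.5, 2.0, 2.05])   # days
    mags = np.array([1.8, 0.9, 2.6, 1.1])
    print(etas_intensity(2.1, times, mags))   # elevated rate just after the M2.6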

  13. Use of cartography in historical seismicity analysis: a reliable tool to better apprehend the contextualization of the historical documents

    NASA Astrophysics Data System (ADS)

    Thibault, Fradet; Grégory, Quenet; Kevin, Manchuel

    2014-05-01

    Historical studies, including historical seismicity analysis, deal with historical documents. Numerous factors, such as culture, social condition, demography, political situations, and political or religious opinions, influence the way events are transcribed in the archives. As a consequence, it is crucial to contextualize and compare the historical documents reporting on a given event in order to reduce the uncertainties affecting their analysis and interpretation. When studying historical seismic events it is often tricky to have a global view of all the information provided by the historical documents. It is also difficult to extract cross-correlated information from the documents and draw a precise historical context. Cartographic and geographic tools in GIS software are the best means for the synthesis, interpretation and contextualization of the historical material. The main goal is to produce the most complete dataset of available information, in order to take into account all the components of the historical context and consequently improve the macroseismic analysis. The Entre-Deux-Mers earthquake (1759, Iepc= VII-VIII) [SISFRANCE 2013 - EDF-IRSN-BRGM] is well documented but has never benefited from a cross-analysis of historical documents and historical context elements. The map of available intensity data from SISFRANCE highlights a gap in macroseismic information within the estimated epicentral area. The aim of this study is to understand the origin of this gap by making a cartographic compilation of both archive information and historical context elements. The results support the hypothesis that the lack of documents and macroseismic data in the epicentral area is related to low human activity rather than to low seismic effects in this zone. Topographic features, geographical position, flood hazard, road and pathway locations, vineyard distribution and forest coverage, mentioned in the archives and reported on Cassini's map, confirm this

  14. Improving resolution of crosswell seismic section based on time-frequency analysis

    SciTech Connect

    Luo, H.; Li, Y.

    1994-12-31

    According to signal theory, improving the resolution of a seismic section means extending the high-frequency band of the seismic signal. In a crosswell section, a sonic log can be regarded as a reliable source of high-frequency information for the trace near the borehole. In that case, the task is to introduce this high-frequency information into the whole section. However, neither traditional deconvolution algorithms nor some newer inversion methods such as BCI (Broad Constraint Inversion) are satisfactory, because of high-frequency noise and the nonuniqueness of inversion results, respectively. To overcome their disadvantages, this paper presents a new algorithm based on Time-Frequency Analysis (TFA) technology, which has increasingly received attention as a useful signal analysis tool. Practical applications show that the new method is a stable scheme that greatly improves the resolution of crosswell seismic sections without decreasing the signal-to-noise ratio (SNR).

  15. Dynamics of the Bingham Canyon Mine landslides from seismic signal analysis

    NASA Astrophysics Data System (ADS)

    Hibert, Clément; Ekström, Göran; Stark, Colin P.

    2014-07-01

    Joint interpretation of long- and short-period seismic signals generated by landslides sheds light on the dynamics of slope failure, providing constraints on landslide initiation and termination and on the main phases of acceleration and deceleration. We carry out a combined analysis of the seismic signals generated by two massive landslides that struck the Bingham Canyon Mine pit on 10 April 2013. Inversion of the long-period waveforms yields time series for the bulk landslide forces and momenta, from which we deduce runout trajectories consistent with the deposit morphology. Comparing these time series with the short-period seismic data, we are able to infer when and where major changes take place in landslide momentum along the runout path. This combined analysis points to a progressive fracturing of the masses during acceleration, indicates that deceleration starts the moment they reach the pit floor, and suggests that the bulk movement is stopped by a topographic barrier.
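
    The step from inverted forces to momenta can be sketched directly: the seismic source force is the negative rate of change of the bulk momentum, so integrating the force history gives momentum and, with an assumed mass, bulk velocity. The force pulse and mass below are synthetic, not the Bingham Canyon solution.

    import numpy as np

    t = np.linspace(0, 100, 1001)                 # s
    dt = t[1] - t[0]
    # Synthetic reaction-force pulse: acceleration then deceleration phase (N)
    force = -1e9 * np.sin(2 * np.pi * t / 100)

    momentum = -np.cumsum(force) * dt             # p(t) = -integral of F dt (kg m/s)
    mass = 1.5e10                                 # kg, assumed bulk mass
    velocity = momentum / mass
    print(round(velocity.max(), 2))               # peak bulk speed (m/s)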

  16. 3D seismic data de-noising and reconstruction using Multichannel Time Slice Singular Spectrum Analysis

    NASA Astrophysics Data System (ADS)

    Rekapalli, Rajesh; Tiwari, R. K.; Sen, Mrinal K.; Vedanti, Nimisha

    2017-05-01

    Noise and data gaps complicate seismic data processing and subsequently cause difficulties in geological interpretation. We discuss a recent development and application of the Multi-channel Time Slice Singular Spectrum Analysis (MTSSSA) for 3D seismic data de-noising in the time domain. In addition, L1-norm based simultaneous data gap filling of 3D seismic data using MTSSSA is also discussed. We discriminate the noise from individual time slices of 3D volumes by analyzing the eigentriplets of the trajectory matrix. We first tested the efficacy of the method on 3D synthetic seismic data contaminated with noise and then applied it to post-stack seismic reflection data acquired at the Sleipner CO2 storage site (pre and post CO2 injection) in Norway. Our analysis suggests that the MTSSSA algorithm is effective for enhancing the S/N for better identification of amplitude anomalies, along with simultaneous data gap filling. The bright spots identified in the de-noised data indicate upward migration of CO2 towards the top of the Utsira formation. The reflections identified by applying MTSSSA to pre- and post-injection data correlate well with the geology of the Southern Viking Graben (SVG).

  17. Seismic fragility evaluation of a piping system in a nuclear power plant by shaking table test and numerical analysis

    SciTech Connect

    Kim, M. K.; Kim, J. H.; Choi, I. K.

    2012-07-01

    In this study, a seismic fragility evaluation of the piping system in a nuclear power plant was performed. The evaluation of the seismic fragility of the piping system progressed in three steps. First, several piping element capacity tests were performed. The monotonic and cyclic loading tests were conducted under the same internal pressure level as actual nuclear power plants to evaluate the performance. Cracks and wall thinning were considered as degradation factors of the piping system. Second, a shaking table test was performed to evaluate the seismic capacity of a selected piping system. Multi-support seismic excitation was applied to account for differences in support elevation. Finally, a numerical analysis was performed for the assessment of the seismic fragility of the piping system. As a result, a seismic fragility was obtained for the piping system of an NPP in Korea by using a shaking table test and numerical analysis. (authors)
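
    The usual end product of such test-plus-analysis programs is a lognormal fragility curve, P(failure | a) = Φ(ln(a/Am)/β), with median capacity Am and composite log-standard deviation β. A sketch with illustrative parameter values, not the paper's:

    import numpy as np
    from scipy.stats import norm

    def fragility(pga, Am=1.8, beta=0.45):
        """Probability of failure at peak ground acceleration `pga` (g)."""
        return norm.cdf(np.log(pga / Am) / beta)

    for a in [0.3, 0.9, 1.8, 3.0]:
        print(f"PGA {a:.1f} g -> P(failure) = {fragility(a):.3f}")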

  18. Accuracy of three-dimensional seismic ground response analysis in time domain using nonlinear numerical simulations

    NASA Astrophysics Data System (ADS)

    Liang, Fayun; Chen, Haibing; Huang, Maosong

    2017-07-01

    To provide appropriate uses of nonlinear ground response analysis for engineering practice, a three-dimensional soil column with a distributed mass system and a time-domain numerical analysis were implemented on the OpenSees simulation platform. A standard mesh for the three-dimensional soil column was suggested to satisfy the specified maximum frequency. The layered soil column was divided into multiple sub-soils with different viscous damping matrices according to their shear velocities, as the soil properties were significantly different. It was necessary to use a combination of other one-dimensional or three-dimensional nonlinear seismic ground analysis programs to confirm the applicability of nonlinear seismic ground motion response analysis procedures in soft soil or for strong earthquakes. The accuracy of the three-dimensional soil column finite element method was verified by dynamic centrifuge model testing under different peak accelerations of the earthquake. As a result, nonlinear seismic ground motion response analysis procedures were improved in this study. The accuracy and efficiency of the three-dimensional seismic ground response analysis can be adapted to the requirements of engineering practice.
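
    A common rule of thumb for the mesh condition mentioned above is that the element size must resolve the shortest propagating wavelength, h ≤ Vs,min/(n·fmax), with n ≈ 8-10 points per wavelength; the paper's exact criterion is not reproduced here, and n = 10 below is an assumption.

    def max_element_size(vs_min, f_max, points_per_wavelength=10):
        """Maximum element size (m) resolving frequency f_max (Hz) in a
        layer with minimum shear velocity vs_min (m/s)."""
        return vs_min / (points_per_wavelength * f_max)

    print(max_element_size(vs_min=150.0, f_max=15.0))  # soft soil: h <= 1.0 m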

  19. Magma replenishment and volcanic unrest inferred from the analysis of VT micro-seismicity and seismic velocity changes at Piton de la Fournaise Volcano

    NASA Astrophysics Data System (ADS)

    Brenguier, F.; Rivemale, E.; Clarke, D. S.; Schmid, A.; Got, J.; Battaglia, J.; Taisne, B.; Staudacher, T.; Peltier, A.; Shapiro, N. M.; Tait, S.; Ferrazzini, V.; Di Muro, A.

    2011-12-01

    allow magma to reach the edifice summit. Moreover, we have identified transient seismic velocity changes lasting a few weeks that could be associated with unreported lateral magma intrusions not leading to eruptions. The clustering of pre-eruptive micro-seismicity between mid-1999 and 2003 shows that seismic events repeat over successive seismic swarms and suggests that the magma pathway is spatially separated from the seismic faults. Also, the inversion for focal mechanisms shows dominant sub-horizontal P-axes, indicating that part of the pre-eruptive micro-seismicity is due to the horizontal compressive stress induced by magma injection. Finally, the analysis of long-term GPS data recorded on the edifice flank shows a constant lateral displacement rate of 3.5 cm/year. More work will be needed in order to infer the possible mutual interactions between magma unrest and transport and the large-scale deformation of the edifice flank.

  1. Analysis of subsalt induced seismicity in the Netherlands

    NASA Astrophysics Data System (ADS)

    Kraaijpoel, D.; Dost, B.

    2012-04-01

    A number of natural gas fields in the north of the Netherlands show moderate seismicity induced by gas extraction. The gas reservoirs are located underneath a thick layer of Zechstein evaporites (salt). The presence of the salt has two important effects on the wave motions of induced events as observed at the surface close to the epicenter. The first effect is the defocusing of seismic energy, with its consequences for observed amplitudes and radiation patterns. The second effect is the relatively strong conversion from P- to S-energy at the bottom of the salt, leading to the presence of S-wave precursors. Failure to recognize these effects may lead to misinterpretation of the source location and mechanism. Moreover, the S-wave precursors provide a handle to reduce uncertainty in depth estimation. We investigate the effects using a number of strong motion records measured at short epicentral distances for some of the stronger recent events (M2.0-3.5) in the Groningen field.

  2. Optimization Strategies for the Vulnerability Analysis of the Electric Power Grid

    SciTech Connect

    Pinar, A.; Meza, J.; Donde, V.; Lesieutre, B.

    2007-11-13

    Identifying small groups of lines, whose removal would cause a severe blackout, is critical for the secure operation of the electric power grid. We show how power grid vulnerability analysis can be studied as a mixed integer nonlinear programming (MINLP) problem. Our analysis reveals a special structure in the formulation that can be exploited to avoid nonlinearity and approximate the original problem as a pure combinatorial problem. The key new observation behind our analysis is the correspondence between the Jacobian matrix (a representation of the feasibility boundary of the equations that describe the flow of power in the network) and the Laplacian matrix in spectral graph theory (a representation of the graph of the power grid). The reduced combinatorial problem is known as the network inhibition problem, for which we present a mixed integer linear programming formulation. Our experiments on benchmark power grids show that the reduced combinatorial model provides an accurate approximation, to enable vulnerability analyses of real-sized problems with more than 10,000 power lines.
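
    The Laplacian correspondence noted above is easy to demonstrate: on a toy graph, the Fiedler vector (the eigenvector of the second-smallest Laplacian eigenvalue) localises the weakest cut, a spectral proxy for the small line groups the combinatorial search targets. The topology below is a toy example, not a benchmark grid.

    import numpy as np

    edges = [(0, 1), (1, 2), (2, 0),   # dense cluster A
             (3, 4), (4, 5), (5, 3),   # dense cluster B
             (2, 3)]                   # single weak tie between the clusters

    n = 6
    L = np.zeros((n, n))               # graph Laplacian: degree minus adjacency
    for i, j in edges:
        L[i, i] += 1; L[j, j] += 1
        L[i, j] -= 1; L[j, i] -= 1

    w, v = np.linalg.eigh(L)
    fiedler = v[:, 1]                  # eigenvector of the 2nd-smallest eigenvalue
    print(np.sign(fiedler))            # sign change separates A from B: cutting
                                       # edge (2, 3) disconnects the network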

  3. Optimization strategies for the vulnerability analysis of the electric power grid.

    SciTech Connect

    Meza, Juan C.; Pinar, Ali; Lesieutre, Bernard; Donde, Vaibhav

    2009-03-01

    Identifying small groups of lines, whose removal would cause a severe blackout, is critical for the secure operation of the electric power grid. We show how power grid vulnerability analysis can be studied as a mixed integer nonlinear programming (minlp) problem. Our analysis reveals a special structure in the formulation that can be exploited to avoid nonlinearity and approximate the original problem as a pure combinatorial problem. The key new observation behind our analysis is the correspondence between the Jacobian matrix (a representation of the feasibility boundary of the equations that describe the flow of power in the network) and the Laplacian matrix in spectral graph theory (a representation of the graph of the power grid). The reduced combinatorial problem is known as the network inhibition problem, for which we present a mixed integer linear programming formulation. Our experiments on benchmark power grids show that the reduced combinatorial model provides an accurate approximation, to enable vulnerability analyses of real-sized problems with more than 10,000 power lines.

  4. Analysis of the seismicity in the region of Mirovo salt mine after 8 years monitoring

    NASA Astrophysics Data System (ADS)

    Dimitrova, Liliya; Solakov, Dimcho; Simeonova, Stela; Aleksandrova, Irena; Georgieva, Gergana

    2015-04-01

    The Mirovo salt deposit is situated in the NE part of Bulgaria, 5 kilometers away from the town of Provadiya. The mine has been in operation since 1956. The salt is produced by dilution and extraction of the brine to the surface. A system of chambers and pillars has formed within the salt body as a result of the applied technology. The mine is situated in a seismically quiet part of the country. The region is characterized by a complex geological structure and several faults. During the last three decades a large number of small and moderate earthquakes (M<4.5) have occurred in the close vicinity of the salt deposit. A local seismological network (LSN) is deployed in the region to monitor the local seismicity. It consists of six three-component digital stations. Real-time data transfer from the LSN stations to the National Data Center (in Sofia) is implemented using the VPN and MAN networks of the Bulgarian Telecommunication Company. Common processing and interpretation of the data from the LSN and the national seismic network is performed. Real-time and interactive data processing are performed with the Seismic Network Data Processor (SNDP) software package. More than 700 earthquakes were registered by the LSN within the 30-km region around the mine during the 8 years of monitoring. We first processed the data and compiled a catalogue of the earthquakes that occurred within the studied region (30 km around the salt mine). The spatial pattern of seismicity is analyzed. A large number of the seismic events occurred within the northern and north-western part of the salt body. Several earthquakes occurred in the close vicinity of the mine. Considering that the earthquakes could be tectonic and/or induced, an attempt is made to find criteria to distinguish natural from induced seismicity. To characterize and distinguish the main processes active in the area, we also performed waveform and spectral analysis of a number of earthquakes.

  5. Effects of surface topography on ground shaking prediction: implications for seismic hazard analysis and recommendations for seismic design

    NASA Astrophysics Data System (ADS)

    Barani, Simone; Massa, Marco; Lovati, Sara; Spallarossa, Daniele

    2014-06-01

    This study examines the role of topographic effects in the prediction of earthquake ground motion. Ground motion prediction equations (GMPEs) are mathematical models that estimate the shaking level induced by an earthquake as a function of several parameters, such as magnitude, source-to-site distance, style of faulting and ground type. However, little importance is given to the effects of topography, which, as is known, may play a significant role in the level, duration and frequency content of ground motion. Ridges and crests are often lost among the large number of sites considered in the definition of a GMPE. Hence, it is presumable that current GMPEs are unable to accurately predict the shaking level at the top of a relief. The present work, which follows the article by Massa et al. on topographic effects, aims at overcoming this limitation by amending an existing GMPE with an additional term to account for the effects of surface topography at a specific site. First, experimental ground motion values and ground motions predicted by the attenuation model of Bindi et al. for five case studies are compared and contrasted in order to quantify their discrepancy and to identify anomalous behaviours of the sites investigated. Secondly, for the site of Narni (Central Italy), amplification factors derived from experimental measurements and numerical analyses are compared and contrasted, pointing out their impact on probabilistic seismic hazard analysis and design norms. In particular, with reference to the Italian building code, our results highlight the inadequacy of the national provisions concerning the definition of the seismic load at the top of ridges and crests, evidencing a significant underestimation of ground motion around the site resonance frequency.
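
    Amending a GMPE with a site term is, at its simplest, a residual analysis: the mean log-residual between crest observations and the base model's predictions becomes an additive topographic correction. The numbers below are invented for illustration and are not the Bindi et al. coefficients.

    import numpy as np

    ln_observed = np.log(np.array([0.21, 0.35, 0.12, 0.28]))   # recorded PGA (g)
    ln_predicted = np.log(np.array([0.14, 0.22, 0.09, 0.19]))  # flat-site GMPE

    # Topographic amplification term from the mean log-residual
    delta_topo = np.mean(ln_observed - ln_predicted)
    print(np.exp(delta_topo))  # ~1.5x average amplification at the crest

    # Amended prediction at this site: ln(Y) = GMPE(M, R, site) + delta_topo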

  6. Research on the spatial analysis method of seismic hazard for island

    NASA Astrophysics Data System (ADS)

    Jia, Jing; Jiang, Jitong; Zheng, Qiuhong; Gao, Huiying

    2017-05-01

    Seismic hazard analysis (SHA) is a key component of earthquake disaster prevention for island engineering: its results provide parameters for seismic design at the micro scale, and it is requisite work for an island conservation plan's earthquake and comprehensive disaster prevention planning at the macro scale, in the exploitation and construction of both inhabited and uninhabited islands. Existing seismic hazard analysis methods are compared, and their applicability and limitations for islands are analysed. A specialized spatial analysis method of seismic hazard for islands (SAMSHI) is then given to support further work on earthquake disaster prevention planning, based on spatial analysis tools in GIS and a fuzzy comprehensive evaluation model. The basic spatial database of SAMSHI includes fault data, historical earthquake records, geological data and Bouguer gravity anomaly data, which are the data sources for the 11 indices of the fuzzy comprehensive evaluation model; these indices are calculated by the spatial analysis model constructed in ArcGIS's Model Builder platform.
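
    The fuzzy comprehensive evaluation at the heart of SAMSHI composes a weight vector W with a membership matrix R, B = W ∘ R. A toy three-index, three-grade version follows (SAMSHI itself uses 11 indices; the weights and memberships here are invented).

    import numpy as np

    # Index weights: e.g. faults, historical seismicity, gravity anomaly
    W = np.array([0.5, 0.3, 0.2])
    # Membership of each index in (low, medium, high) hazard grades
    R = np.array([[0.1, 0.3, 0.6],
                  [0.2, 0.5, 0.3],
                  [0.6, 0.3, 0.1]])

    B = W @ R   # weighted-average composition operator
    print(B, "->", ["low", "medium", "high"][int(np.argmax(B))])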

  7. The Effect Analysis of Strain Rate on Power Transmission Tower-Line System under Seismic Excitation

    PubMed Central

    Wang, Wenming

    2014-01-01

    The effect of strain rate on a power transmission tower-line system under seismic excitation is studied in this paper. A three-dimensional finite element model of a transmission tower-line system is created based on a real project. Using theoretical analysis and numerical simulation, incremental dynamic analysis of the power transmission tower-line system is conducted to investigate the effect of strain rate on the nonlinear responses of the transmission tower and line. The results show that the effect of strain rate on the transmission tower generally decreases the maximum top displacements, but it would increase the maximum base shear forces, and thus it is necessary to consider the effect of strain rate in the seismic analysis of the transmission tower. The effect of strain rate could be ignored for the seismic analysis of the conductors and ground lines, but the responses of the ground lines considering the strain rate effect are larger than those of the conductors. The results could provide a reference for the seismic design of the transmission tower-line system. PMID:25105157

  8. The effect analysis of strain rate on power transmission tower-line system under seismic excitation.

    PubMed

    Tian, Li; Wang, Wenming; Qian, Hui

    2014-01-01

    The effect of strain rate on a power transmission tower-line system under seismic excitation is studied in this paper. A three-dimensional finite element model of a transmission tower-line system is created based on a real project. Using theoretical analysis and numerical simulation, incremental dynamic analysis of the power transmission tower-line system is conducted to investigate the effect of strain rate on the nonlinear responses of the transmission tower and line. The results show that the effect of strain rate on the transmission tower generally decreases the maximum top displacements, but it would increase the maximum base shear forces, and thus it is necessary to consider the effect of strain rate in the seismic analysis of the transmission tower. The effect of strain rate could be ignored for the seismic analysis of the conductors and ground lines, but the responses of the ground lines considering the strain rate effect are larger than those of the conductors. The results could provide a reference for the seismic design of the transmission tower-line system.

  9. Magma migration at the onset of the 2012-13 Tolbachik eruption revealed by Seismic Amplitude Ratio Analysis

    NASA Astrophysics Data System (ADS)

    Caudron, Corentin; Taisne, Benoit; Kugaenko, Yulia; Saltykov, Vadim

    2015-12-01

    In contrast to the 1975-76 Tolbachik eruption, the 2012-13 Tolbachik eruption was not preceded by any striking change in seismic activity. By processing the Klyuchevskoy volcano group seismic data with the Seismic Amplitude Ratio Analysis (SARA) method, we gain insights into the dynamics of magma movement prior to this important eruption. A clear seismic migration within the seismic swarm started 20 hours before the reported eruption onset (05:15 UTC, 26 November 2012). This migration proceeded in different phases and ended when eruptive tremor, corresponding to lava flows, was recorded (at ~11:00 UTC, 27 November 2012). In order to get a first-order approximation of the magma location, we compare the calculated seismic intensity ratios with theoretical ones. As expected, the observations suggest that the seismicity migrated toward the eruption location. However, we explain the pre-eruptive observed ratios by a vertical migration under the northern slope of Plosky Tolbachik volcano followed by a lateral migration toward the eruptive vents. Another migration is also captured by this technique and coincides with a seismic swarm that started 16-20 km to the south of Plosky Tolbachik at 20:31 UTC on 28 November and lasted for more than 2 days. This seismic swarm is very similar to the seismicity preceding the 1975-76 Tolbachik eruption and can be considered a possible aborted eruption.
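
    SARA exploits the fact that the ratio of seismic amplitudes at two stations depends on the source-station distances through geometrical spreading and anelastic attenuation, so tracking the ratio tracks the migrating source. A sketch with illustrative Q, frequency, velocity and geometry (the study's site corrections are omitted):

    import numpy as np

    def expected_ratio(src, sta_i, sta_j, f=5.0, Q=100.0, v=2000.0, n=1.0):
        """Theoretical amplitude ratio A_i/A_j for A ~ exp(-B*d) / d**n."""
        di = np.linalg.norm(src - sta_i)
        dj = np.linalg.norm(src - sta_j)
        B = np.pi * f / (Q * v)           # anelastic attenuation factor
        return (dj / di) ** n * np.exp(-B * (di - dj))

    sta_i = np.array([0.0, 0.0, 0.0])     # station coordinates (m)
    sta_j = np.array([8000.0, 0.0, 0.0])
    for depth in [6000.0, 3000.0, 1000.0]:    # source rising beneath station i
        src = np.array([0.0, 0.0, -depth])
        print(depth, round(expected_ratio(src, sta_i, sta_j), 2))
    # the ratio grows as the source shallows toward station i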

  10. Vulnerability of Buildings for Tbilisi City

    NASA Astrophysics Data System (ADS)

    Vepkhvadze, Sopio; Arabidze, Vakhtang; Arevadze, Nika; Mukhadze, Temur; Jangveladze, Shota

    2013-04-01

    Risk always exists where cities are built. Population growth in cities and urbanization in seismic-prone zones leads to infrastructure expansion. The goal of society is to construct earthquake-resistant infrastructure and minimize the expected losses. The work presented here was initiated by work package WP5 of the regional project EMME (Earthquake Model for the Middle East Region). The primary scientific objective of this work was to combine analysis of contemporary elements-at-risk inventories, seismicity and vulnerability to assess seismic hazard and seismic risk for the capital of Georgia - Tbilisi. The first step of this work was the creation of databases (inventories) of elements at risk (buildings, population) in a GIS system. The inventory databases are based on two approaches: one is monitoring, and the second is expert analysis of photos and aerial photos. During the monitoring it was realized that there are many variants of roof type, material and functionality. A special program was prepared in GIS that allows these attributes to be entered manually into the databases and assigned to the buildings. Depending on the choices made, the program automatically assigns a code to the building; finally, on the basis of these codes, a program will be prepared that automatically calculates the taxonomy of the building. The European building taxonomy classification proposed in Giovinazzi (2005) was used for these buildings, and the taxonomy classification was carried out. On the basis of empirical data collected for the Racha earthquake (Ms = 6.9) of 29 April 1991 and the Tbilisi earthquake (Ms = 4.5) of 25 April 2002, an intensity-based vulnerability study was completed and regional vulnerability factors were developed for these typologies.

  11. First seismic shear wave velocity profile of the lunar crust as extracted from the Apollo 17 active seismic data by wavefield gradient analysis

    NASA Astrophysics Data System (ADS)

    Sollberger, David; Schmelzbach, Cedric; Robertsson, Johan O. A.; Greenhalgh, Stewart A.; Nakamura, Yosio; Khan, Amir

    2016-04-01

    We present a new seismic velocity model of the shallow lunar crust, including, for the first time, shear wave velocity information. So far, the shear wave velocity structure of the lunar near-surface was effectively unconstrained due to the complexity of lunar seismograms. Intense scattering and low attenuation in the lunar crust lead to characteristic long-duration reverberations on the seismograms. The reverberations obscure later arriving shear waves and mode conversions, rendering them impossible to identify and analyze. Additionally, only vertical component data were recorded during the Apollo active seismic experiments, which further compromises the identification of shear waves. We applied a novel processing and analysis technique to the data of the Apollo 17 lunar seismic profiling experiment (LSPE), which involved recording seismic energy generated by several explosive packages on a small areal array of four vertical component geophones. Our approach is based on the analysis of the spatial gradients of the seismic wavefield and yields key parameters such as apparent phase velocity and rotational ground motion as a function of time (depth), which cannot be obtained through conventional seismic data analysis. These new observables significantly enhance the data for interpretation of the recorded seismic wavefield and allow, for example, for the identification of S wave arrivals based on their lower apparent phase velocities and distinct higher amount of generated rotational motion relative to compressional (P-) waves. Using our methodology, we successfully identified pure-mode and mode-converted refracted shear wave arrivals in the complex LSPE data and derived a P- and S-wave velocity model of the shallow lunar crust at the Apollo 17 landing site. The extracted elastic-parameter model supports the current understanding of the lunar near-surface structure, suggesting a thin layer of low-velocity lunar regolith overlying a heavily fractured crust of basaltic
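
    The key wavefield-gradient relation can be stated compactly: for a plane wave u(x, t) = f(t − x/c), the apparent phase velocity follows from the ratio of the time derivative to the spatial gradient, c = −u_t/u_x. A synthetic 1D illustration follows; the LSPE analysis generalises this to three dimensions and to rotational motions.

    import numpy as np

    c_true = 300.0                                 # m/s
    x = np.array([0.0, 1.0, 2.0, 3.0])             # geophone positions (m)
    t = np.linspace(0, 0.5, 2001)
    dt = t[1] - t[0]
    # Gaussian pulse propagating across the array
    u = np.array([np.exp(-((t - 0.25 - xi / c_true) / 0.02) ** 2) for xi in x])

    u_t = np.gradient(u[1], dt)                    # time derivative at geophone 2
    u_x = (u[2] - u[0]) / (x[2] - x[0])            # central spatial gradient there
    k = np.argmax(np.abs(u_x))                     # sample with the best signal
    print(round(-u_t[k] / u_x[k], 1))              # ~300 m/s apparent velocity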

  12. Analysis of the seismic performance of isolated buildings according to life-cycle cost.

    PubMed

    Dang, Yu; Han, Jian-Ping; Li, Yong-Tao

    2015-01-01

    This paper proposes an indicator of seismic performance based on life-cycle cost of a building. It is expressed as a ratio of lifetime damage loss to life-cycle cost and determines the seismic performance of isolated buildings. Major factors are considered, including uncertainty in hazard demand and structural capacity, initial costs, and expected loss during earthquakes. Thus, a high indicator value indicates poor building seismic performance. Moreover, random vibration analysis is conducted to measure structural reliability and evaluate the expected loss and life-cycle cost of isolated buildings. The expected loss of an actual, seven-story isolated hospital building is only 37% of that of a fixed-base building. Furthermore, the indicator of the structural seismic performance of the isolated building is much lower in value than that of the structural seismic performance of the fixed-base building. Therefore, isolated buildings are safer and less risky than fixed-base buildings. The indicator based on life-cycle cost assists owners and engineers in making investment decisions in consideration of structural design, construction, and expected loss. It also helps optimize the balance between building reliability and building investment.
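
    The indicator itself is a simple ratio once the expected-loss and cost terms are in hand; one plausible decomposition is sketched below, with all numbers invented.

    # Life-cycle cost here: initial cost + discounted upkeep + expected loss
    initial_cost = 10.0e6        # structure + isolation system (USD)
    maintenance = 2.0e6          # discounted lifetime upkeep (USD)
    expected_loss = 1.1e6        # discounted expected earthquake loss (USD)

    life_cycle_cost = initial_cost + maintenance + expected_loss
    indicator = expected_loss / life_cycle_cost
    print(round(indicator, 3))   # high values flag poor seismic performance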

  13. Analysis of the Seismic Performance of Isolated Buildings according to Life-Cycle Cost

    PubMed Central

    Dang, Yu; Han, Jian-ping; Li, Yong-tao

    2015-01-01

    This paper proposes an indicator of seismic performance based on life-cycle cost of a building. It is expressed as a ratio of lifetime damage loss to life-cycle cost and determines the seismic performance of isolated buildings. Major factors are considered, including uncertainty in hazard demand and structural capacity, initial costs, and expected loss during earthquakes. Thus, a high indicator value indicates poor building seismic performance. Moreover, random vibration analysis is conducted to measure structural reliability and evaluate the expected loss and life-cycle cost of isolated buildings. The expected loss of an actual, seven-story isolated hospital building is only 37% of that of a fixed-base building. Furthermore, the indicator of the structural seismic performance of the isolated building is much lower in value than that of the structural seismic performance of the fixed-base building. Therefore, isolated buildings are safer and less risky than fixed-base buildings. The indicator based on life-cycle cost assists owners and engineers in making investment decisions in consideration of structural design, construction, and expected loss. It also helps optimize the balance between building reliability and building investment. PMID:25653677

  14. Analysis of Dynamic Insertion of Control Rod of BWR under Seismic Excitation

    NASA Astrophysics Data System (ADS)

    Koide, Yuichi; Nakagawa, Masaki; Fukushi, Naoki; Ishigaki, Hirokuni; Okumura, Kazue

    The dynamic characteristics of a control rod for a boiling water reactor being inserted under seismic excitation were investigated using non-linear analytical models. The capability of managing the insertion of the control rod is one of the most important factors affecting the safety of a nuclear power plant undergoing seismic events. Predicting the behavior of a control rod being inserted during earthquakes is important when designing how the rod should be controlled during seismic events. We developed analytical models using the finite element method (FEM). The effect of the interaction force between the control rod and the fuel assemblies is considered in the non-linear analysis. This interaction force causes a resistance force to be applied to the control rod while it is being inserted. The validity of the analytical models was confirmed by comparing the analytical results with the experimental ones. The effects of input seismic motion and structural parameters on the insertion time were investigated using the analytical models. These analytical methods can be used to predict the time to insert the control rod into the core region of the reactor, and are useful for designing a control rod system that can survive seismic events.

  15. Unique problems associated with seismic analysis of partially gas-saturated unconsolidated sediments

    USGS Publications Warehouse

    Lee, M.W.; Collett, T.S.

    2009-01-01

    Gas hydrate stability conditions restrict the occurrence of gas hydrate to unconsolidated, high water-content sediments at shallow depths. Because of these host-sediment properties, seismic and well log data acquired for the detection of free gas and associated gas hydrate-bearing sediments often require nonconventional analysis. For example, a conventional method of identifying free gas using the compressional/shear-wave velocity (Vp/Vs) ratio at the logging frequency will not work unless the free-gas saturations are more than about 40%. The P-wave velocity dispersion of partially gas-saturated sediments causes a problem in interpreting well log velocities and seismic data. Using the White, J.E. [1975. Computed seismic speeds and attenuation in rocks with partial gas saturation. Geophysics 40, 224-232] model for partially gas-saturated sediments, the difference between well log and seismic velocities can be reconciled. The inclusion of P-wave velocity dispersion in interpreting well log data is, therefore, essential to identify free gas and to tie surface seismic data to synthetic seismograms.

  16. Data Analysis of Seismic Sequence in Central Italy in 2016 using CTBTO- International Monitoring System

    NASA Astrophysics Data System (ADS)

    Mumladze, Tea; Wang, Haijun; Graham, Gerhard

    2017-04-01

    The seismic network that forms the International Monitoring System (IMS) of the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) will ultimately consist of 170 seismic stations (50 primary and 120 auxiliary) in 76 countries around the world. The network is still under development, but currently more than 80% of the network is in operation. The objective of seismic monitoring is to detect and locate underground nuclear explosions. However, the data from the IMS can also be widely used for scientific and civil purposes. In this study we present the results of data analysis of the seismic sequence in 2016 in Central Italy. Several hundred earthquakes were recorded for this sequence by the seismic stations of the IMS. All events were accurately located by the analysts of the International Data Centre (IDC) of the CTBTO. We will present the epicentral and magnitude distribution, station recordings and teleseismic phases as obtained from the Reviewed Event Bulletin (REB). We will also present a comparison of the database of the IDC with the databases of the European-Mediterranean Seismological Centre (EMSC) and the U.S. Geological Survey (USGS). The present work shows that IMS data can be used for earthquake sequence analyses and can play an important role in seismological research.

  17. Singular spectrum analysis and its applications in mapping mantle seismic structure

    NASA Astrophysics Data System (ADS)

    Dokht, Ramin M. H.; Gu, Yu Jeffrey; Sacchi, Mauricio D.

    2017-03-01

    Seismic discontinuities are fundamental to the understanding of mantle composition and dynamics. Their depths and impedance contrasts are generally determined using secondary phases such as SS precursors and P-to-S converted waves. However, analysing and interpreting these weak signals often suffer from incomplete data coverage, high noise levels and interfering seismic arrivals, especially near tectonically complex regions such as subduction zones. To overcome these pitfalls, we adopt a singular spectrum analysis (SSA) method to remove random noise, reconstruct missing traces and enhance the robustness of SS precursors and P-to-S conversions from mantle seismic discontinuities. Our method takes advantage of the predictability of time series in the frequency-space domain and performs rank reduction using a singular value decomposition of the trajectory matrix. We apply SSA to synthetic record sections as well as the observations of (1) SS precursors beneath the northwestern Pacific subduction zones, and (2) P-to-S converted waves from southwestern Canada. In comparison with raw or interpolated data, the SSA enhanced seismic sections exhibit greater resolution due to the suppression of random noise (which reduces signal amplitude during standard averaging procedures) through rank reduction. SSA also enables an effective separation of the SS precursors from the postcursors of S-wave core diffractions. This method will greatly benefit future analyses of weak crustal and mantle seismic phases, especially when data coverages are less than ideal.
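
    The trajectory-matrix-plus-SVD machinery is compact enough to sketch in one dimension: embed the series in a Hankel matrix, truncate its rank, and average anti-diagonals back to a series. The paper applies this in the frequency-space domain to multichannel data; the following is only the simplest time-domain analogue.

    import numpy as np

    def ssa_denoise(x, L=40, rank=2):
        """Rank-reduced singular spectrum analysis of a 1D series."""
        N = len(x)
        K = N - L + 1
        X = np.column_stack([x[i:i + L] for i in range(K)])  # Hankel trajectory matrix
        U, s, Vt = np.linalg.svd(X, full_matrices=False)
        Xr = (U[:, :rank] * s[:rank]) @ Vt[:rank]            # rank reduction
        out = np.zeros(N)                                    # diagonal averaging
        cnt = np.zeros(N)
        for k in range(K):
            out[k:k + L] += Xr[:, k]
            cnt[k:k + L] += 1
        return out / cnt

    t = np.linspace(0, 1, 500)
    clean = np.sin(2 * np.pi * 12 * t)
    noisy = clean + 0.5 * np.random.default_rng(3).standard_normal(t.size)
    print(np.std(noisy - clean), np.std(ssa_denoise(noisy) - clean))  # noise reduced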

  18. Areal distribution of sedimentary facies determined from seismic facies analysis and models of modern depositional systems

    SciTech Connect

    Seramur, K.C.; Powell, R.D.; Carpenter, P.J.

    1988-02-01

    Seismic facies analysis was applied to 3.5-kHz single-channel analog reflection profiles of the sediment fill within Muir Inlet, Glacier Bay, southeast Alaska. Nine sedimentary facies have been interpreted from seven seismic facies identified on the profiles. The interpretations are based on reflection characteristics and structural features of the seismic facies. The following reflection characteristics and structural features are used: reflector spacing, amplitude and continuity of reflections, internal reflection configurations, attitude of reflection terminations at a facies boundary, body geometry of a facies, and the architectural associations of seismic facies within each basin. The depositional systems are reconstructed by determining the paleotopography, bedding patterns, sedimentary facies, and modes of deposition within the basin. Muir Inlet is a recently deglaciated fjord for which successive glacier terminus positions and consequent rates of glacial retreat are known. In this environment the depositional processes and sediment characteristics vary with distance from a glacier terminus, such that during a retreat a record of these variations is preserved in the aggrading sediment fill. Sedimentary facies within the basins of lower Muir Inlet are correlated with observed depositional processes near the present glacier terminus in the upper inlet. The areal distribution of sedimentary facies within the basins is interpreted using the seismic facies architecture and inferences from known sediment characteristics proximal to present glacier termini.

  19. Assessing the need for an update of a probabilistic seismic hazard analysis using a SSHAC Level 1 study and the Seismic Hazard Periodic Reevaluation Methodology

    DOE PAGES

    Payne, Suzette J.; Coppersmith, Kevin J.; Coppersmith, Ryan; ...

    2017-08-23

    A key decision for nuclear facilities is evaluating the need for an update of an existing seismic hazard analysis in light of new data and information that has become available since the time that the analysis was completed. We introduce the newly developed risk-informed Seismic Hazard Periodic Review Methodology (referred to as the SHPRM) and present how a Senior Seismic Hazard Analysis Committee (SSHAC) Level 1 probabilistic seismic hazard analysis (PSHA) was performed in an implementation of this new methodology. The SHPRM offers a defensible and documented approach that considers both the changes in seismic hazard and the engineering-based risk information of an existing nuclear facility to assess the need for an update of an existing PSHA. The SHPRM has seven evaluation criteria that are employed at specific analysis, decision, and comparison points and that are applied to the seismic design categories established for nuclear facilities in the United States. The SHPRM is implemented using a SSHAC Level 1 study performed for the Idaho National Laboratory, USA. The implementation focuses on the first six of the seven evaluation criteria of the SHPRM, which are all provided by the SSHAC Level 1 PSHA. Finally, to illustrate outcomes of the SHPRM that do not lead to the need for an update and those that do, example implementations of the SHPRM are performed for nuclear facilities that have target performance goals expressed as a mean annual frequency of unacceptable performance of 1x10^-4, 4x10^-5 and 1x10^-5.
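
    Performance goals such as 1x10^-4 per year refer to the mean annual frequency of unacceptable performance, obtained by convolving the seismic hazard curve with a fragility curve: P_f = ∫ P(fail|a)·|dH/da| da. A sketch with an invented hazard curve and fragility, purely to show the computation:

    import numpy as np
    from scipy.stats import norm

    a = np.linspace(0.01, 3.0, 600)            # ground motion level (g)
    H = 1e-3 * (a / 0.1) ** (-2.0)             # annual exceedance frequency of a
    p_fail = norm.cdf(np.log(a / 1.5) / 0.4)   # lognormal fragility curve

    dHda = np.gradient(H, a)                   # negative slope of hazard curve
    mafp = np.trapz(p_fail * (-dHda), a)       # mean annual frequency of failure
    print(f"{mafp:.1e}  (compare with a 1e-4 performance goal)")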

  20. Azimuthal anisotropy analysis using P-wave multiazimuth seismic data in Rock Springs Uplift, Wyoming, US

    NASA Astrophysics Data System (ADS)

    Skelly, Klint T.

    Coal is an important source of energy, but combustion of coal releases a significant amount of carbon dioxide (CO2) into the atmosphere. Consequently, developing efficient carbon capture and sequestration strategies to mitigate global warming is of great practical significance. Characterization of reservoirs proposed for carbon capture and sequestration is important for efficient injection of CO2 and for monitoring reservoir performance over time. The efficiency and long-term effectiveness of CO2 storage are largely governed by the presence and orientation of fractures within a reservoir and its associated seal. The presence of natural fractures, which can act as conduits for CO2 leakage, gives rise to seismic anisotropy that is related to fracture orientation and fracture density, and this relation can be studied through anisotropy analysis. Estimation of fracture orientation and fracture density is essential for long-term CO2 storage and monitoring. Well logs, cores and well tests provide information about stress fields and fractures at the well location, but away from the well one has to rely on seismic data. Seismic-derived attributes like semblance and curvature provide useful tools for qualitative analysis of fractures, but they do not provide a direct measure of fracture orientation and fracture density. Moreover, such analyses depend on the quality of the stacked seismic data. Multiazimuth seismic data, on the other hand, provide information about the variations in seismic velocity with azimuth and can thus provide a direct estimate of fracture orientation and fracture density. This research, which focuses on the Rock Springs Uplift, Wyoming, USA, used single-component (P-wave) multiazimuth seismic data and well data to create flattened angle gathers for different azimuths using prestack waveform inversion. Here, an advanced waveform technique, prestack waveform inversion, was used to obtain suitable velocities for proper offset-to-angle conversion as

  1. Genetic analysis reveals demographic fragmentation of grizzly bears yielding vulnerably small populations.

    PubMed

    Proctor, Michael F; McLellan, Bruce N; Strobeck, Curtis; Barclay, Robert M R

    2005-11-22

    Ecosystem conservation requires the presence of native carnivores, yet in North America, the distributions of many larger carnivores have contracted. Large carnivores live at low densities and require large areas to thrive at the population level. Therefore, if human-dominated landscapes fragment remaining carnivore populations, small and demographically vulnerable populations may result. Grizzly bear range contraction in the conterminous USA has left four fragmented populations, three of which remain along the Canada-USA border. A tenet of grizzly bear conservation is that the viability of these populations requires demographic linkage (i.e. inter-population movement of both sexes) to Canadian bears. Using individual-based genetic analysis, our results suggest this demographic connection has been severed across their entire range in southern Canada by a highway and associated settlements, limiting female and reducing male movement. Two resulting populations are vulnerably small (≤100 animals) and one of these is completely isolated. Our results suggest that these trans-border bear populations may be more threatened than previously thought and that conservation efforts must expand to include international connectivity management. They also demonstrate the ability of genetic analysis to detect gender-specific demographic population fragmentation in recently disturbed systems, a traditionally intractable yet increasingly important ecological measurement worldwide.

  2. Risk and Vulnerability Analysis of Satellites Due to MM/SD with PIRAT

    NASA Astrophysics Data System (ADS)

    Kempf, Scott; Schäfer, Frank; Rudolph, Martin; Welty, Nathan; Donath, Therese; Destefanis, Roberto; Grassi, Lilith; Janovsky, Rolf; Evans, Leanne; Winterboer, Arne

    2013-08-01

    Until recently, the state-of-the-art assessment of the threat posed to spacecraft by micrometeoroids and space debris was limited to the application of ballistic limit equations to the outer hull of a spacecraft. The probability of no penetration (PNP) is acceptable for assessing the risk and vulnerability of manned space missions; however, for unmanned missions, where penetrations of the spacecraft exterior do not necessarily constitute satellite or mission failure, these values are overly conservative. The software tool PIRAT (Particle Impact Risk and Vulnerability Analysis Tool) has been developed based on the Schäfer-Ryan-Lambert (SRL) triple-wall ballistic limit equation (BLE), applicable to various satellite components. As a result, it has become possible to assess the individual failure rates of satellite components. This paper demonstrates the modeling of an example satellite, the performance of a PIRAT analysis and the potential for subsequent design optimizations with respect to micrometeoroid and space debris (MM/SD) impact risk.
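
    The component-level failure rates that such a tool produces can be illustrated with a Poisson impact-risk sketch: a ballistic limit equation yields the critical particle diameter for a shielded component, a flux model gives the cumulative flux of particles above that size, and the expected number of penetrating impacts converts to a failure probability. The flux law and critical diameter below are assumed placeholders, not the SRL BLE itself.

      import math

      def component_failure_probability(d_crit_cm, area_m2, years,
                                        flux_1cm=1e-5, slope=-2.5):
          """P(at least one penetrating impact) under a Poisson assumption.

          flux_1cm -- assumed cumulative flux of particles >= 1 cm [1/(m^2*yr)]
          slope    -- assumed power-law slope of the cumulative size distribution
          """
          flux = flux_1cm * d_crit_cm ** slope    # cumulative flux above d_crit
          n_expected = flux * area_m2 * years     # expected penetrating impacts
          return 1.0 - math.exp(-n_expected)

      # Example: 0.5 m^2 unit behind a triple wall, 10-year mission
      print(component_failure_probability(d_crit_cm=0.3, area_m2=0.5, years=10))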

  3. Genetic analysis reveals demographic fragmentation of grizzly bears yielding vulnerably small populations

    PubMed Central

    Proctor, Michael F; McLellan, Bruce N; Strobeck, Curtis; Barclay, Robert M.R

    2005-01-01

    Ecosystem conservation requires the presence of native carnivores, yet in North America, the distributions of many larger carnivores have contracted. Large carnivores live at low densities and require large areas to thrive at the population level. Therefore, if human-dominated landscapes fragment remaining carnivore populations, small and demographically vulnerable populations may result. Grizzly bear range contraction in the conterminous USA has left four fragmented populations, three of which remain along the Canada–USA border. A tenet of grizzly bear conservation is that the viability of these populations requires demographic linkage (i.e. inter-population movement of both sexes) to Canadian bears. Using individual-based genetic analysis, our results suggest this demographic connection has been severed across their entire range in southern Canada by a highway and associated settlements, limiting female and reducing male movement. Two resulting populations are vulnerably small (≤100 animals) and one of these is completely isolated. Our results suggest that these trans-border bear populations may be more threatened than previously thought and that conservation efforts must expand to include international connectivity management. They also demonstrate the ability of genetic analysis to detect gender-specific demographic population fragmentation in recently disturbed systems, a traditionally intractable yet increasingly important ecological measurement worldwide. PMID:16243699

  4. Temporal pattern in Corinth rift seismicity revealed by visibility graph analysis

    NASA Astrophysics Data System (ADS)

    Hloupis, George

    2017-10-01

    The investigation of complex time series properties through graph-theoretical tools has benefited greatly from the recently developed visibility graph (VG) method, which acts as a hub between nonlinear dynamics, graph theory and time series analysis. In this work, an earthquake time series, a representative example of a complex system, was studied using the VG method. The examined time series were extracted from the Corinth rift seismic catalogue. Using a sliding-window approach, the temporal evolution of the exponent γ of the degree distribution was studied. It was found that the time period of the most significant event in the examined seismic catalogue (a seismic swarm after a major earthquake) coincides with the time period where the exponent γ reaches its minimum.
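
    A minimal sketch of the VG construction and the γ estimate described above: each sample of the series becomes a node, two samples are linked if the straight line between them passes above all intermediate samples, and γ is read from the slope of the log-log degree distribution. The cumulative-distribution estimator below is one simple choice, not necessarily the author's.

      import numpy as np

      def visibility_degrees(y):
          """Node degrees of the natural visibility graph of series y (O(n^2))."""
          n, deg = len(y), np.zeros(len(y), dtype=int)
          for a in range(n):
              for b in range(a + 1, n):
                  c = np.arange(a + 1, b)
                  # visibility: y_c < y_b + (y_a - y_b) * (b - c) / (b - a)
                  if np.all(y[c] < y[b] + (y[a] - y[b]) * (b - c) / (b - a)):
                      deg[a] += 1
                      deg[b] += 1
          return deg

      def gamma_exponent(deg):
          """Estimate gamma from the log-log CCDF slope: P(k) ~ k**-gamma."""
          k = np.arange(1, deg.max() + 1)
          ccdf = np.array([(deg >= kk).mean() for kk in k])
          m = ccdf > 0
          slope, _ = np.polyfit(np.log(k[m]), np.log(ccdf[m]), 1)
          return 1.0 - slope                  # CCDF slope is -(gamma - 1)

      # sliding-window use:
      # gammas = [gamma_exponent(visibility_degrees(w)) for w in windows]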

  5. Critical Soil-Structure Interaction Analysis Considerations for Seismic Qualification of Safety Equipment

    SciTech Connect

    Hossain, Q A

    2004-03-04

    While developing seismic analysis models for buildings that support safety-related equipment, a number of issues should be considered to ensure that the input motions for performing seismic qualification of safety-related equipment are properly defined. These considerations are listed and discussed here with special attention to the effect and importance of the interaction among the foundation soil, the building structure, the equipment anchors, and the equipment structure. Typical industry practices are critically examined to assess their adequacy for determining the input motions for equipment seismic qualification. The features that are considered essential in a soil-structure interaction (SSI) model are described. Also, the effects of inappropriate treatment or representation of these features are discussed.

  6. Improved moving window cross-spectral analysis for resolving large temporal seismic velocity changes in permafrost

    NASA Astrophysics Data System (ADS)

    James, S. R.; Knox, H. A.; Abbott, R. E.; Screaton, E. J.

    2017-05-01

    Cross correlations of seismic noise can potentially record large changes in subsurface velocity due to permafrost dynamics and be valuable for long-term Arctic monitoring. We applied seismic interferometry, using moving window cross-spectral analysis (MWCS), to 2 years of ambient noise data recorded in central Alaska to investigate whether seismic noise could be used to quantify relative velocity changes due to seasonal active-layer dynamics. The large velocity changes (>75%) between frozen and thawed soil caused prevalent cycle-skipping which made the method unusable in this setting. We developed an improved MWCS procedure which uses a moving reference to measure daily velocity variations that are then accumulated to recover the full seasonal change. This approach reduced cycle-skipping and recovered a seasonal trend that corresponded well with the timing of active-layer freeze and thaw. This improvement opens the possibility of measuring large velocity changes by using MWCS and permafrost monitoring by using ambient noise.
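
    The moving-reference idea can be captured in a few lines: instead of measuring δt/t between each daily correlation function and one fixed reference (which cycle-skips once the cumulative change is large), measure the small day-to-day change and sum the increments. In this sketch, mwcs_dvv stands in for a standard MWCS measurement between two traces (as implemented, e.g., in MSNoise-style codes) and is an assumed helper, not shown.

      import numpy as np

      def cumulative_dvv(daily_ccfs, mwcs_dvv):
          """daily_ccfs: (n_days, n_lags) array of daily noise cross-correlations.
          mwcs_dvv(ref, cur) -> relative travel-time shift dt/t (assumed helper).
          Returns the accumulated daily relative velocity change, dv/v = -dt/t.
          """
          steps = [0.0]
          for ref, cur in zip(daily_ccfs[:-1], daily_ccfs[1:]):
              steps.append(-mwcs_dvv(ref, cur))   # small, cycle-skip-free increment
          return np.cumsum(steps)

      # dvv = cumulative_dvv(ccfs, mwcs_dvv)      # recovers the full seasonal swing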

  7. An analysis of seismic risk from a tourism point of view.

    PubMed

    Mäntyniemi, Päivi

    2012-07-01

    Global awareness of natural calamities increased after the destructive Indian Ocean tsunami of December 2004, largely because many foreigners lost their lives, especially in Thailand. This paper explores how best to communicate the seismic risk posed by different travel destinations to crisis management personnel in tourists' home countries. The analysis of seismic risk should be straightforward enough for non-specialists, yet powerful enough to identify the travel destinations that are most at risk. The output for each location is a point in 3D space composed of the natural and built-up environment and local tourism. The tourism-specific factors can be tailored according to the tourists' nationality. The necessary information can be collected from various directories and statistics, much of it available over the Internet. The output helps to illustrate the overall seismic risk conditions of different travel destinations, allows for comparison across destinations, and identifies the places that are most at risk.

  8. Best Estimate Method vs Evaluation Method: a comparison of two techniques in evaluating seismic analysis and design

    SciTech Connect

    Bumpus, S.E.; Johnson, J.J.; Smith, P.D.

    1980-05-01

    The concept of how two techniques, the Best Estimate Method and the Evaluation Method, may be applied to the traditional seismic analysis and design of a nuclear power plant is introduced. Only the four links of the seismic analysis and design methodology chain (SMC) - seismic input, soil-structure interaction, major structural response, and subsystem response - are considered. The objective is to evaluate the compounding of conservatisms in the seismic analysis and design of nuclear power plants, to provide guidance for judgments in the SMC, and to concentrate the evaluation on that part of the seismic analysis and design which is familiar to the engineering community. An example applies three-dimensional excitations to a model of a nuclear power plant structure. The example demonstrates how conservatisms accrue by coupling two links in the SMC and comparing those results to the effects of one link alone. The utility of employing the Best Estimate Method vs the Evaluation Method is also demonstrated.

  9. An enhancement of NASTRAN for the seismic analysis of structures. [nuclear power plants

    NASA Technical Reports Server (NTRS)

    Burroughs, J. W.

    1980-01-01

    New modules, bulk data cards and DMAP sequence were added to NASTRAN to aid in the seismic analysis of nuclear power plant structures. These allow input consisting of acceleration time histories and result in the generation of acceleration floor response spectra. The resulting system contains numerous user convenience features, as well as being reasonably efficient.
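
    The end product described here, an acceleration floor response spectrum, can be sketched outside NASTRAN in a few lines: drive a family of damped single-degree-of-freedom oscillators with the floor acceleration time history and record each oscillator's peak absolute acceleration. The sketch below uses the standard Newmark average-acceleration integrator and is illustrative only, not the DMAP implementation.

      import numpy as np

      def floor_response_spectrum(ag, dt, freqs, xi=0.05):
          """Peak absolute acceleration of unit-mass SDOF oscillators.

          ag: floor acceleration history [m/s^2]; dt: time step [s]
          freqs: oscillator natural frequencies [Hz]; xi: damping ratio
          """
          gamma, beta = 0.5, 0.25                  # Newmark average acceleration
          spectrum = np.empty(len(freqs))
          for j, f in enumerate(freqs):
              wn = 2.0 * np.pi * f
              k, c = wn ** 2, 2.0 * xi * wn        # unit-mass stiffness, damping
              k_hat = k + gamma * c / (beta * dt) + 1.0 / (beta * dt ** 2)
              u = v = a = 0.0
              peak = 0.0
              for i in range(len(ag) - 1):
                  dp = -(ag[i + 1] - ag[i]) \
                       + (1.0 / (beta * dt) + gamma * c / beta) * v \
                       + (1.0 / (2 * beta) + dt * c * (gamma / (2 * beta) - 1)) * a
                  du = dp / k_hat
                  dv = gamma * du / (beta * dt) - gamma * v / beta \
                       + dt * (1 - gamma / (2 * beta)) * a
                  da = du / (beta * dt ** 2) - v / (beta * dt) - a / (2 * beta)
                  u, v, a = u + du, v + dv, a + da
                  peak = max(peak, abs(a + ag[i + 1]))   # absolute acceleration
              spectrum[j] = peak
          return spectrum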

  10. Children's Physical Resilience Outcomes: Meta-Analysis of Vulnerability and Protective Factors.

    PubMed

    Lavoie, Jennifer; Pereira, Liane C; Talwar, Victoria

    Resilience has generally been understood as positive coping and adaptation despite stress and adversity and as a buffer against stress. Researchers examining resilience have typically focused on children's psychological resilience because of the well-established impact of stress on children's mental health. However, although it has also been well established that high levels of stress can impact children's physical health, their physical health has received little attention in resilience research. Articles were selected for review if they (1) had a variable that was in some way a measure of physical health in response to a psychosocial stressor; (2) had participants who were children or adolescents within the age range of 4-18 years; and (3) were a peer-reviewed, empirical study. Two random-effects meta-analyses were conducted with a sample of 12,772 participants across 14 studies to determine the influence of protective and vulnerability factors on children's physical health in adverse experiences. Protective factors had a moderate effect and vulnerability factors had a small-to-moderate effect on health measures across the domains of physiology, sleep behavior, and overall health. The type of health measure moderated the effect size for vulnerability factors, but not for protective factors. These findings suggest that protective factors may be associated with an environment that encourages children to thrive, as apparent in their physical health. The results of this review and meta-analysis can be used to guide the methodological design of future studies on childhood resilience and to inform clinical practice with children and adolescents.
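
    The random-effects pooling used in meta-analyses of this kind is commonly the DerSimonian-Laird estimator: the between-study variance tau^2 is estimated from Cochran's Q, and effects are then combined with weights 1/(v_i + tau^2). A minimal sketch, with placeholder effect sizes rather than values from this review:

      import numpy as np

      def dersimonian_laird(effects, variances):
          """Random-effects pooled estimate (DerSimonian-Laird tau^2)."""
          e, v = np.asarray(effects, float), np.asarray(variances, float)
          w = 1.0 / v                                  # fixed-effect weights
          fixed = np.sum(w * e) / np.sum(w)
          q = np.sum(w * (e - fixed) ** 2)             # Cochran's Q
          c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
          tau2 = max(0.0, (q - (len(e) - 1)) / c)      # between-study variance
          w_re = 1.0 / (v + tau2)
          pooled = np.sum(w_re * e) / np.sum(w_re)
          return pooled, np.sqrt(1.0 / np.sum(w_re)), tau2

      # placeholder per-study effects and variances
      print(dersimonian_laird([0.42, 0.18, 0.35], [0.02, 0.01, 0.04]))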

  11. Analysis of vulnerability factors that control nitrate occurrence in natural springs (Osona Region, NE Spain).

    PubMed

    Menció, Anna; Boy, Mercè; Mas-Pla, Josep

    2011-07-15

    Nitrate pollution is one of the main concerns of groundwater management in most of the world's agricultural areas. In the Osona region of NE Spain, high concentrations of nitrates have been reported in wells. This study uses the occurrence of this pollutant in natural springs as an indicator of the sub-surface dynamics of the water cycle and shows how groundwater quality is affected by crop fertilization, as an approach to determine the aquifer vulnerability. Nitrate concentration and other hydrochemical parameters based on a biannual database are reported for approximately 80 springs for the period 2004-2009. The background concentration of nitrate is first determined to distinguish polluted areas from natural nitrate occurrence. A statistical treatment using logistic regression and ANOVA is then performed to identify the significance of the effect of vulnerability factors such as the geological setting of the springs, land use in recharge areas, sampling periods, and chemical parameters like pH and EC, on groundwater nitrate pollution. The results of the analysis identify a threshold value of 7-8 mg NO₃⁻/L for nitrate pollution in this area. Logistic regression and ANOVA results show that an increase in EC or a decrease in pH values is linked to the possibility of higher nitrate concentrations in springs. These analyses also show that nitrate pollution is more dependent on land use than the geological setting of springs or sampling periods. Indeed, the specific geological and soil features of the uppermost layers in their recharge areas do not contribute to the buffering of nitrate impacts on aquifers as measured in natural springs. Land use, and particularly fertilization practices, are major factors in groundwater vulnerability.
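
    The logistic-regression screening described above can be sketched with statsmodels: model the odds that a spring exceeds the pollution threshold (~7-8 mg NO₃⁻/L) as a function of EC, pH and land use. The file and column names below are hypothetical placeholders, not the study's data.

      import pandas as pd
      import statsmodels.formula.api as smf

      df = pd.read_csv("springs.csv")                       # one row per spring
      df["polluted"] = (df["no3_mg_l"] > 8.0).astype(int)   # background threshold

      # if the pattern reported above holds, higher EC and lower pH should
      # raise the odds of pollution; land use enters as a categorical factor
      model = smf.logit("polluted ~ ec_us_cm + ph + C(land_use)", data=df).fit()
      print(model.summary())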

  12. Shallow prospect evaluation in Shahbazpur structure using seismic attributes analysis, Southern Bangladesh.

    NASA Astrophysics Data System (ADS)

    Rahman, M.

    2015-12-01

    The Shahbazpur structure is located within the Hatia trough, a southern extension of the prolific Surma Basin, where all of the largest gas fields of Bangladesh lie. A method is established to delineate the structural mapping precisely by interpreting four 2D seismic lines that were acquired over the Shahbazpur structure. Moreover, direct hydrocarbon indicator (DHI) related attributes were analyzed for further confirmation of the presence of hydrocarbons. To do this, synthetic generation, seismic well tie, velocity modelling and depth conversion were performed. The seismic attribute analysis used in this study is mostly related to bright spot identification in reservoir zones, as well as to identifying similar responses both below and above the reservoir zones. Seismic interpretation shows that the Shahbazpur structure is roughly an oval-shaped anticline with a simple four-way dip closure, which is a good trap for hydrocarbon accumulation. A limited number of seismic attribute functions available in an academic version of the Petrel software were applied to analyze attributes. Taking possible interpretation pitfalls into consideration, attribute analysis confirmed that bright spots exist in the shallower part of the structure above the present reservoir zones, which might represent a potential shallow gas reserve. The bright spots are located within Shahbazpur sequence I of the Dupi Tila Group of Pleistocene age and Shahbazpur sequence II of the Tipam Group of Pleistocene-Pliocene age. This signature will play a very important role in planning the next well on the same structure to test the shallow accumulation of hydrocarbon. For a better understanding of this shallow reserve, it is suggested that 3D seismic data be acquired over the Shahbazpur structure, which will help to evaluate the hydrocarbon accumulation and to identify gas migration pathways.

  13. Hydra—The National Earthquake Information Center’s 24/7 seismic monitoring, analysis, catalog production, quality analysis, and special studies tool suite

    USGS Publications Warehouse

    Patton, John M.; Guy, Michelle R.; Benz, Harley M.; Buland, Raymond P.; Erickson, Brian K.; Kragness, David S.

    2016-08-18

    This report provides an overview of the capabilities and design of Hydra, the global seismic monitoring and analysis system used for earthquake response and catalog production at the U.S. Geological Survey National Earthquake Information Center (NEIC). Hydra supports the NEIC's worldwide earthquake monitoring mission in areas such as seismic event detection, seismic data insertion and storage, seismic data processing and analysis, and seismic data output. The Hydra system automatically identifies seismic phase arrival times and detects the occurrence of earthquakes in near-real time. The system integrates and inserts parametric and waveform seismic data into discrete events in a database for analysis. Hydra computes seismic event parameters, including locations, multiple magnitudes, moment tensors, and depth estimates. Hydra supports the NEIC's 24/7 analyst staff with a suite of seismic analysis graphical user interfaces. In addition to the NEIC's monitoring needs, the system supports the processing of aftershock and temporary deployment data, and supports the NEIC's quality assurance procedures. The Hydra system continues to be developed to expand its seismic analysis and monitoring capabilities.

  14. Application of Nonlinear Seismic Soil-Structure Interaction Analysis for Identification of Seismic Margins at Nuclear Power Plants

    SciTech Connect

    Varma, Amit H.; Seo, Jungil; Coleman, Justin Leigh

    2015-11-01

    Seismic probabilistic risk assessment (SPRA) methods and approaches at nuclear power plants (NPP) were first developed in the 1970s, and aspects of them have matured over time as they were applied and incrementally improved. SPRA provides information on risk and risk insights and allows for some accounting of uncertainty and variability. As a result, SPRA is now used as an important basis for risk-informed decision making for both new and operating NPPs in the US and in an increasing number of countries globally. SPRAs are intended to provide best estimates of the various combinations of structural and equipment failures that can lead to a seismically induced core damage event. However, in some instances the current SPRA approach contains large uncertainties and potentially masks other important events (for instance, it was not the seismic motions that caused the Fukushima core melt events, but the tsunami ingress into the facility). INL has an advanced SPRA research and development (R&D) activity that will identify areas in the calculation process that contain significant uncertainties. One current area of focus is the use of nonlinear soil-structure interaction (NLSSI) analysis methods to accurately capture: 1) nonlinear soil behavior and 2) gapping and sliding between the NPP and soil. The goal of this study is to compare numerical NLSSI analysis results with recorded earthquake ground motions at Fukushima Daiichi (Great Tohoku Earthquake) and evaluate the sources of nonlinearity contributing to the observed reduction in peak acceleration. Comparisons are made using recorded data in the free field (soil column with no structural influence) and recorded data on the NPP basemat (in-structure response). Results presented in this study should identify areas of focus for future R&D activities with the goal of minimizing uncertainty in SPRA calculations. This is not a validation activity, since there are too many sources of uncertainty that a numerical analysis would need

  15. Periodicity of Strong Seismicity in Italy: Schuster Spectrum Analysis Extended to the Destructive Earthquakes of 2016

    NASA Astrophysics Data System (ADS)

    Bragato, P. L.

    2017-06-01

    The strong earthquakes that occurred in Italy between 2009 and 2016 represent an abrupt acceleration of seismicity with respect to the previous 30 years. Such behavior seems to agree with the periodic rate change I observed in a previous paper. The present work improves that study by extending the data set up to the end of 2016, adopting the latest version of the historical seismic catalog of Italy, and introducing Schuster spectrum analysis for the detection of the oscillatory period and the assessment of its statistical significance. Applied to the declustered catalog of Mw ≥ 6 earthquakes that occurred between 1600 and 2016, the analysis identifies a marked periodicity of 46 years, recognized above the 95% confidence level. Monte Carlo simulation shows that the oscillatory behavior is stable with respect to random errors in magnitude estimation. A parametric oscillatory model for the annual rate of seismicity is estimated by likelihood maximization under the hypothesis of an inhomogeneous Poisson point process. According to the Akaike Information Criterion, such a model outperforms the simpler homogeneous one with constant annual rate. A further element emerges from the analysis: so far, despite recent earthquakes, Italian seismicity is still within a long-term decreasing trend established since the first half of the twentieth century.
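
    Schuster's test, the building block of the spectrum analysis used here, is compact enough to sketch directly: event times are wrapped to phases for a trial period T, and the squared resultant of the phase vectors yields a p-value; scanning T traces out the Schuster spectrum.

      import numpy as np

      def schuster_pvalue(event_times, period):
          """p-value against the null of no periodicity at the trial period."""
          phases = 2.0 * np.pi * np.asarray(event_times) / period
          r2 = np.sum(np.cos(phases)) ** 2 + np.sum(np.sin(phases)) ** 2
          return np.exp(-r2 / len(phases))    # small p => marked periodicity

      # scan trial periods for a declustered catalogue of event times (years)
      periods = np.linspace(10.0, 200.0, 400)
      # pvals = [schuster_pvalue(times, T) for T in periods]  # dip near T ~ 46 yr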

  16. Romanian Educational Seismic Network Project

    NASA Astrophysics Data System (ADS)

    Tataru, Dragos; Ionescu, Constantin; Zaharia, Bogdan; Grecu, Bogdan; Tibu, Speranta; Popa, Mihaela; Borleanu, Felix; Toma, Dragos; Brisan, Nicoleta; Georgescu, Emil-Sever; Dobre, Daniela; Dragomir, Claudiu-Sorin

    2013-04-01

    Romania is one of the most active seismic countries in Europe, with more than 500 earthquakes occurring every year. The seismic hazard of Romania is relatively high and thus understanding the earthquake phenomena and their effects at the earth surface represents an important step toward the education of population in earthquake affected regions of the country and aims to raise the awareness about the earthquake risk and possible mitigation actions. In this direction, the first national educational project in the field of seismology has recently started in Romania: the ROmanian EDUcational SEISmic NETwork (ROEDUSEIS-NET) project. It involves four partners: the National Institute for Earth Physics as coordinator, the National Institute for Research and Development in Construction, Urban Planning and Sustainable Spatial Development " URBAN - INCERC" Bucharest, the Babeş-Bolyai University (Faculty of Environmental Sciences and Engineering) and the software firm "BETA Software". The project has many educational, scientific and social goals. The main educational objectives are: training students and teachers in the analysis and interpretation of seismological data, preparing of several comprehensive educational materials, designing and testing didactic activities using informatics and web-oriented tools. The scientific objective is to introduce into schools the use of advanced instruments and experimental methods that are usually restricted to research laboratories, with the main product being the creation of an earthquake waveform archive. Thus a large amount of such data will be used by students and teachers for educational purposes. For the social objectives, the project represents an effective instrument for informing and creating an awareness of the seismic risk, for experimentation into the efficacy of scientific communication, and for an increase in the direct involvement of schools and the general public. A network of nine seismic stations with SEP seismometers

  17. Tracking Socioeconomic Vulnerability Using Network Analysis: Insights from an Avian Influenza Outbreak in an Ostrich Production Network

    PubMed Central

    Moore, Christine; Cumming, Graeme S.; Slingsby, Jasper; Grewar, John

    2014-01-01

    Background The focus of management in many complex systems is shifting towards facilitation, adaptation, building resilience, and reducing vulnerability. Resilience management requires the development and application of general heuristics and methods for tracking changes in both resilience and vulnerability. We explored the emergence of vulnerability in the South African domestic ostrich industry, an animal production system which typically involves 3–4 movements of each bird during its lifetime. This system has experienced several disease outbreaks, and the aim of this study was to investigate whether these movements have contributed to the vulnerability of this system to large disease outbreaks. Methodology/Principal Findings The ostrich production system requires numerous movements of birds between different farm types associated with growth (i.e., hatchery to juvenile rearing farm to adult rearing farm). We used 5 years of movement records between 2005 and 2011 prior to an outbreak of Highly Pathogenic Avian Influenza (H5N2). These data were analyzed using a network analysis in which the farms were represented as nodes and the movements of birds as links. We tested the hypothesis that increasing economic efficiency in the domestic ostrich industry in South Africa made the system more vulnerable to an outbreak of Highly Pathogenic Avian Influenza (H5N2). Our results indicated that as time progressed, the network became increasingly vulnerable to pathogen outbreaks. The farms that became infected during the outbreak displayed network qualities, such as significantly higher connectivity and centrality, which predisposed them to be more vulnerable to disease outbreak. Conclusions/Significance Taken in the context of previous research, our results provide strong support for the application of network analysis to track vulnerability, while also providing useful practical implications for system monitoring and management. PMID:24498004
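
    The network construction described above is straightforward to sketch with networkx: farms become nodes, consignments of birds become directed links, and vulnerability is tracked through connectivity and centrality metrics computed per time slice. Field and file names are hypothetical placeholders.

      import networkx as nx
      import pandas as pd

      moves = pd.read_csv("movements.csv")    # columns: origin, destination, year
      G = nx.from_pandas_edgelist(moves, "origin", "destination",
                                  create_using=nx.DiGraph)

      connectivity = dict(G.degree())                  # overall connectedness
      centrality = nx.betweenness_centrality(G)        # brokerage positions

      # comparing these metrics year by year, and between farms that did and
      # did not become infected, mirrors the trend analysis reported above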

  18. Vulnerabilities to Rock-Slope Failure Impacts from Christchurch, NZ Case History Analysis

    NASA Astrophysics Data System (ADS)

    Grant, A.; Wartman, J.; Massey, C. I.; Olsen, M. J.; Motley, M. R.; Hanson, D.; Henderson, J.

    2015-12-01

    Rock-slope failures during the 2010/11 Canterbury (Christchurch), New Zealand Earthquake Sequence resulted in 5 fatalities and caused an estimated US$400 million of damage to buildings and infrastructure. Reducing losses from rock-slope failures requires consideration of both hazard (i.e. likelihood of occurrence) and risk (i.e. likelihood of losses given an occurrence). Risk assessment thus requires information on the vulnerability of structures to rock or boulder impacts. Here we present 32 case histories of structures impacted by boulders triggered during the 2010/11 Canterbury earthquake sequence, in the Port Hills region of Christchurch, New Zealand. The consequences of rock fall impacts on structures, taken as penetration distance into structures, are shown to follow a power-law distribution with impact energy. Detailed mapping of rock fall sources and paths from field mapping, aerial lidar digital elevation model (DEM) data, and high-resolution aerial imagery produced 32 well-constrained runout paths of boulders that impacted structures. Impact velocities used for the structural analysis were developed using lumped-mass 2-D rock fall runout models with 1-m resolution lidar elevation data. Model inputs were based on calibrated surface parameters from mapped runout paths of 198 additional boulder runouts. Terrestrial lidar scans and structure from motion (SfM) imagery generated 3-D point cloud data used to measure structural damage and impacting boulders. Combining velocity distributions from the 2-D analysis and high-precision boulder dimensions, kinetic energy distributions were calculated for all impacts. Calculated impact energy versus penetration distance for all cases suggests a power-law relationship between damage and impact energy. These case histories and the resulting fragility curve should serve as a foundation for future risk analysis of rock fall hazards by linking vulnerability data to the predicted energy distributions from the hazard analysis.
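
    The reported power-law relation between impact energy and penetration distance, d = a*E^b, is linear in log-log space and can be fit directly; the energy-penetration pairs below are illustrative placeholders, not the Port Hills case histories.

      import numpy as np

      # hypothetical case histories: impact kinetic energy [kJ], penetration [m]
      E = np.array([12.0, 40.0, 95.0, 220.0, 510.0, 1300.0])
      d = np.array([0.30, 0.70, 1.10, 2.00, 3.20, 5.50])

      b, log_a = np.polyfit(np.log(E), np.log(d), 1)   # log d = log a + b log E
      print(f"d ~= {np.exp(log_a):.3f} * E**{b:.2f}")

      # a fragility curve follows by thresholding, e.g. P(d > wall thickness | E)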

  19. Arctic indigenous youth resilience and vulnerability: comparative analysis of adolescent experiences across five circumpolar communities.

    PubMed

    Ulturgasheva, Olga; Rasmus, Stacy; Wexler, Lisa; Nystad, Kristine; Kral, Michael

    2014-10-01

    Arctic peoples today find themselves on the front line of rapid environmental change brought about by globalizing forces, shifting climates, and destabilizing physical conditions. The weather is not the only thing undergoing rapid change here. Social climates are intrinsically connected to physical climates, and changes within each have profound effects on the daily life, health, and well-being of circumpolar indigenous peoples. This paper describes a collaborative effort between university researchers and community members from five indigenous communities in the circumpolar north aimed at comparing the experiences of indigenous Arctic youth in order to come up with a shared model of indigenous youth resilience. The discussion introduces a sliding scale model that emerged from the comparative data analysis. It illustrates how a "sliding scale" of resilience captures the inherent dynamism of youth strategies for "doing well" and what forces represent positive and negative influences that slide towards either personal and communal resilience or vulnerability. The model of the sliding scale is designed to reflect the contingency and interdependence of resilience and vulnerability and their fluctuations between lowest and highest points based on timing, local situation, larger context, and meaning.

  20. Low carbon technology performance vs infrastructure vulnerability: analysis through the local and global properties space.

    PubMed

    Dawson, David A; Purnell, Phil; Roelich, Katy; Busch, Jonathan; Steinberger, Julia K

    2014-11-04

    Renewable energy technologies, necessary for low-carbon infrastructure networks, are being adopted to help reduce fossil fuel dependence and meet carbon mitigation targets. The evolution of these technologies has progressed based on the enhancement of technology-specific performance criteria, without explicitly considering the wider system (global) impacts. This paper presents a methodology for simultaneously assessing local (technology) and global (infrastructure) performance, allowing key technological interventions to be evaluated with respect to their effect on the vulnerability of wider infrastructure systems. We use exposure of low carbon infrastructure to critical material supply disruption (criticality) to demonstrate the methodology. A series of local performance changes are analyzed; and by extension of this approach, a method for assessing the combined criticality of multiple materials for one specific technology is proposed. Via a case study of wind turbines at both the material (magnets) and technology (turbine generators) levels, we demonstrate that analysis of a given intervention at different levels can lead to differing conclusions regarding the effect on vulnerability. Infrastructure design decisions should take a systemic approach; without these multilevel considerations, strategic goals aimed to help meet low-carbon targets, that is, through long-term infrastructure transitions, could be significantly jeopardized.

  1. Socio-geographical factors in vulnerability to dengue in Thai villages: a spatial regression analysis.

    PubMed

    Tipayamongkholgul, Mathuros; Lisakulruk, Sunisa

    2011-05-01

    Focusing on the socio-geographical factors that influence local vulnerability to dengue at the village level, spatial regression methods were applied to analyse, over a 5-year period, the village-specific, cumulative incidence of all reported dengue cases among 437 villages in Prachuap Khiri Khan, a semi-urban province of Thailand. The K-order nearest neighbour method was used to define the range of neighbourhoods. Analysis showed a significant neighbourhood effect (ρ = 0.405, P <0.001), which implies that villages with geographical proximity shared a similar level of vulnerability to dengue. The two independent social factors, associated with a higher incidence of dengue, were a shorter distance to the nearest urban area (β = -0.133, P <0.05) and a smaller average family size (β = -0.102, P <0.05). These results indicate that the trend of increasing dengue occurrence in rural Thailand arose in areas under stronger urban influence rather than in remote rural areas.
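
    The neighbourhood effect reported above can be screened with Moran's I under k-nearest-neighbour weights before fitting the full spatial-lag model; a self-contained numpy/scipy sketch follows (village coordinates and incidence values are placeholders).

      import numpy as np
      from scipy.spatial import cKDTree

      def morans_i(coords, y, k=5):
          """Moran's I with row-standardized k-nearest-neighbour weights."""
          n = len(y)
          _, idx = cKDTree(coords).query(coords, k=k + 1)  # [:, 0] is the point itself
          W = np.zeros((n, n))
          for i, neigh in enumerate(idx[:, 1:]):
              W[i, neigh] = 1.0 / k
          z = y - y.mean()
          return (n / W.sum()) * (z @ W @ z) / (z @ z)

      # I near +1: neighbouring villages share similar dengue incidence,
      # consistent with the significant rho reported for the spatial-lag model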

  2. Seismic Risk Perception compared with seismic Risk Factors

    NASA Astrophysics Data System (ADS)

    Crescimbene, Massimo; La Longa, Federica; Pessina, Vera; Pino, Nicola Alessandro; Peruzza, Laura

    2016-04-01

    The communication of natural hazards and their consequences is one of the most relevant ethical issues faced by scientists. In recent years, social studies have provided evidence that risk communication is strongly influenced by the risk perception of people. In order to develop effective information and risk communication strategies, the perception of risks and the factors influencing it should be known. A theory that offers an integrative approach to understanding and explaining risk perception is still missing. To explain risk perception, it is necessary to consider several perspectives, social, psychological and cultural, and their interactions. This paper presents the results of the CATI survey on seismic risk perception in Italy, conducted by INGV researchers with funding from the DPC. We built a questionnaire to assess seismic risk perception, with particular attention to comparing the perception of hazard, vulnerability and exposure with the real data on the same factors. The Seismic Risk Perception Questionnaire (SRP-Q) is designed by the semantic differential method, using opposite terms on a seven-point Likert scale. The questionnaire yields scores for five risk indicators: Hazard, Exposure, Vulnerability, People and Community, and Earthquake Phenomenon. The questionnaire was administered by telephone interview (C.A.T.I.) to a statistically representative national sample of over 4,000 people in the period January-February 2015. Results show that risk perception seems to be underestimated for all the indicators considered. In particular, scores on the seismic Vulnerability factor are extremely low compared with the housing information provided by the respondents. Other data collected by the questionnaire concern earthquake information level, sources of information, earthquake occurrence with respect to other natural hazards, participation in risk reduction activities and level of involvement. Research on risk perception aims to aid risk analysis and policy-making by

  3. Analysis of Regolith Properties Using Seismic Signals Generated by InSight's HP3 Penetrator

    NASA Astrophysics Data System (ADS)

    Kedar, Sharon; Andrade, Jose; Banerdt, Bruce; Delage, Pierre; Golombek, Matt; Grott, Matthias; Hudson, Troy; Kiely, Aaron; Knapmeyer, Martin; Knapmeyer-Endrun, Brigitte; Krause, Christian; Kawamura, Taichi; Lognonne, Philippe; Pike, Tom; Ruan, Youyi; Spohn, Tilman; Teanby, Nick; Tromp, Jeroen; Wookey, James

    2017-07-01

    InSight's Seismic Experiment for Interior Structure (SEIS) provides a unique and unprecedented opportunity to conduct the first geotechnical survey of the Martian soil by taking advantage of the repeated seismic signals that will be generated by the mole of the Heat Flow and Physical Properties Package (HP3). Knowledge of the elastic properties of the Martian regolith has implications for material strength, can constrain models of water content, and provides context to the geological processes and history that have acted on the landing site in western Elysium Planitia. Moreover, it will help to reduce travel-time errors introduced into the analysis of seismic data by poor knowledge of the shallow subsurface. The challenge faced by the InSight team is to overcome the limited temporal resolution of the sharp hammer signals, which have significantly higher frequency content than the SEIS 100 Hz sampling rate can capture. Fortunately, since the mole propagates at a rate of ~1 mm per stroke down to 5 m depth, we anticipate thousands of seismic signals, which will vary very gradually as the mole travels. Using a combination of field measurements and modeling, we simulate a seismic data set that mimics the InSight HP3-SEIS scenario and the resolution of the InSight seismometer data. We demonstrate that the direct signal, and more importantly an anticipated reflected signal from the interface between the bottom of the regolith layer and an underlying lava flow, are likely to be observed both by InSight's Very Broad Band (VBB) seismometer and Short Period (SP) seismometer. We have outlined several strategies to increase the signal temporal resolution using the multitude of hammer strokes and internal timing information to stack and interpolate multiple signals, and demonstrated that in spite of the low resolution, the key parameters (seismic velocities and regolith depth) can be retrieved with a high degree of confidence.
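
    The stacking strategy outlined above exploits the fact that the source repeats thousands of nearly identical strokes: consecutive records can be upsampled, aligned to remove trigger jitter, and averaged so the signal-to-noise ratio grows roughly as the square root of the number of strokes. A simplified sketch (FFT interpolation and integer-lag alignment only, not the team's processing chain):

      import numpy as np

      def align_and_stack(traces, upsample=10):
          """traces: (n_strokes, n_samples) array of consecutive hammer records."""
          n, m = traces.shape
          # band-limited interpolation by zero-padding the spectrum
          spec = np.fft.rfft(traces, axis=1)
          fine = np.fft.irfft(spec, n=m * upsample, axis=1) * upsample
          ref, stacked = fine[0], np.zeros(m * upsample)
          for tr in fine:
              xc = np.correlate(tr, ref, mode="full")
              lag = int(np.argmax(xc)) - (len(ref) - 1)
              stacked += np.roll(tr, -lag)    # align to the reference, then stack
          return stacked / n                  # noise drops roughly as 1/sqrt(n)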

  4. Stress and structure analysis of the Seismic Gap between the Wenchuan and Lushan Earthquakes

    NASA Astrophysics Data System (ADS)

    Liang, Chuntao

    2017-04-01

    An array of 20 short-period and 15 broadband seismometers was deployed to monitor the seismic gap between the 2008 Ms8.0 Wenchuan earthquake and the 2013 Ms7.0 Lushan earthquake. The Wenchuan earthquake ruptured from its epicenter at (31.01°N, 103.42°E) largely northeastward, while the Lushan earthquake ruptured from its epicenter at (30.3°N, 103.0°E) largely southwestward. Compared to neighboring segments, the region between the two earthquakes has recorded very few aftershocks and little cataloged seismicity before and after the two big earthquakes. As one small segment of the 500-km-long Longmen Shan fault system, its absence of seismicity has prompted heated debate on whether a big earthquake is still brewing or whether steady creep controls the release of strain energy. The dense array was deployed primarily to detect events much smaller than the cataloged events and to determine whether the segment is creeping steadily. The preliminary findings include: (1) source mechanisms show that the seismic gap appears to be a transitional zone between the north and south segments; the events to the south are primarily thrust while events to the north have more or less strike-slip components, which is also the case for the Lushan and Wenchuan earthquakes; (2) receiver function analysis shows that the Moho beneath the seismic gap is less well defined than in adjacent regions, with relatively weaker Ps conversion phases; (3) both receiver functions and ambient noise tomography show that velocities in the upper crust are relatively lower in the gap region than in surrounding regions; (4) a significant number of small earthquakes are located near the surface in the gap region. Further examination should be conducted before we can draw a sound conclusion on what mechanism controls the seismicity in this region.

  5. Seismic analysis of the large 70-meter antenna, part 1: Earthquake response spectra versus full transient analysis

    NASA Technical Reports Server (NTRS)

    Kiedron, K.; Chian, C. T.

    1985-01-01

    As a check on structure safety aspects, two approaches in seismic analysis for the large 70-m antennas are presented. The first approach, commonly used by civil engineers, utilizes known recommended design response spectra. The second approach, which is the full transient analysis, is versatile and applicable not only to earthquake loading but also to other dynamic forcing functions. The results obtained at the fundamental structural frequency show that the two approaches are in good agreement with each other and both approaches show a safe design. The results also confirm past 64-m antenna seismic studies done by the Caltech Seismology Staff.

  6. Hanford Double Shell Tank (DST) Thermal & Seismic Project seismic analysis in support of increased liquid level in 241-AP tank farms

    SciTech Connect

    Mackey, T.C.; Abbott, F.G.; Carpenter, B.G.; Rinker, M.W.

    2007-02-16

    The overall scope of the project is to complete an up-to-date comprehensive analysis of record of the DST System at Hanford. The "Double-Shell Tank (DST) Integrity Project - DST Thermal and Seismic Project" is in support of Tri-Party Agreement Milestone M-48-14.

  7. An analysis of seismic hazard in the Upper Rhine Graben enlightened by the example of the New Madrid seismic zone.

    NASA Astrophysics Data System (ADS)

    Doubre, Cécile; Masson, Frédéric; Mazzotti, Stéphane; Meghraoui, Mustapha

    2014-05-01

    Seismic hazard in "stable" continental regions and low-level deformation zones is one of the most difficult issues to address in Earth sciences. In these zones, instrumental and historical seismicity are not well known (sparse seismic networks, a seismic cycle too long to be covered by human history, episodic seismic activity) and many active structures remain poorly characterized or unknown. This is the case of the Upper Rhine Graben, the central segment of the European Cenozoic rift system (ECRIS) of Oligocene age, which extends from the North Sea through Germany and France to the Mediterranean coast over a distance of some 1100 km. Even if this region has already experienced some destructive earthquakes, its present-day seismicity is moderate and the deformation observed by geodesy is very small (below the current measurement accuracy). The strain rate does not exceed 10⁻¹⁰ and paleoseismic studies indicate an average return period of 2.5 to 3 × 10³ years for large earthquakes. The largest earthquake known for this zone is the 1356 Basel earthquake, with a magnitude generally estimated at about 6.5 (Meghraoui et al., 2001) but recently re-evaluated at between 6.7 and 7.1 (Fäh et al., 2009). A comparison of the Upper Rhine Graben with equivalent regions around the world could help improve our evaluation of the seismic hazard of this region. This is the case of the New Madrid seismic zone, one of the best studied intraplate systems in the central USA, which experienced an M 7.0-7.5 earthquake in 1811-1812 and shares several characteristics with the Upper Rhine Graben, i.e. the general framework of inherited geological structures (reactivation of a failed rift/graben), seismicity patterns (spatial variability of small and large earthquakes), a null or low rate of deformation, and a location in a "stable" continental interior. Looking at the Upper Rhine Graben as an analogue of the New Madrid seismic zone, we can re-evaluate its seismic hazard and consider the

  8. The whole modeling and structural seismic analysis of frame bridge in sluice based on Midas Civil

    NASA Astrophysics Data System (ADS)

    Wang, Tao; Meng, Xiaoyi

    2017-04-01

    The frame bridge is an important part of a sluice; its structural safety has a direct influence on the control and operation of the sluice. As demands on modern seismic design have increased, traditional static analysis can no longer satisfy the required precision. On the basis of the above considerations, a new structural analysis method for the frame bridge is proposed based on Midas Civil. A whole model is established according to the typical structure of the frame bridge, and boundary constraints and loading conditions are set reasonably. Based on the results of the static analysis, a dynamic analysis is performed using the standard ground motion parameters in the software library. The internal forces under static and seismic conditions are obtained, providing a scientific basis for the physical design of the sluice.

  9. Frequency Dependent Polarization Analysis of Ambient Seismic Noise Recorded at Broadband Seismometers

    NASA Astrophysics Data System (ADS)

    Koper, K.; Hawley, V.

    2010-12-01

    Analysis of ambient seismic noise is becoming increasingly relevant to modern seismology. Advances in computational speed and storage have made it feasible to analyze years and even decades of continuous seismic data in short amounts of time. Therefore, it is now possible to perform longitudinal studies of station performance in order to identify degradation or mis-installation of seismic equipment. Long-term noise analysis also provides insight into the evolution of the ocean wave climate, specifically whether the frequency and intensity of storms have changed as global temperatures have changed. Here we present a new approach to polarization analysis of seismic noise recorded by three-component seismometers. Essentially, eigen-decomposition of the 3-by-3 Hermitian spectral matrix associated with a sliding window of data is applied to yield various polarization attributes as a function of time and frequency. This in turn yields fundamental information about the composition of seismic noise, such as the extent to which it is polarized, its mode of propagation, and the direction from which it arrives at the seismometer. The polarization attributes can be viewed as a function of time or binned over 2D frequency-time space to deduce regularities in the ambient noise that are unbiased by transient signals from earthquakes and explosions. We applied the algorithm to continuous data recorded in 2009 by the seismic station SLM, located in central North America. A rich variety of noise sources was observed. At low frequencies (<0.05 Hz) we observed a tilt-related signal that showed some elliptical motion in the horizontal plane. In the microseism band of 0.05-0.25 Hz, we observed Rayleigh energy arriving from the northeast, but with three distinct peaks instead of the classic single and double frequency peaks. At intermediate frequencies of 0.5-2.0 Hz, the noise was dominated by non-fundamental-mode Rayleigh energy, most likely P and Lg waves. At the highest frequencies (>3
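
    The core computation described above fits in a short function: form the 3-by-3 Hermitian spectral matrix of a three-component window over a frequency band, eigen-decompose it, and read polarization attributes from the eigenvalues and the principal eigenvector. The degree-of-polarization measure and the real-part azimuth below are one common simplification, not necessarily the authors' exact estimator.

      import numpy as np

      def polarization_attributes(window, fs, band):
          """window: (3, n) array of Z, N, E samples; band: (fmin, fmax) in Hz."""
          n = window.shape[1]
          spec = np.fft.rfft(window * np.hanning(n), axis=1)
          f = np.fft.rfftfreq(n, d=1.0 / fs)
          X = spec[:, (f >= band[0]) & (f <= band[1])]
          S = X @ X.conj().T / X.shape[1]        # 3x3 Hermitian spectral matrix
          lam, vec = np.linalg.eigh(S)           # real eigenvalues, ascending
          degree = 1.0 - (lam[0] + lam[1]) / (2.0 * lam[2])  # 1 = fully polarized
          z, north, east = vec[:, -1]            # principal particle-motion vector
          azimuth = np.degrees(np.arctan2(east.real, north.real))  # from N over E
          return float(degree), float(azimuth)

      # binning (degree, azimuth) over sliding windows and frequency bands
      # yields the time-frequency polarization maps described above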

  10. The R-package 'eseis' - towards a toolbox for comprehensive seismic data analysis

    NASA Astrophysics Data System (ADS)

    Dietze, Michael

    2015-04-01

    There are plenty of software solutions to process seismic data. However, most of these are either not free and open-source, are focused on specialised tasks, lack appropriate documentation/examples, or are limited to command-line processing. R is the most widely used and still fastest growing scientific software worldwide. This free and open-source software allows the contribution of user-built function packages (currently 6091) that cover nearly all scientific research fields. However, support for seismic data is limited. This contribution presents the R-package 'eseis', a collection of functions to handle seismic data, mostly for but not limited to "environmental seismology", i.e. the analysis of seismic signals emitted by Earth surface processes such as landslides, rockfalls or debris flows. The package allows import/export/conversion of different data formats (cube, mseed, sac), signal processing (deconvolution, filtering, clipping/merging, power spectral density estimates), event handling (triggering, locating) and data visualisation (2D-plots, images, animations). The main advantages of using this package are the embedding of processed data in a huge framework of other scientific analysis approaches, the presence of sound documentation and tested examples, the benefit of a worldwide help and discussion network, and the possibility for the user to modify all functions and extend the functionality.

  11. SSHAC Level 1 Probabilistic Seismic Hazard Analysis for the Idaho National Laboratory

    SciTech Connect

    Payne, Suzette Jackson; Coppersmith, Ryan; Coppersmith, Kevin; Rodriguez-Marek, Adrian; Falero, Valentina Montaldo; Youngs, Robert

    2016-09-01

    A Probabilistic Seismic Hazard Analysis (PSHA) was completed for the Materials and Fuels Complex (MFC), Advanced Test Reactor (ATR), and Naval Reactors Facility (NRF) at the Idaho National Laboratory (INL). The PSHA followed the approaches and procedures for a Senior Seismic Hazard Analysis Committee (SSHAC) Level 1 study and included a Participatory Peer Review Panel (PPRP) to provide a confident technical basis and mean-centered estimates of the ground motions. A new risk-informed methodology for evaluating the need for an update of an existing PSHA was developed as part of the Seismic Risk Assessment (SRA) project. To develop and implement the new methodology, the SRA project elected to perform two SSHAC Level 1 PSHAs. The first was for the Fuel Manufacturing Facility (FMF), which is classified as a Seismic Design Category (SDC) 3 nuclear facility. The second was for the ATR Complex, which has facilities classified as SDC-4. The new methodology requires defensible estimates of ground motion levels (mean and full distribution of uncertainty) for its criteria and evaluation process. The INL SSHAC Level 1 PSHA demonstrates the use of the PPRP, evaluation and integration through utilization of a small team with multiple roles and responsibilities (four team members and one specialty contractor), and the feasibility of a short-duration schedule (10 months). Additionally, a SSHAC Level 1 PSHA was conducted for NRF to provide guidance on the potential use of a design margin above rock hazard levels for the Spent Fuel Handling Recapitalization Project (SFHP) process facility.

  12. Seismic slope-performance analysis: from hazard map to decision support system

    USGS Publications Warehouse

    Miles, Scott B.; Keefer, David K.; Ho, Carlton L.

    1999-01-01

    In response to the growing recognition by engineers and decision-makers of the regional effects of earthquake-induced landslides, this paper presents a general approach to conducting seismic landslide zonation, based on Newmark's popular sliding-block analogy for modeling coherent landslides. Four existing models based on the sliding-block analogy are compared. The comparison shows that the models forecast notably different levels of slope performance. Considering this discrepancy, along with the limitations of static maps as a decision tool, a spatial decision support system (SDSS) for seismic landslide analysis is proposed, which will support investigations over multiple scales for any number of earthquake scenarios and input conditions. Most importantly, the SDSS will allow the use of any seismic landslide analysis model and zonation approach. Developments associated with the SDSS will produce an object-oriented model for encapsulating spatial data, an object-oriented specification to allow construction of models from modular objects, and a direct-manipulation, dynamic user interface that adapts to the particular seismic landslide model configuration.
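
    Newmark's sliding-block analogy, the common core of the four models compared above, reduces to a short integration: whenever ground acceleration exceeds the block's yield acceleration the block slides, its relative velocity is integrated until it re-sticks, and the displacement accumulates. A one-directional (downslope-only) sketch:

      import numpy as np

      def newmark_displacement(ag, dt, a_yield):
          """ag: ground acceleration [m/s^2]; a_yield: yield acceleration [m/s^2]."""
          v = d = 0.0                     # sliding velocity, permanent displacement
          for a in ag:
              if v > 0.0 or a > a_yield:  # block is sliding (or starts to)
                  v = max(0.0, v + (a - a_yield) * dt)  # re-sticks when v hits 0
              d += v * dt
          return d                        # permanent downslope displacement [m]

      # a zonation map follows by evaluating this over a grid of yield
      # accelerations derived from slope and strength data, per scenario record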

  13. First results of an ambient seismic noise analysis in western Corinth Gulf (Greece)

    NASA Astrophysics Data System (ADS)

    Giannopoulos, Dimitrios; Paraskevopoulos, Paraskevas; Sokos, Efthimios; Tselentis, G.-Akis

    2015-04-01

    We present the preliminary results of an ambient seismic noise analysis performed in the western Corinth Gulf, Greece. The Corinth Gulf is a continental rift which separates the central Greek mainland from the Peloponnese. The rift is approximately 120 km long and 10-20 km wide, with a WNW-ESE orientation, extending from the Gulf of Patras in the west to the Gulf of Alkionides in the east. It is considered one of the most active extensional intra-continental rifts in the world, with geodetically measured rates of extension varying from ~5 mm/yr at the eastern part to ~15 mm/yr at the western part. We used data from three-component broadband seismic stations operated under the framework of the Hellenic Unified Seismological Network (HUSN) and the Corinth Rift Laboratory (CRL). After classical processing of the continuous ambient seismic noise recordings, we used auto-correlation and cross-correlation functions of single stations and station pairs, respectively, in order to retrieve empirical Green's functions (EGFs) of surface waves and estimate relative velocity changes. For estimating the relative velocity changes we used the moving-window cross-spectral analysis (MWCS) technique. This is the first attempt to characterize the ambient seismic noise properties in the area and to study the possible relation between the detected relative velocity changes and the occurrence of moderate or strong earthquakes in the study area.

  14. Site specific seismic hazard analysis at the DOE Kansas City Plant

    SciTech Connect

    Lynch, D.T.; Drury, M.A.; Meis, R.C.; Bieniawski, A.; Savy, J.B.; Llopis, J.L.; Constantino, C.; Hashimoto, P.S.; Campbell, K.W.

    1995-10-01

    A site-specific seismic hazard analysis is being conducted for the Kansas City Plant to support an ongoing structural evaluation of existing buildings. This project is part of the overall review of facilities being conducted by DOE. The seismic hazard was probabilistically defined at the theoretical rock outcrop by Lawrence Livermore National Laboratory. The US Army Engineer Waterways Experiment Station conducted a subsurface site investigation to characterize in situ S-wave velocities and other subsurface physical properties related to the geology in the vicinity of the Main Manufacturing Building (MMB) at the Bannister Federal Complex. The test program consisted of crosshole S-wave testing, seismic cone penetrometer testing, and laboratory soil analyses. The information acquired from this investigation was used in a site response analysis by City College of New York to determine the earthquake motion at grade. Ground response spectra appropriate for the design and evaluation of Performance Category 1 and 2 structures, systems, and components were recommended. The effects of seismic loadings on the buildings will be used to aid in designing any structural modifications.

  16. Seismic Background Noise Analysis of Brtr (PS-43) Array

    NASA Astrophysics Data System (ADS)

    Bakir, M. E.; Meral Ozel, N.; Semin, K. U.

    2014-12-01

    The seismic background noise variation of the BRTR array, composed of two sub-arrays located in Ankara and in Kırıkkale-Keskin, has been investigated by calculating Power Spectral Densities and Probability Density Functions for seasonal and diurnal noise variations between 2005 and 2011. PSDs were computed within the period/frequency range of 100 s to 10 Hz. The results show little change in noise conditions with time and location. In particular, diurnal noise level changes were observed at 3-5 Hz at the Keskin array, with a 5-7 dB difference between day and night in the cultural noise band (1-10 Hz). The noise levels of the medium-period array are high in the 1-2 Hz range. Higher noise levels were observed during working hours than at night in the cultural noise band. The seasonal background noise variations at the two sites are also very similar. Since both arrays consist of borehole instruments located away from the coast, only a small change in noise levels caused by microseisms was observed. Comparison of the Keskin short-period array with the Ankara medium-period array shows that the Keskin array is quieter.
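
    For reference, the PSD/PDF procedure behind such noise surveys can be sketched as: estimate a PSD for each time segment, then histogram the ensemble of PSDs per frequency bin. The sketch below assumes a ground-motion record longer than one segment; segment length, overlap and binning are illustrative choices, not those of the study.

```python
import numpy as np
from scipy.signal import welch

def noise_pdf(trace, fs, seg_s=3600, db_bins=None):
    """Histogram segment-wise PSDs (in dB) into a probability density per
    frequency bin, in the spirit of PSD/PDF station noise surveys."""
    if db_bins is None:
        db_bins = np.arange(-200.0, -50.0, 1.0)
    n = int(seg_s * fs)
    psds = []
    for i in range(0, len(trace) - n + 1, n):
        f, pxx = welch(trace[i:i + n], fs=fs, nperseg=4096, detrend="linear")
        psds.append(10.0 * np.log10(pxx[1:]))   # dB, drop the DC bin
    assert psds, "record shorter than one segment"
    psds = np.array(psds)
    f = f[1:]
    # probability density of observed dB levels at each frequency
    pdf = np.stack([np.histogram(psds[:, j], bins=db_bins, density=True)[0]
                    for j in range(psds.shape[1])], axis=1)
    return f, db_bins, pdf
```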

  17. Seismic Background Noise Analysis of BRTR (PS-43) Array

    NASA Astrophysics Data System (ADS)

    Ezgi Bakir, Mahmure; Meral Ozel, Nurcan; Umut Semin, Korhan

    2015-04-01

    The seismic background noise variation of the BRTR array, composed of two sub-arrays located in Ankara and in Ankara-Keskin, has been investigated by calculating Power Spectral Densities and Probability Density Functions for seasonal and diurnal noise variations between 2005 and 2011. PSDs were computed within the period/frequency range of 100 s to 10 Hz. The results show little change in noise conditions with time and location. In particular, diurnal noise level changes were observed at 3-5 Hz at the Keskin array, with a 5-7 dB difference between day and night in the cultural noise band (1-10 Hz). The noise levels of the medium-period array are higher in the 1-2 Hz band than those of the short-period array. Higher noise levels were observed during working hours than at night in the cultural noise band. The seasonal background noise variations at the two sites are also very similar. Since these stations are borehole installations located away from the coast, only a small change in noise levels caused by microseisms was observed. Comparison of the Keskin short-period array with the Ankara medium-period array shows that the Keskin array is quieter.

  18. Large-scale seismic signal analysis with Hadoop

    DOE PAGES

    Addair, T. G.; Dodge, D. A.; Walter, W. R.; ...

    2014-02-11

    In seismology, waveform cross correlation has been used for years to produce high-precision hypocenter locations and for sensitive detectors. Because correlated seismograms generally are found only at small hypocenter separation distances, correlation detectors have historically been reserved for spotlight purposes. However, many regions have been found to produce large numbers of correlated seismograms, and there is growing interest in building next-generation pipelines that employ correlation as a core part of their operation. In an effort to better understand the distribution and behavior of correlated seismic events, we have cross correlated a global dataset consisting of over 300 million seismograms. This was done using a conventional distributed cluster, and required 42 days. In anticipation of processing much larger datasets, we have re-architected the system to run as a series of MapReduce jobs on a Hadoop cluster. In doing so we achieved a factor of 19 performance increase on a test dataset. We found that fundamental algorithmic transformations were required to achieve the maximum performance increase. Whereas in the original IO-bound implementation, we went to great lengths to minimize IO, in the Hadoop implementation where IO is cheap, we were able to greatly increase the parallelism of our algorithms by performing a tiered series of very fine-grained (highly parallelizable) transformations on the data. Each of these MapReduce jobs required reading and writing large amounts of data.

  19. A Comparison of seismic instrument noise coherence analysis techniques

    USGS Publications Warehouse

    Ringler, A.T.; Hutt, C.R.; Evans, J.R.; Sandoval, L.D.

    2011-01-01

    The self-noise of a seismic instrument is a fundamental characteristic used to evaluate the quality of the instrument. It is important to be able to measure this self-noise robustly, to understand how differences among test configurations affect the tests, and to understand how different processing techniques and isolation methods (from nonseismic sources) can contribute to differences in results. We compare two popular coherence methods used for calculating incoherent noise, which is widely used as an estimate of instrument self-noise (incoherent noise and self-noise are not strictly identical but in observatory practice are approximately equivalent; Holcomb, 1989; Sleeman et al., 2006). Beyond directly comparing these two coherence methods on similar models of seismometers, we compare how small changes in test conditions can contribute to incoherent-noise estimates. These conditions include timing errors, signal-to-noise ratio changes (ratios between background noise and instrument incoherent noise), relative sensor locations, misalignment errors, processing techniques, and different configurations of sensor types.
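
    The three-channel variant cited above (Sleeman et al., 2006) removes the common coherent signal from three co-located records. A minimal sketch, assuming equal-length, time-aligned records and standard Welch cross-spectral estimates:

```python
import numpy as np
from scipy.signal import csd

def three_sensor_self_noise(x1, x2, x3, fs, nperseg=4096):
    """Self-noise PSD of sensor 1 from three co-located records
    (three-channel coherence method of Sleeman et al., 2006):
        N11 = P11 - P13 * P21 / P23
    where Pij are (cross-)power spectral density estimates."""
    f, p11 = csd(x1, x1, fs=fs, nperseg=nperseg)
    _, p13 = csd(x1, x3, fs=fs, nperseg=nperseg)
    _, p21 = csd(x2, x1, fs=fs, nperseg=nperseg)
    _, p23 = csd(x2, x3, fs=fs, nperseg=nperseg)
    n11 = p11 - p13 * p21 / p23
    # the noise PSD should be real and positive; drop the DC bin
    return f[1:], np.real(n11[1:])
```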

  20. LAVA (Los Alamos Vulnerability and Risk Assessment Methodology): A conceptual framework for automated risk analysis

    SciTech Connect

    Smith, S.T.; Lim, J.J.; Phillips, J.R.; Tisinger, R.M.; Brown, D.C.; FitzGerald, P.D.

    1986-01-01

    At Los Alamos National Laboratory, we have developed an original methodology for performing risk analyses on subject systems characterized by a general set of asset categories, a general spectrum of threats, a definable system-specific set of safeguards protecting the assets from the threats, and a general set of outcomes resulting from threats exploiting weaknesses in the safeguards system. The Los Alamos Vulnerability and Risk Assessment Methodology (LAVA) models complex systems having large amounts of ''soft'' information about both the system itself and occurrences related to the system. Its structure lends itself well to automation on a portable computer, making it possible to analyze numerous similar but geographically separated installations consistently and in as much depth as the subject system warrants. LAVA is based on hierarchical systems theory, event trees, fuzzy sets, natural-language processing, decision theory, and utility theory. LAVA's framework is a hierarchical set of fuzzy event trees that relate the results of several embedded (or sub-) analyses: a vulnerability assessment providing information about the presence and efficacy of system safeguards, a threat analysis providing information about static (background) and dynamic (changing) threat components coupled with an analysis of asset ''attractiveness'' to the dynamic threat, and a consequence analysis providing information about the outcome spectrum's severity measures and impact values. By using LAVA, we have modeled our widely used computer security application as well as LAVA/CS systems for physical protection, transborder data flow, contract awards, and property management. It is presently being applied for modeling risk management in embedded systems, survivability systems, and weapons systems security. LAVA is especially effective in modeling subject systems that include a large human component.
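
    LAVA's internals are not reproduced here, but the flavor of combining fuzzy linguistic ratings can be conveyed with a toy example: triangular fuzzy numbers for safeguard ratings, a weighted aggregate, and centroid defuzzification. The rating vocabulary and weights below are invented for illustration only.

```python
# Triangular fuzzy numbers (a, b, c) for hypothetical safeguard ratings
FUZZY = {"poor": (0.0, 0.0, 0.3), "fair": (0.2, 0.5, 0.8), "good": (0.7, 1.0, 1.0)}

def aggregate(ratings, weights):
    """Weighted average of triangular fuzzy ratings, then centroid defuzzify."""
    total = sum(weights)
    a = sum(w * FUZZY[r][0] for r, w in zip(ratings, weights)) / total
    b = sum(w * FUZZY[r][1] for r, w in zip(ratings, weights)) / total
    c = sum(w * FUZZY[r][2] for r, w in zip(ratings, weights)) / total
    return (a + b + c) / 3.0   # centroid of a triangular fuzzy number

# three safeguards rated by an assessor, with invented importance weights
print(aggregate(["good", "fair", "poor"], [0.5, 0.3, 0.2]))
```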

  1. Vulnerability assessment of medieval civic towers as a tool for retrofitting design

    SciTech Connect

    Casciati, Sara; Faravelli, Lucia

    2008-07-08

    The seismic vulnerability of an ancient civic bell-tower is studied. Rather than treating it as an intermediate stage toward a risk analysis, the vulnerability assessment is here pursued for the purpose of optimizing the retrofit design. The vulnerability curves are drawn by carrying out a single time-history analysis of a model calibrated on the basis of experimental data. From the results of this analysis, the medians of three selected performance parameters are estimated and used to compute, for each parameter, the probability of attaining or exceeding the three corresponding levels of light, moderate and severe damage. The same numerical model is then used to incorporate the effects of several retrofitting solutions and to re-estimate the associated vulnerability curves. The ultimate goal is to provide a numerical tool able to drive the optimization of a retrofit design by comparing the vulnerability estimates associated with the different retrofitting solutions.
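
    Vulnerability (fragility) curves of this kind are conventionally expressed with a lognormal model. A minimal sketch, with hypothetical medians and dispersion rather than the tower's calibrated values:

```python
import numpy as np
from scipy.stats import norm

def fragility(im, median_im, beta):
    """P(reaching or exceeding a damage state | intensity measure im),
    using the standard lognormal fragility model."""
    return norm.cdf(np.log(im / median_im) / beta)

# Hypothetical medians (PGA in g) and dispersion for three damage states
pga = np.linspace(0.01, 1.0, 100)
curves = {state: fragility(pga, med, 0.6)
          for state, med in [("light", 0.10), ("moderate", 0.25),
                             ("severe", 0.50)]}
```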

  2. Large-Scale Numerical Analysis of Three-Dimensional Seismic Waves.

    DTIC Science & Technology

    1984-05-31

    400,000 nodes) from Yucca Flat, Nevada Test Site. Analysis is based on an explicit, finite element, elastic wave solver designed for fully vectorized... waves through a detailed 3-D inhomogeneous model in Yucca Flat would enable us to quantify the strength of the diffracted transverse motion... Analysis of Three-Dimensional Seismic Waves, by G. L. Wojcik and D. K. Vaughan. Prepared for: Air Force Office of Scientific Research, Bolling AFB.

  3. Seismic fragility analysis for geostructures using ANN-based response surface

    NASA Astrophysics Data System (ADS)

    Park, N. S.; Cho, S. E.

    2016-12-01

    A seismic fragility curve is an effective tool for probabilistically predicting the degree of damage to a structure under seismic load. For general structures such as bridges or concrete structures, the fragility curve is usually established by treating the seismic load as the random variable. In the case of geostructures such as cut slopes and soil levees, however, there are also uncertainties in the related geotechnical parameters, and these should be considered in the analysis. In this study, seismic fragility curves for a levee and a slope were prepared using pseudostatic analysis, considering the uncertainty in the geotechnical parameters. For the probabilistic analysis, the Monte Carlo Simulation (MCS) method was used, based on coefficients of variation (COV) reported in previous studies. With MCS, the number of simulations must be increased to achieve a given reliability when the probability of failure is low, which makes the method costly in time and computation. To overcome these shortcomings, a response surface method using an artificial neural network (ANN) was applied to improve the efficiency of preparing the fragility curves, as sketched below. To assess its applicability, the results were compared with MCS-based fragility curves. In addition, fragility curves for different levee water levels were prepared using the ANN-based response surface. The results showed that the new method yields fragility curves similar to the MCS-based ones and can be used to reduce the analysis time efficiently. Acknowledgements: This research was supported by the Korea Agency for Infrastructure Technology Advancement (KAIA) with funding from the Ministry of Land, Infrastructure and Transport of the Korean government (16SCIP-B065985-04).
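
    The workflow can be sketched as: train an ANN response surface on a modest number of full analyses, then run the cheap Monte Carlo simulation on the surrogate. The limit-state function, parameter distributions and network size below are invented placeholders, not the study's calibrated model.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Hypothetical limit-state function: pseudostatic factor of safety of a
# slope as a function of cohesion c (kPa) and friction angle phi (deg).
def fs_model(c, phi, kh=0.15):
    return (c / 20.0 + np.tan(np.radians(phi)) / 0.7) / (1.0 + 2.0 * kh)

# 1) Train an ANN response surface on a modest design of experiments
c_tr = rng.uniform(5, 35, 200)
phi_tr = rng.uniform(15, 40, 200)
X = np.column_stack([c_tr, phi_tr])
ann = MLPRegressor(hidden_layer_sizes=(16, 16), max_iter=5000,
                   random_state=0).fit(X, fs_model(c_tr, phi_tr))

# 2) Run the cheap MCS on the surrogate instead of the full analysis
c_mc = rng.normal(20, 20 * 0.25, 100_000)      # COV = 0.25 (assumed)
phi_mc = rng.normal(28, 28 * 0.10, 100_000)    # COV = 0.10 (assumed)
fs = ann.predict(np.column_stack([c_mc, phi_mc]))
print("P(failure) ~", np.mean(fs < 1.0))
```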

  4. Analysis of bathymetric surveys to identify coastal vulnerabilities at Cape Canaveral, Florida

    USGS Publications Warehouse

    Thompson, David M.; Plant, Nathaniel G.; Hansen, Mark E.

    2015-10-07

    The purpose of this work is to describe an updated bathymetric dataset collected in 2014 and compare it to previous datasets. The updated data focus on the bathymetric features and sediment transport pathways that connect the offshore regions to the shoreline and, therefore, are related to the protection of other portions of the coastal environment, such as dunes, that support infrastructure and ecosystems. Previous survey data include National Oceanic and Atmospheric Administration’s (NOAA) National Ocean Service (NOS) hydrographic survey from 1956 and a USGS survey from 2010 that is augmented with NOS surveys from 2006 and 2007. The primary result of this analysis is documentation and quantification of the nature and rates of bathymetric changes that are near (within about 2.5 km) the current Cape Canaveral shoreline and interpretation of the impact of these changes on future erosion vulnerability.
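
    The basic change computation behind such comparisons is a difference of co-registered grids. A minimal sketch, assuming both surveys are gridded to the same cells with NaN marking no-data:

```python
import numpy as np

def bathy_change(z_new, z_old, dt_years, cell_area_m2):
    """Compare two co-registered bathymetric grids (elevations in m).

    Returns the mean vertical change rate (m/yr) and the net volume
    change (m^3) over cells where both surveys have data."""
    both = ~np.isnan(z_new) & ~np.isnan(z_old)
    dz = z_new[both] - z_old[both]
    return dz.mean() / dt_years, dz.sum() * cell_area_m2

# e.g. rate, vol = bathy_change(grid_2014, grid_1956, 58.0, 50.0 * 50.0)
```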

  5. Fuzzy decision analysis for integrated environmental vulnerability assessment of the mid-Atlantic Region.

    PubMed

    Tran, Liem T; Knight, C Gregory; O'Neill, Robert V; Smith, Elizabeth R; Riitters, Kurt H; Wickham, James

    2002-06-01

    A fuzzy decision analysis method for integrating ecological indicators was developed. This was a combination of a fuzzy ranking method and the analytic hierarchy process (AHP). The method was capable of ranking ecosystems in terms of environmental conditions and suggesting cumulative impacts across a large region. Using data on land cover, population, roads, streams, air pollution, and topography of the Mid-Atlantic region, we were able to point out areas that were in relatively poor condition and/or vulnerable to future deterioration. The method offered an easy and comprehensive way to combine the strengths of fuzzy set theory and the AHP for ecological assessment. Furthermore, the suggested method can serve as a building block for the evaluation of environmental policies.
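
    The AHP half of the method derives indicator weights from a pairwise comparison matrix via its principal eigenvector, with a consistency check on the judgments. A minimal sketch with an invented 3x3 comparison matrix:

```python
import numpy as np

# Hypothetical pairwise comparisons of three indicators (Saaty 1-9 scale):
# rows/cols: land cover, population, air pollution
A = np.array([[1.0,  3.0, 5.0],
              [1/3., 1.0, 2.0],
              [1/5., 1/2., 1.0]])

# AHP priority weights = normalized principal eigenvector
vals, vecs = np.linalg.eig(A)
w = np.real(vecs[:, np.argmax(np.real(vals))])
w = w / w.sum()

# Consistency ratio guards against incoherent judgments (CR < 0.1 is OK)
lam = np.max(np.real(vals))
ci = (lam - len(A)) / (len(A) - 1)
cr = ci / 0.58            # random index RI = 0.58 for n = 3
print("weights:", w, "CR:", cr)
```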

  6. Development of adaptive seismic isolators for ultimate seismic protection of civil structures

    NASA Astrophysics Data System (ADS)

    Li, Jianchun; Li, Yancheng; Li, Weihua; Samali, Bijan

    2013-04-01

    Base isolation is the most popular seismic protection technique for civil engineering structures. However, research has revealed that the traditional base isolation system, due to its passive nature, is vulnerable to two kinds of earthquakes, i.e. near-fault and far-fault earthquakes. A great deal of effort has been dedicated to improving the performance of the traditional base isolation system for these two types of earthquakes. This paper presents a recent research breakthrough on the development of a novel adaptive seismic isolation system, in the quest for ultimate protection of civil structures, utilizing the field-dependent property of magnetorheological elastomers (MRE). A novel adaptive seismic isolator was developed as the key element of a smart seismic isolation system. The isolator contains a unique laminated structure of steel and MR elastomer layers, which enables large-scale civil engineering applications, and a solenoid to provide a sufficient and uniform magnetic field for energizing the field-dependent property of the MR elastomers. With the controllable shear modulus/damping of the MR elastomer, the adaptive seismic isolator possesses a controllable lateral stiffness while maintaining adequate vertical loading capacity. This paper presents a comprehensive review of the development of the adaptive seismic isolator, including the design, analysis and testing of two prototypical adaptive seismic isolators utilizing two different MRE materials. Experimental results show that the first prototypical MRE seismic isolator can provide a stiffness increase of up to 37.49%, while the second provides a remarkable increase in lateral stiffness of up to 1630%. Such a range of controllable stiffness makes the isolator highly practical for developing new adaptive base isolation systems utilizing either semi-active or smart passive control.
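
    The practical payoff of a controllable lateral stiffness is the ability to shift the isolation frequency away from the dominant excitation. A back-of-the-envelope sketch, with invented mass and baseline stiffness:

```python
import numpy as np

def isolation_frequency(k_base, m, stiffness_gain):
    """Lateral natural frequency (Hz) of an isolated mass when the
    isolator stiffness is raised by the given fractional gain."""
    return np.sqrt(k_base * (1.0 + stiffness_gain) / m) / (2.0 * np.pi)

# a 37.49% stiffness increase raises the frequency by sqrt(1.3749) ~ 1.17x
f0 = isolation_frequency(2.0e5, 1.0e4, 0.0)
f1 = isolation_frequency(2.0e5, 1.0e4, 0.3749)
print(f0, f1, f1 / f0)
```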

  7. Large-scale seismic signal analysis with Hadoop

    NASA Astrophysics Data System (ADS)

    Addair, T. G.; Dodge, D. A.; Walter, W. R.; Ruppert, S. D.

    2014-05-01

    In seismology, waveform cross correlation has been used for years to produce high-precision hypocenter locations and for sensitive detectors. Because correlated seismograms generally are found only at small hypocenter separation distances, correlation detectors have historically been reserved for spotlight purposes. However, many regions have been found to produce large numbers of correlated seismograms, and there is growing interest in building next-generation pipelines that employ correlation as a core part of their operation. In an effort to better understand the distribution and behavior of correlated seismic events, we have cross correlated a global dataset consisting of over 300 million seismograms. This was done using a conventional distributed cluster, and required 42 days. In anticipation of processing much larger datasets, we have re-architected the system to run as a series of MapReduce jobs on a Hadoop cluster. In doing so we achieved a factor of 19 performance increase on a test dataset. We found that fundamental algorithmic transformations were required to achieve the maximum performance increase. Whereas in the original IO-bound implementation, we went to great lengths to minimize IO, in the Hadoop implementation where IO is cheap, we were able to greatly increase the parallelism of our algorithms by performing a tiered series of very fine-grained (highly parallelizable) transformations on the data. Each of these MapReduce jobs required reading and writing large amounts of data. But, because IO is very fast, and because the fine-grained computations could be handled extremely quickly by the mappers, the net result was a large performance gain.
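
    The building block of such pipelines is the normalized waveform cross-correlation between a template event and continuous data. A minimal (deliberately naive, O(N*n)) sketch; production systems use FFT-based correlation and the distributed machinery described above:

```python
import numpy as np

def correlation_detector(template, trace, threshold=0.8):
    """Slide a waveform template along a continuous trace and flag
    sample offsets whose normalized cross-correlation (Pearson r)
    meets or exceeds the threshold."""
    n = len(template)
    t = (template - template.mean()) / (template.std() * n)
    detections = []
    for i in range(len(trace) - n):
        w = trace[i:i + n]
        s = w.std()
        if s == 0.0:
            continue                      # dead channel / constant segment
        cc = np.dot(t, (w - w.mean()) / s)
        if cc >= threshold:
            detections.append((i, cc))
    return detections
```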

  8. Seismic Hazard Analysis Using the Adaptive Kernel Density Estimation Technique for Chennai City

    NASA Astrophysics Data System (ADS)

    Ramanna, C. K.; Dodagoudar, G. R.

    2012-01-01

    The conventional method of probabilistic seismic hazard analysis (PSHA) using the Cornell-McGuire approach requires identification of homogeneous source zones as the first step. This requirement raises many issues, and hence several alternative methods of hazard estimation have been proposed in the last few years, such as zoneless (zone-free) methods and modelling of the Earth's crust using numerical finite element analysis. Delineating a homogeneous source zone in regions of distributed and/or diffuse seismicity is a rather difficult task. In this study, the zone-free method using the adaptive kernel technique is explored for hazard estimation in regions of distributed and diffuse seismicity. Chennai city lies in such a region of low to moderate seismicity and has therefore been used as a case study. The adaptive kernel technique is statistically superior to the fixed kernel technique primarily because the bandwidth of the kernel varies spatially depending on the clustering or sparseness of the epicentres. Although the fixed kernel technique has proven to work well in general density estimation, it performs poorly for multimodal and long-tailed distributions. In such situations the adaptive kernel technique serves the purpose, and it is particularly relevant in earthquake engineering because the activity rate probability density surface is multimodal in nature. The peak ground acceleration (PGA) obtained from all three approaches (the Cornell-McGuire approach and the fixed and adaptive kernel techniques) for 10% probability of exceedance in 50 years is around 0.087 g. Uniform hazard spectra (UHS) are also provided for different structural periods.
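
    An adaptive kernel estimate in the spirit described above (Abramson-style bandwidths that shrink where epicentres cluster and widen where they are sparse) can be sketched as follows; the pilot bandwidth h0 and Gaussian kernel are illustrative choices, not the study's calibration:

```python
import numpy as np

def adaptive_kde(epicenters, grid, h0):
    """2-D adaptive kernel density (Abramson-style local bandwidths).

    epicenters : (N, 2) array of event coordinates (e.g. projected km)
    grid       : (G, 2) array of evaluation points
    h0         : pilot (global) bandwidth, same units as coordinates
    """
    def gauss2(d2, h):
        return np.exp(-0.5 * d2 / h**2) / (2.0 * np.pi * h**2)

    # pilot fixed-bandwidth estimate at the data points themselves
    d2_data = ((epicenters[:, None, :] - epicenters[None, :, :])**2).sum(-1)
    pilot = gauss2(d2_data, h0).mean(axis=1)
    g = np.exp(np.mean(np.log(pilot)))        # geometric mean of pilot
    h_i = h0 * np.sqrt(g / pilot)             # local bandwidths per event

    # adaptive estimate on the evaluation grid
    d2 = ((grid[:, None, :] - epicenters[None, :, :])**2).sum(-1)
    return gauss2(d2, h_i[None, :]).mean(axis=1)
```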

  9. Nonlinear Seismic Correlation Analysis of the JNES/NUPEC Large-Scale Piping System Tests.

    SciTech Connect

    Nie,J.; DeGrassi, G.; Hofmayer, C.; Ali, S.

    2008-06-01

    The Japan Nuclear Energy Safety Organization/Nuclear Power Engineering Corporation (JNES/NUPEC) large-scale piping test program has provided valuable new test data on high-level seismic elasto-plastic behavior and failure modes for typical nuclear power plant piping systems. The component and piping system tests demonstrated the strain ratcheting behavior that is expected to occur when a pressurized pipe is subjected to cyclic seismic loading. Under a collaboration agreement between the US and Japan on seismic issues, the US Nuclear Regulatory Commission (NRC)/Brookhaven National Laboratory (BNL) performed a correlation analysis of the large-scale piping system tests using detailed state-of-the-art nonlinear finite element models. Techniques are introduced to develop material models that can closely match the test data. The shaking table motions are examined. The analytical results are assessed in terms of the overall system responses and the strain ratcheting behavior at an elbow. The paper concludes with insights about the accuracy of the analytical methods for use in performance assessments of highly nonlinear piping systems under large seismic motions.

  10. Analysis of Seismic Activity of the last 15 Years Nearby Puerto Rico and Caribbean Region.

    NASA Astrophysics Data System (ADS)

    Huerta-Lopez, C. I.; Torres-Ortíz, D. M.; Fernández-Heredia, A. I.; Martínez-Cruzado, J. A.

    2015-12-01

    An earthquake catalog of the seismicity that occurred during the last 15 years in the Caribbean region, in the vicinity of Puerto Rico Island (PRI), was compiled in order to capture the big picture of the regional seismic activity rate, and in particular at the epicentral regions of several historical and instrumentally recorded (during 2008-2015) moderate to large magnitude earthquakes that occurred near PRI, both onshore and offshore. These include the M6.4 earthquake of 01/13/2014, the largest instrumentally recorded earthquake near PRI. In terms of the joint temporal-spatial distribution of epicenters, episodic seismic activity is clearly seen as concentrations during certain time intervals in different regions. These localized concentrations of epicenters, occurring during certain time intervals in well-localized regions, may suggest "seismic gaps" with no regular time interval or spatial pattern. In the epicentral regions of the M6.4 01/13/2014 earthquake and of the historical Mona Passage M7.5 earthquake of 10/11/1918, episodic concentrations in time and space of small-magnitude epicenters are evident but show no temporal pattern. Preliminary results of an ongoing statistical analysis in terms of the Gutenberg-Richter b-value and Omori's law, aimed at relating the seismicity to the tectonic framework of the region (or sub-regions), such as structural heterogeneity and stress, are presented and discussed.
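
    For the Gutenberg-Richter part of such an analysis, the b-value is commonly estimated by maximum likelihood. A minimal sketch (Aki, 1965, with Utsu's correction for magnitude binning), assuming a completeness magnitude mc and bin width dm:

```python
import numpy as np

def b_value(mags, mc, dm=0.1):
    """Maximum-likelihood Gutenberg-Richter b-value:
       b = log10(e) / (mean(M) - (mc - dm/2))
    for events at or above the completeness magnitude mc."""
    m = np.asarray(mags)
    m = m[m >= mc]
    return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))

# e.g. b = b_value(catalog_magnitudes, mc=3.0)
```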

  11. Areal distribution of sedimentary facies determined from seismic facies analysis and models of modern depositional systems

    SciTech Connect

    Seramur, K.C.; Powell, R.D.; Carpenter, P.J.

    1988-01-01

    Seismic facies analysis was applied to 3.5-kHz single-channel analog reflection profiles of the sediment fill within Muir Inlet, Glacier Bay, southeast Alaska. Nine sedimentary facies have been interpreted from seven seismic facies identified on the profiles. The interpretations are based on the following reflection characteristics and structural features: reflector spacing, amplitude and continuity of reflections, internal reflection configurations, attitude of reflection terminations at facies boundaries, body geometry of a facies, and the architectural associations of seismic facies within each basin. The depositional systems are reconstructed by determining the paleotopography, bedding patterns, sedimentary facies, and modes of deposition within the basin. Muir Inlet is a recently deglaciated fjord for which succe